# Nikola Test 2 MCP Server
This is an MCP (Model Context Protocol) server that provides access to the Nikola Test 2 API. It enables AI agents and LLMs to interact with Nikola Test 2 through standardized tools.
## Features
- 🔧 MCP Protocol: Built on the Model Context Protocol for seamless AI integration
- 🌐 Full API Access: Provides tools for interacting with Nikola Test 2 endpoints
- 🐳 Docker Support: Easy deployment with Docker and Docker Compose
- ⚡ Async Operations: Built with FastMCP for efficient async handling
## API Documentation
- Nikola Test 2 Website: https://petstore.swagger.io/
- API Documentation:
## Available Tools
This server provides the following tools:
- `example_tool`: Placeholder tool (to be implemented)
- `get_api_info`: Get information about the API service and authentication status

**Note**: Replace `example_tool` with actual Nikola Test 2 API tools based on the documentation.
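For orientation, a FastMCP-based `server.py` typically registers these tools roughly as in the sketch below. This is an illustrative assumption, not the actual file contents; the import path, instance name, transport argument, and tool bodies may all differ.

```python
# Illustrative sketch only -- not the real server.py.
from fastmcp import FastMCP  # import path is an assumption; adjust to the FastMCP version in use

mcp = FastMCP("nikola-test-2-mcp")


@mcp.tool()
async def example_tool(query: str) -> str:
    """Placeholder tool (to be implemented)."""
    return f"example_tool received: {query}"


@mcp.tool()
async def get_api_info() -> dict:
    """Return information about the API service and authentication status."""
    return {"service": "Nikola Test 2", "base_url": "https://petstore.swagger.io/", "authenticated": False}


if __name__ == "__main__":
    # Serve over streamable HTTP to match the "streamable-http" server type in deployment_params.json.
    mcp.run(transport="streamable-http")
```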
## Installation

### Using Docker (Recommended)

1. Clone this repository:

   ```bash
   git clone https://github.com/Traia-IO/nikola-test-2-mcp-server.git
   cd nikola-test-2-mcp-server
   ```

2. Run with Docker:

   ```bash
   ./run_local_docker.sh
   ```
### Using Docker Compose

1. Create a `.env` file with your configuration:

   ```
   PORT=8000
   ```

2. Start the server:

   ```bash
   docker-compose up
   ```
### Manual Installation

1. Install dependencies using `uv`:

   ```bash
   uv pip install -e .
   ```

2. Run the server:

   ```bash
   uv run python -m server
   ```
## Usage
### Health Check
Test if the server is running:
```bash
python mcp_health_check.py
```
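If you just need to confirm that the endpoint is reachable, a minimal probe like the one below also works. This is a sketch assuming the `httpx` package is available; it is not the bundled `mcp_health_check.py`, and it only checks that something is listening rather than speaking the MCP protocol.

```python
# Minimal reachability probe (not the bundled mcp_health_check.py).
# Any HTTP response, even an error status, means the server process is up and listening.
import httpx


def check(url: str = "http://localhost:8000/mcp/") -> bool:
    try:
        response = httpx.get(url, timeout=5.0)
        print(f"Server responded with HTTP {response.status_code}")
        return True
    except httpx.HTTPError as exc:
        print(f"Could not reach the server: {exc}")
        return False


if __name__ == "__main__":
    check()
```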
### Using with CrewAI
```python
import asyncio

from traia_iatp.mcp.traia_mcp_adapter import create_mcp_adapter


async def main():
    # Connect to the MCP server
    with create_mcp_adapter(url="http://localhost:8000/mcp/") as tools:
        # Use the tools
        for tool in tools:
            print(f"Available tool: {tool.name}")

        # Example usage
        result = await tool.example_tool(query="test")
        print(result)


asyncio.run(main())
```
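The `example_tool` call in the snippet above refers to the placeholder tool listed earlier; once real Nikola Test 2 tools are implemented, replace it with the actual tool names exposed by the server.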
## Development

### Testing the Server
1. Start the server locally
2. Run the health check:

   ```bash
   python mcp_health_check.py
   ```

3. Test individual tools using the CrewAI adapter
### Adding New Tools

To add new tools, edit `server.py` and:
1. Create API client functions for Nikola Test 2 endpoints
2. Add `@mcp.tool()` decorated functions (see the sketch below)
3. Update this README with the new tools
4. Update `deployment_params.json` with the tool names in the `capabilities` array
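As a rough illustration of steps 1 and 2, an endpoint-backed tool might look like the sketch below. The endpoint path, the `get_pet_by_id` name, and the existing `mcp` instance in `server.py` are assumptions for the example, and `httpx` is assumed to be available.

```python
# Hypothetical example of wrapping a Nikola Test 2 endpoint as an MCP tool.
# Assumes `mcp` is the FastMCP instance already defined in server.py.
import httpx

NIKOLA_BASE_URL = "https://petstore.swagger.io/v2"


@mcp.tool()
async def get_pet_by_id(pet_id: int) -> dict:
    """Fetch a single pet record from the Nikola Test 2 API by its ID."""
    async with httpx.AsyncClient(base_url=NIKOLA_BASE_URL, timeout=10.0) as client:
        response = await client.get(f"/pet/{pet_id}")
        response.raise_for_status()  # surface HTTP errors to the caller
        return response.json()
```

After adding a tool like this, remember to append its name (here, the hypothetical `get_pet_by_id`) to the `capabilities` array in `deployment_params.json`.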
## Deployment

### Deployment Configuration

The `deployment_params.json` file contains the deployment configuration for this MCP server:
```json
{
  "github_url": "https://github.com/Traia-IO/nikola-test-2-mcp-server",
  "mcp_server": {
    "name": "nikola-test-2-mcp",
    "description": "Test description 2",
    "server_type": "streamable-http",
    "capabilities": [
      // List all implemented tool names here
      "example_tool",
      "get_api_info"
    ]
  },
  "deployment_method": "cloud_run",
  "gcp_project_id": "traia-mcp-servers",
  "gcp_region": "us-central1",
  "tags": ["nikola test 2", "api"],
  "ref": "main"
}
```
**Important**: Always update the `capabilities` array when you add or remove tools!
### Google Cloud Run
This server is designed to be deployed on Google Cloud Run. The deployment will:
- Build a container from the Dockerfile
- Deploy to Cloud Run with the specified configuration
- Expose the `/mcp` endpoint for client connections
## Environment Variables
- `PORT`: Server port (default: `8000`)
- `STAGE`: Environment stage (default: `MAINNET`; options: `MAINNET`, `TESTNET`)
- `LOG_LEVEL`: Logging level (default: `INFO`)
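For reference, reading these variables usually amounts to something like the sketch below; this reflects common convention, not the actual implementation in `server.py`.

```python
# Conventional way to pick up the environment variables above; server.py may differ.
import logging
import os

PORT = int(os.getenv("PORT", "8000"))       # server port
STAGE = os.getenv("STAGE", "MAINNET")       # MAINNET or TESTNET
LOG_LEVEL = os.getenv("LOG_LEVEL", "INFO")  # logging verbosity

logging.basicConfig(level=LOG_LEVEL)
```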
## Troubleshooting
1. **Server not starting**: Check the Docker logs with `docker logs <container-id>`
2. **Connection errors**: Ensure the server is running on the expected port
3. **Tool errors**: Check the server logs for detailed error messages
## Contributing
- Fork the repository
- Create a feature branch
- Implement new tools or improvements
- Update the README and deployment_params.json
- Submit a pull request