MorvanZhou/customized_mcp
Customized MCP Project
This project leverages the mcp library with CLI support and integrates with OpenAI's API to provide a Model Context Protocol (MCP) server.
Requirements
Make sure to install the required dependencies before running the project:
pip install -r requirements.txt
Usage
- Configure your OpenAI API key as an environment variable:
  export OPENAI_API_KEY="your-api-key"
- Start the MCP server (a sketch of the server is shown after this list):
  python server.py
- Use the client to interact with the server (see the client sketch after this list):
  python client.py
- Alternatively, use the orchestrator to query the LLM and tools (see the orchestrator sketch after this list):
  python main.py
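Below is a minimal sketch of what server.py might look like, assuming it is built with the FastMCP helper from the mcp package and exposes the get_weather tool shown in the Example section; the hard-coded response is purely illustrative.

from mcp.server.fastmcp import FastMCP

# Create an MCP server instance; the name "weather" is assumed for illustration.
mcp = FastMCP("weather")

@mcp.tool()
def get_weather(city: str) -> str:
    """Get weather for a city."""
    # Placeholder: a real implementation would query a weather service.
    return f"The weather in {city} is sunny"

if __name__ == "__main__":
    # Serve over stdio so a client can spawn this script as a subprocess.
    mcp.run(transport="stdio")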
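A similarly hedged sketch of how client.py might connect to the server over stdio using the mcp client API; the actual client likely wraps this in the interactive prompt shown in the Example section.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Spawn server.py as a subprocess and communicate over stdio.
    params = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # List the tools the server exposes.
            tools = await session.list_tools()
            print(tools)
            # Call the get_weather tool with a sample argument.
            result = await session.call_tool("get_weather", {"city": "Beijing"})
            print(result.content)

if __name__ == "__main__":
    asyncio.run(main())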
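Finally, a sketch of what the main.py orchestrator might do, assuming it forwards the MCP tool schemas to OpenAI's chat completions API and executes whichever tool call the model requests; the model name gpt-4o-mini and the sample question are assumptions, not taken from the repository.

import asyncio
import json

from openai import OpenAI
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

llm = OpenAI()  # reads OPENAI_API_KEY from the environment

async def main():
    params = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Convert MCP tool definitions into OpenAI function-calling tools.
            listed = await session.list_tools()
            tools = [{
                "type": "function",
                "function": {
                    "name": t.name,
                    "description": t.description or "",
                    "parameters": t.inputSchema,
                },
            } for t in listed.tools]

            # Ask the LLM; it may decide to call one of the MCP tools.
            response = llm.chat.completions.create(
                model="gpt-4o-mini",  # assumed; substitute the model the project uses
                messages=[{"role": "user", "content": "What is the weather in Beijing?"}],
                tools=tools,
            )
            message = response.choices[0].message
            if message.tool_calls:
                # Execute the requested tool on the MCP server and print its output.
                call = message.tool_calls[0]
                args = json.loads(call.function.arguments)
                result = await session.call_tool(call.function.name, args)
                print(result.content)
            else:
                print(message.content)

if __name__ == "__main__":
    asyncio.run(main())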
Example
Querying the Weather Tool
Run the client and call the get_weather tool:
python client.py
Example interaction:
You: List tools
Assistant: {
  "tools": [
    {
      "name": "get_weather",
      "description": "Get weather for a city",
      "parameters": {
        "city": {
          "type": "string",
          "description": "Name of the city"
        }
      }
    }
  ]
}
You: Call get_weather with {"city": "Beijing"}
Assistant: 北京的天气是晴天 (The weather in Beijing is sunny)
Dependencies
openai==1.70.0
mcp[cli]==1.6.0
License
This project is licensed under the MIT License.