# Sample MCP Tool Server
This repository contains a simple Flask-based server implementing an MCP (Model Context Protocol) tool. It exposes a `/run-tool` API endpoint and serves its OpenAPI specification. The tool can be connected to Open WebUI and used with a locally hosted LLaMA 3 model.
## Features

- Flask API with a single POST endpoint: `/run-tool`
- Serves an OpenAPI spec at `/.well-known/openapi.json`
- CORS enabled
- Ready to integrate with Open WebUI
## Prerequisites

- Python 3.8+
- Open WebUI installed and running
- LLaMA 3 model downloaded and configured
## Installation

1. Clone this repository:

   ```bash
   git clone https://github.com/your-username/sampleMCPToolServer.git
   cd sampleMCPToolServer
   ```

2. Create and activate a virtual environment (optional but recommended):

   ```bash
   python3 -m venv venv
   source venv/bin/activate  # On Windows: venv\Scripts\activate
   ```

3. Install dependencies:

   ```bash
   pip install flask flask-cors
   ```
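If you prefer pinning dependencies, the same two packages can be listed in a `requirements.txt` (a hypothetical file, not part of this repository):

```
flask
flask-cors
```

Then install them with `pip install -r requirements.txt`.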
## Folder Structure

```
sampleMCPToolServer/
│
├── app.py
└── static/
    └── .well-known/
        └── openapi.json
```
## Running the Server

```bash
python app.py
```

This starts the server on http://localhost:5050.

- OpenAPI spec is available at: http://localhost:5050/.well-known/openapi.json
- Tool endpoint: http://localhost:5050/run-tool
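The spec file under `static/.well-known/openapi.json` is not shown in this README; a minimal OpenAPI 3 document for the single endpoint might look like the following (the `operationId`, summary, and schema details are assumptions based on the example request below):

```json
{
  "openapi": "3.0.0",
  "info": { "title": "Sample MCP Tool Server", "version": "1.0.0" },
  "servers": [{ "url": "http://localhost:5050" }],
  "paths": {
    "/run-tool": {
      "post": {
        "operationId": "run_tool",
        "summary": "Greet the given name",
        "requestBody": {
          "required": true,
          "content": {
            "application/json": {
              "schema": {
                "type": "object",
                "properties": { "name": { "type": "string" } },
                "required": ["name"]
              }
            }
          }
        },
        "responses": {
          "200": {
            "description": "Greeting message",
            "content": {
              "application/json": {
                "schema": {
                  "type": "object",
                  "properties": { "message": { "type": "string" } }
                }
              }
            }
          }
        }
      }
    }
  }
}
```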
## Connecting to Open WebUI

1. Install and start Open WebUI:

   Follow the instructions at https://github.com/open-webui/open-webui

2. Add your tool to Open WebUI:

   - Navigate to Open WebUI
   - Go to Settings > Tools > Add Tool
   - Enter the URL to your OpenAPI JSON: http://localhost:5050/.well-known/openapi.json
   - Save and verify that the tool is successfully added.

3. Use the tool in chats:

   Type messages that reference or trigger the tool (e.g., "Use the test tool with name John").

### Downloading a Model in Open WebUI

1. Go to the Models tab in Open WebUI.
2. Search for and download a supported LLM model by specifying the model name (e.g., `llama3`).
3. Once the model is downloaded and loaded, it can be used to call the tool.
## Example Request

**Endpoint:** `POST /run-tool`

**Request Body:**

```json
{
  "name": "Alice"
}
```

**Response:**

```json
{
  "message": "Hello, Alice! The tool was successfully called."
}
```
## License

MIT License

## Author

- sujathay.nweh