muralinarisetty/gpt-mcp-project
The GPT MCP Project integrates OpenAI GPT Function Calling with real backend API calls via a locally hosted MCP server.
The GPT MCP Project bridges OpenAI's GPT Function Calling capabilities and real-world backend APIs. A locally hosted Model Context Protocol (MCP) server exposes tools that the model can invoke, so AI-driven tool calls execute against actual data and services rather than mocks. The architecture is clean and modular, making it straightforward to add tools and scale as needed. The project is open source and distributed under the MIT License, encouraging collaboration and further development by the community.
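The end-to-end flow might look like the sketch below. The `get_weather` tool, its schema, and the model name are illustrative assumptions rather than part of this repository; the actual forwarding of the call to the MCP server lives in the project's client code.

```python
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Advertise the MCP tool's schema to the model as a callable function.
# Tool name and parameters here are hypothetical.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Fetch current weather from the backend API",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
)

# When the model elects to call the tool, forward its name and arguments
# to the MCP server and return the result in a follow-up message.
call = response.choices[0].message.tool_calls[0]
print(call.function.name, json.loads(call.function.arguments))
```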
Features
- Tool calling backed by a real API
- Clean, modular structure
- Easy to scale to many tools
Usage
Local integration: stdio
```python
mcp.run(transport='stdio')  # Tools defined via @mcp.tool() decorator
```
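A minimal runnable server might look like the following; the server name `gpt-mcp` and the `get_weather` tool are illustrative, and the import path assumes the official MCP Python SDK.

```python
# main.py -- a minimal sketch, assuming the official MCP Python SDK (FastMCP)
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("gpt-mcp")

@mcp.tool()
def get_weather(city: str) -> str:
    """Placeholder tool; a real implementation would call the backend API."""
    return f"Weather for {city}: sunny"

if __name__ == "__main__":
    mcp.run(transport="stdio")  # speak MCP over stdin/stdout
```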
Local integration: IDE plugin
{ "mcpServers": { "gpt-mcp": { "command": "python", "args": ["main.py"] } } }
Remote integration: SSE
```python
mcp.run(transport='sse', host="0.0.0.0", port=8000)  # Specify SSE endpoint
```
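A fuller sketch, assuming the MCP Python SDK's FastMCP server. Host and port can also be set on the constructor, and the `/sse` endpoint path is the SDK default, worth verifying for your version.

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("gpt-mcp", host="0.0.0.0", port=8000)

@mcp.tool()
def ping() -> str:
    """Trivial health-check tool (illustrative)."""
    return "pong"

if __name__ == "__main__":
    mcp.run(transport="sse")  # SSE endpoint served at http://0.0.0.0:8000/sse
```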
Remote integration: streamable HTTP
```yaml
paths:
  /mcp:
    post:
      x-ms-agentic-protocol: mcp-streamable-1.0  # Copilot Studio integration
```
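This fragment belongs in the OpenAPI spec of a Copilot Studio custom connector pointing at the server's `/mcp` endpoint. On the server side, a matching sketch using the MCP Python SDK might look like this; the default endpoint path may vary by SDK version.

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("gpt-mcp", host="0.0.0.0", port=8000)

if __name__ == "__main__":
    mcp.run(transport="streamable-http")  # serves MCP over HTTP, typically at /mcp
```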
Platform ecosystem integration: GitHub
{"command": "docker", "args": ["run", "-e", "GITHUB_PERSONAL_ACCESS_TOKEN", "ghcr.io/github/github-mcp-server"]}
Development frameworks: FastMCP
```python
from mcp.server import FastMCP

app = FastMCP('demo')

@app.tool()
async def query(): ...
```
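FastMCP registers both sync and async functions and derives each tool's JSON schema from its type hints. A slightly fuller sketch follows; the `fetch_user` tool, the `httpx` dependency, and the example URL are illustrative assumptions, not part of this repository.

```python
import httpx
from mcp.server import FastMCP

app = FastMCP('demo')

@app.tool()
async def fetch_user(user_id: int) -> dict:
    """Illustrative async tool: proxy a call to a real backend API."""
    async with httpx.AsyncClient() as client:
        resp = await client.get(f"https://api.example.com/users/{user_id}")
        resp.raise_for_status()
        return resp.json()

if __name__ == "__main__":
    app.run(transport='stdio')
```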