MCP Server
A local Model Context Protocol server that stores project context (bookmarks, files, notes) and exposes a REST + WebSocket API so multiple AI models (OpenAI GPT-4o, Anthropic Claude 3, local LM Studio) can share the same information.
Features
- Store and manage project context (files, bookmarks, notes)
- Integrate with multiple AI model providers:
  - OpenAI (GPT-4o, GPT-4o-mini)
  - Anthropic Claude
  - LM Studio (local models)
- WebSocket API for real-time updates (see the client sketch after this list)
- Token usage tracking and budget management
- Async/await architecture with FastAPI
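
Because the context store is exposed over HTTP and WebSocket, any model integration can read and watch the same data. The sketch below is illustrative only: the /api/context and /ws/updates paths are hypothetical placeholders, not documented routes of this server, and it assumes the httpx and websockets packages are installed.

# Illustrative client sketch; /api/context and /ws/updates are
# hypothetical placeholders, not documented MCP Server routes.
import asyncio

import httpx
import websockets

BASE_URL = "http://localhost:8000"

def fetch_context() -> dict:
    # Read the shared project context over REST (hypothetical endpoint).
    response = httpx.get(f"{BASE_URL}/api/context")
    response.raise_for_status()
    return response.json()

async def watch_updates() -> None:
    # Listen for real-time context updates over WebSocket (hypothetical endpoint).
    async with websockets.connect("ws://localhost:8000/ws/updates") as ws:
        while True:
            print("context update:", await ws.recv())

if __name__ == "__main__":
    print(fetch_context())
    asyncio.run(watch_updates())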
Project Structure
mcp-server/
├── backend/              # Python backend code
│   ├── app/              # Application code
│   │   ├── api/          # API endpoints
│   │   ├── core/         # Core functionality
│   │   ├── models/       # SQLModel definitions
│   │   └── services/     # Business logic services
│   ├── tests/            # Test suite
│   └── alembic/          # Database migrations
├── docker/               # Docker configuration
└── docs/                 # Documentation
Prerequisites
- Python 3.12+
- Poetry for dependency management
- Optional: LM Studio for local models
Installation
1. Clone the repository:

   git clone https://github.com/yourusername/mcp-server.git
   cd mcp-server

2. Install dependencies with Poetry:

   poetry install

3. Create a .env file with your configuration:

   OPENAI_API_KEY=your_openai_key
   ANTHROPIC_API_KEY=your_anthropic_key
   LM_STUDIO_BASE_URL=http://localhost:1234/v1
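
The project's own settings module is not shown here; as a rough sketch, a FastAPI backend typically loads these variables with pydantic-settings, along the following lines (the class and field names are assumptions, not this project's actual code):

# Sketch only: how these .env variables are commonly loaded in a FastAPI app.
# The names below are assumptions and may not match this project's config module.
from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file=".env")

    openai_api_key: str = ""
    anthropic_api_key: str = ""
    lm_studio_base_url: str = "http://localhost:1234/v1"

settings = Settings()  # values are read from the environment or the .env file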
Running the Server
Start the development server:
cd backend
poetry run uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload
The API will be available at http://localhost:8000
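
To confirm the server is responding, you can fetch the OpenAPI schema that FastAPI serves at /openapi.json by default (this assumes the application keeps the default path and that httpx is installed):

# Quick smoke test against the running server.
import httpx

schema = httpx.get("http://localhost:8000/openapi.json").json()
print(schema["info"]["title"], schema["info"]["version"])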
API Documentation
Once the server is running, you can access the interactive API documentation at:
- Swagger UI: http://localhost:8000/docs
- ReDoc: http://localhost:8000/redoc
License
MIT
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.