LM Studio MCP Proxy Server

A Model Context Protocol (MCP) server that bridges Cursor IDE with LM Studio, enabling seamless integration of local AI models with advanced coding capabilities. This server specifically fixes the "The model does not work with your current plan or API key" error in Cursor.

🚀 Features

  • 🔧 OpenAI API Compatibility: Drop-in OpenAI-compatible /v1 endpoints (models, chat completions)
  • 🛠️ MCP Protocol Support: MCP implementation with tools, resources, and prompts
  • 🎯 Custom Model Integration: Seamless integration with LM Studio models
  • 🔒 Error Resolution: Fixes Cursor's API key validation errors
  • 📊 Health Monitoring: Built-in health checks and diagnostics
  • 🔄 Dual Mode: HTTP server and STDIO MCP server support

🎯 Problem Solved

This server specifically addresses the "The model does not work with your current plan or API key" error that occurs when trying to use custom models in Cursor IDE. It provides a proper OpenAI-compatible endpoint that Cursor can use without validation errors.
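At its core, the proxy is just an OpenAI-shaped HTTP endpoint that forwards requests to LM Studio and never rejects a key. The following is a minimal, hypothetical sketch of that idea, assuming an axum + reqwest stack; the real implementation lives in src/main.rs and may differ:

// Hypothetical sketch of the proxy idea, not the actual src/main.rs.
use axum::{routing::post, Json, Router};
use serde_json::Value;

const LM_STUDIO_URL: &str = "http://127.0.0.1:1234/v1/chat/completions";

// Accept an OpenAI-style request from Cursor and forward it unchanged to
// LM Studio. Whatever Authorization header Cursor sends is simply ignored,
// which is what sidesteps the API key validation error.
async fn chat_completions(Json(body): Json<Value>) -> Json<Value> {
    let resp = reqwest::Client::new()
        .post(LM_STUDIO_URL)
        .json(&body)
        .send()
        .await
        .expect("LM Studio not reachable on port 1234")
        .json::<Value>()
        .await
        .expect("LM Studio returned invalid JSON");
    Json(resp)
}

#[tokio::main]
async fn main() {
    let app = Router::new().route("/v1/chat/completions", post(chat_completions));
    let listener = tokio::net::TcpListener::bind("127.0.0.1:3031").await.unwrap();
    axum::serve(listener, app).await.unwrap();
}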

📋 Prerequisites

  • Rust (latest stable version)
  • LM Studio running on port 1234
  • Cursor IDE (latest version)
  • DeepSeek R1 model loaded in LM Studio

🚀 Quick Start

Method 1: Custom OpenAI Endpoint (Recommended - Fixes API Key Error)

  1. Start the server:

    cargo run
    
  2. Configure Cursor:

    • Open Cursor Settings → Models
    • Scroll to "OpenAI API Key" section
    • Enable "Override OpenAI Base URL"
    • Set Base URL: http://127.0.0.1:3031/v1
    • Set API Key: sk-dummy-key-for-development
    • Click "Verify"
    • Disable all other models
    • Enable only: deepseek/deepseek-r1-0528-qwen3-8b
  3. Test in Cursor:

    • Open a new chat
    • Select the DeepSeek model
    • Send a message - no more API key errors!

Method 2: MCP Server Integration

  1. Start MCP server:

    cargo run -- --mcp
    
  2. Configure Cursor MCP:

    • Open Cursor Settings → MCP
    • Click "Add new MCP server"
    • Use the configuration from cursor-mcp-config.json (an illustrative example follows)
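For reference, a Cursor MCP entry typically looks like the snippet below. This is illustrative only; the server name is hypothetical and the authoritative version ships in cursor-mcp-config.json:

{
  "mcpServers": {
    "lm-studio-proxy": {
      "command": "cargo",
      "args": ["run", "--", "--mcp"]
    }
  }
}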

🔧 Configuration

Server Configuration

The server supports two modes:

HTTP Server Mode (Default)

cargo run

  • Port: 3031
  • Health Check: http://127.0.0.1:3031/health
  • OpenAI Endpoint: http://127.0.0.1:3031/v1

MCP Server Mode

cargo run -- --mcp

  • Transport: STDIO
  • Protocol: JSON-RPC 2.0 (see the example message below)
  • Features: Tools, Resources, Prompts
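In STDIO mode the server exchanges JSON-RPC 2.0 messages over stdin/stdout. The first message a client such as Cursor sends is an initialize request, roughly like this (field values are illustrative; the protocol version follows the MCP spec revision in use):

{"jsonrpc": "2.0", "id": 1, "method": "initialize", "params": {"protocolVersion": "2024-11-05", "capabilities": {}, "clientInfo": {"name": "cursor", "version": "1.0"}}}

The server replies with its own capabilities, after which the client can call methods such as tools/list.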

Supported Models

The server supports these models by default:

  • deepseek/deepseek-r1-0528-qwen3-8b
  • gpt-3.5-turbo
  • gpt-4
  • gpt-4-turbo
  • gpt-4o
  • gpt-4o-mini

To add custom models, edit the SUPPORTED_MODELS constant in src/main.rs.
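As a sketch, the constant is a plain list of model identifier strings; the exact type and contents in src/main.rs may differ, and the custom entry below is hypothetical:

// Hypothetical shape of SUPPORTED_MODELS in src/main.rs; check the actual
// declaration before editing.
const SUPPORTED_MODELS: &[&str] = &[
    "deepseek/deepseek-r1-0528-qwen3-8b",
    "gpt-3.5-turbo",
    "gpt-4o",
    "my-org/my-custom-model", // add your own model IDs here
];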

šŸ› ļø API Endpoints

Health Check

curl http://127.0.0.1:3031/health

List Models

curl http://127.0.0.1:3031/v1/models

Get Model Info

curl http://127.0.0.1:3031/v1/models/deepseek/deepseek-r1-0528-qwen3-8b

Chat Completion

curl -X POST http://127.0.0.1:3031/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-dummy-key-for-development" \
  -d '{
    "model": "deepseek/deepseek-r1-0528-qwen3-8b",
    "messages": [
      {
        "role": "user",
        "content": "Hello, can you help me with coding?"
      }
    ]
  }'
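A successful call returns a standard OpenAI chat-completion object, along these lines (all values illustrative):

{
  "id": "chatcmpl-123",
  "object": "chat.completion",
  "created": 1700000000,
  "model": "deepseek/deepseek-r1-0528-qwen3-8b",
  "choices": [
    {
      "index": 0,
      "message": { "role": "assistant", "content": "Sure, what are you working on?" },
      "finish_reason": "stop"
    }
  ],
  "usage": { "prompt_tokens": 12, "completion_tokens": 9, "total_tokens": 21 }
}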

🔍 Troubleshooting

Common Issues

1. "The model does not work with your current plan or API key"

Solution: Use Method 1 (Custom OpenAI Endpoint) instead of MCP integration.

2. Port 3031 already in use

Solution:

# Find the process using port 3031 (Windows)
netstat -ano | findstr 3031

# Kill the process (replace <PID> with the actual process ID)
taskkill /PID <PID> /F

# On Linux/macOS: lsof -i :3031, then kill <PID>

# Or use a different port by modifying src/main.rs

3. LM Studio not responding

Solution:

# Check if LM Studio is running
curl http://127.0.0.1:1234/v1/models

# Restart LM Studio and ensure the model is loaded

4. Cursor not connecting

Solution:

  • Verify the base URL: http://127.0.0.1:3031/v1
  • Use any API key (e.g., sk-dummy-key-for-development)
  • Disable all other models in Cursor
  • Restart Cursor after configuration changes

Debug Mode

Enable debug logging:

RUST_LOG=debug cargo run

📁 Project Structure

MCP_Server/
├── src/
│   └── main.rs             # Main server implementation
├── Cargo.toml              # Rust dependencies
├── README.md               # This file
├── CURSOR_SETUP_GUIDE.md   # Detailed setup instructions
├── cursor-mcp-config.json  # Configuration examples
├── mcp.json                # MCP server configuration
└── .gitignore              # Git ignore rules

🔧 Development

Building

cargo build
cargo build --release

Testing

cargo test
cargo check

Running in Development

# With debug logging
RUST_LOG=debug cargo run

# With specific log level
RUST_LOG=info cargo run

🚀 Deployment

Local Development

cargo run

Production

cargo build --release
./target/release/MCP_Server

Docker (Future)

FROM rust:1.70 AS builder
WORKDIR /app
COPY . .
RUN cargo build --release

FROM debian:bullseye-slim
# If the binary links against OpenSSL, also install ca-certificates and libssl here
COPY --from=builder /app/target/release/MCP_Server /usr/local/bin/
EXPOSE 3031
CMD ["MCP_Server"]

🔒 Security Considerations

Development Mode

  • Accepts any API key for convenience
  • No authentication required
  • Suitable for local development only

Production Mode

  • Implement proper API key validation (see the sketch after this list)
  • Add rate limiting
  • Use HTTPS with certificates
  • Configure proper CORS policies
  • Add monitoring and logging
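As a starting point for key validation, a thin middleware in front of the routes is enough. The sketch below is hypothetical and assumes an axum-based server; VALID_KEYS stands in for real secret management:

// Hypothetical axum middleware: reject any request whose bearer token is
// not in an allow-list.
use axum::{extract::Request, http::StatusCode, middleware::Next, response::Response};

const VALID_KEYS: &[&str] = &["sk-replace-with-a-real-key"];

pub async fn require_api_key(req: Request, next: Next) -> Result<Response, StatusCode> {
    let authorized = req
        .headers()
        .get("authorization")
        .and_then(|v| v.to_str().ok())
        .and_then(|v| v.strip_prefix("Bearer "))
        .map(|key| VALID_KEYS.contains(&key))
        .unwrap_or(false);
    if authorized {
        Ok(next.run(req).await)
    } else {
        Err(StatusCode::UNAUTHORIZED)
    }
}

Wire it in with axum::middleware::from_fn(require_api_key) when building the router.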

🤝 Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests if applicable
  5. Submit a pull request

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🙏 Acknowledgments

  • LM Studio for providing the local AI model infrastructure
  • Cursor IDE for the excellent development environment
  • Model Context Protocol for the standardization efforts
  • OpenAI for the API specification that made this possible

📞 Support

If you encounter issues:

  1. Check the troubleshooting section above
  2. Review the CURSOR_SETUP_GUIDE.md for detailed instructions
  3. Check the server logs for error messages
  4. Verify LM Studio is running and accessible
  5. Ensure the model is properly loaded in LM Studio

🔄 Changelog

v1.0.0

  • ✅ Initial release
  • ✅ OpenAI API compatibility
  • ✅ MCP protocol support
  • ✅ Custom model integration
  • ✅ API key error resolution
  • ✅ Health monitoring
  • ✅ Comprehensive documentation

Happy coding with your local AI models! 🚀