MCP Server with nmap Integration

A Model Context Protocol (MCP) server that integrates with LMStudio and provides OpenAI-compatible endpoints. This server includes natural language processing for nmap commands and scan result analysis.

Features

  • ✨ Natural Language → nmap Converter (NL2nmap)
  • 📊 Scan Result Summarizer & Recommendations
  • 🤖 Interactive Elicitation Flow
  • 📅 Scheduled/Queued Scans
  • 🔒 LLM Safety & Sandboxing
  • 🧪 Comprehensive Testing

Quick Start

  1. Clone the repository:
git clone https://github.com/YOUR_USERNAME/mcp-server.git
cd mcp-server
  2. Create a virtual environment and install dependencies:
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
  3. Configure the environment: create a .env file with the values below. LLM_BASE_URL should point at LM Studio's local OpenAI-compatible server (port 1234 by default), and MCP_SERVER_PORT is the port this server listens on. A sketch of how these values might be loaded follows these steps.
LLM_API_KEY=sk-no-key-required
LLM_BASE_URL=http://127.0.0.1:1234/v1
MODEL_NAME=qwen2.5:3b-instruct-q4_K_M
MCP_SERVER_PORT=8001
  4. Start the server:
python -m uvicorn app.main:app --host 0.0.0.0 --port 8001 --reload
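
The values from step 3 are read at startup by the application's configuration module (app/config/). As a rough sketch only, assuming a pydantic-settings style loader (the actual code in app/config/ may differ), the mapping from .env keys to settings could look like this:

# Illustrative sketch only -- the real loader lives in app/config/ and may differ
from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file=".env")

    # Field names map (case-insensitively) onto the .env keys above
    llm_api_key: str = "sk-no-key-required"
    llm_base_url: str = "http://127.0.0.1:1234/v1"
    model_name: str = "qwen2.5:3b-instruct-q4_K_M"
    mcp_server_port: int = 8001

settings = Settings()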

API Endpoints

Chat Completions

  • POST /v1/chat/completions
    • OpenAI-compatible chat completions endpoint
    • Supports both streaming and non-streaming responses
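
Because the endpoint is OpenAI-compatible, the standard openai Python client (v1+) can be pointed at it by overriding the base URL. A minimal non-streaming sketch (the model name matches the .env example above; the nmap-flavored prompt is just an illustration):

from openai import OpenAI

# Point the client at the local MCP server instead of api.openai.com
client = OpenAI(base_url="http://localhost:8001/v1", api_key="sk-no-key-required")

response = client.chat.completions.create(
    model="qwen2.5:3b-instruct-q4_K_M",
    messages=[{"role": "user", "content": "Scan 10.0.0.5 for open web ports"}],
    stream=False,
)
print(response.choices[0].message.content)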

Health Check

  • GET /v1/health
    • Returns server status and configuration
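
A quick way to verify the server is running (assumes the requests package is installed):

import requests

# GET /v1/health returns the server's status and configuration
resp = requests.get("http://localhost:8001/v1/health", timeout=5)
resp.raise_for_status()
print(resp.json())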

Example Usage

# Basic chat completion request
curl -X POST http://localhost:8001/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "qwen2.5:3b-instruct-q4_K_M",
    "messages": [{"role": "user", "content": "Say hello!"}],
    "stream": false
  }'
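
For streaming, set "stream": true and consume the response incrementally. A sketch using the openai client, assuming the server emits standard OpenAI-style chunks:

from openai import OpenAI

client = OpenAI(base_url="http://localhost:8001/v1", api_key="sk-no-key-required")

# stream=True delivers the completion as incremental chunks
stream = client.chat.completions.create(
    model="qwen2.5:3b-instruct-q4_K_M",
    messages=[{"role": "user", "content": "Say hello!"}],
    stream=True,
)
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
print()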

Development

The project follows a modular architecture:

  • app/: Main application package
    • config/: Configuration settings
    • routes/: API endpoints
    • schemas/: Data models
    • services/: Business logic
    • utils/: Helper functions
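
As a purely illustrative sketch of how these layers interact (file and class names below are hypothetical, not the project's actual code):

# Hypothetical route module following the layout above
from fastapi import APIRouter, FastAPI
from pydantic import BaseModel

class HealthResponse(BaseModel):      # would normally live in app/schemas/
    status: str
    model: str

router = APIRouter()                  # would normally live in app/routes/

@router.get("/v1/health", response_model=HealthResponse)
async def health() -> HealthResponse:
    # Handlers stay thin; heavier logic belongs in app/services/
    return HealthResponse(status="ok", model="qwen2.5:3b-instruct-q4_K_M")

app = FastAPI()                       # app/main.py wires the routers together
app.include_router(router)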

Contributing

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add some amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

License

This project is licensed under the MIT License - see the LICENSE file for details.