Meta-Optimized Hybrid Reasoning Framework
by Ryan Oates
License: Dual AGPLv3 + Peer Production License (PPL)
Contact: ryan_oates@my.cuesta.edu
Purpose
This framework is part of an interdisciplinary vision to combine symbolic rigor, neural adaptability, and cognitive-aligned reasoning. It reflects years of integrated work at the intersection of computer science, biopsychology, and meta-epistemology.
It is not just software. It is a cognitive architecture, and its use is ethically bounded.
Licensing Model
This repository is licensed under a hybrid model to balance openness, reciprocity, and authorship protection.
1. For Commons-Aligned Users (students, researchers, cooperatives)
Use it under the Peer Production License (PPL). You can:
- Study, adapt, and share it freely
- Use it in academic or nonprofit research
- Collaborate openly within the digital commons
2. For Public Use and Transparency
The AGPLv3 license guarantees:
- Network-based deployments must share modifications
- Derivatives must remain open source
- Attribution is mandatory
3. For Commercial or Extractive Use
You must not use this work if you are a:
- For-profit AI company
- Venture-backed foundation
- Closed-source platform

...unless you negotiate a commercial license directly.
Attribution
This framework originated in:
Meta-Optimization in Hybrid Theorem Proving: Cognitive-Constrained Reasoning Framework, Ryan Oates (2025)
DOI: [Insert Zenodo/ArXiv link here]
Git commit hash of original release: a17c3f9...
This project's cognitive-theoretic roots come from studies in:
- Flow state modeling
- Symbolic logic systems
- Jungian epistemological structures
Community Contributor Agreement
If you are a student, educator, or aligned research group and want to contribute:
- Fork this repo
- Acknowledge the author and original framework
- Use the `Contributors.md` file to describe your adaptation
- Optional: Sign and return the contributor agreement to join the federated research network
What You May Not Do
- Integrate this system into closed-source LLM deployments
- Resell it or offer derivative products without explicit approval
- Strip author tags or alter authorship metadata
Contact
Want to collaborate, cite properly, or license commercially?
Reach out: ryan_oates@my.cuesta.edu
Claude MCP Server
This project implements a server adhering to the Model Context Protocol (MCP), providing a standardized way to integrate AI tools and models. It supports multiple AI providers (OpenAI, Anthropic, and Google) and offers a range of built-in tools for code analysis, web interaction, and more.
Features
- MCP Compliance: Designed to work with MCP clients, enabling seamless tool integration.
- Multi-Provider Support: Utilizes OpenAI, Anthropic, and Google's Gemini models. Configure the default provider and API keys via environment variables.
- Extensible Tooling: Includes a framework for easily adding and managing custom tools. Current tools include:
  - Code Generation (`llm_code_generate`)
  - Web Requests (`web_request`)
  - Web Scraping (`web_scrape`)
  - Code Analysis (`code_analyze`)
  - Code Documentation (`code_document`)
  - Code Improvement Suggestions (`code_improve`)
- Node.js and Python Servers: Includes both Node.js (primary) and Python (FastAPI) server implementations.
- Containerization: Docker support for easy deployment and development.
- Testing: Integrated with Jest (JavaScript) and pytest (Python) for comprehensive testing.
- Linting and Formatting: Uses ESLint and Prettier to maintain code quality.
Project Structure
```
.
├── config/                 # Configuration files
│   ├── .env                # Environment variables (example provided)
│   ├── dockerfile          # Docker configuration
│   └── docker-compose.yaml
├── src/                    # Source code
│   ├── api/                # FastAPI server (Python)
│   ├── core/               # Core MCP logic (Python)
│   ├── server/             # Node.js server implementations
│   ├── tools/              # Individual tool implementations (JavaScript)
│   └── utils/              # Utility functions (JavaScript)
├── tests/                  # Test files
├── data/                   # Data directory (used by Python server)
└── monitoring/             # Performance monitoring data
```
Setup and Installation
- Clone the repository:

  ```shell
  git clone <repository_url>
  cd claude-mcp-server
  ```

- Environment Variables:

  Create a `.env` file in the `claude-mcp-server` directory (and optionally in `config/`) by copying the `.env.example` file:

  ```shell
  cp .env.example .env
  ```

  Then, fill in your API keys for OpenAI, Anthropic, and Google:

  ```
  # .env
  NODE_ENV=development
  PORT=3000
  DEFAULT_AI_PROVIDER=anthropic # or openai, google
  OPENAI_API_KEY=your-openai-key
  ANTHROPIC_API_KEY=your-anthropic-key
  GOOGLE_API_KEY=your-google-api-key
  ```
- Node.js Server (Recommended):

  - Install Dependencies:

    ```shell
    npm install
    ```

  - Run in Development Mode:

    ```shell
    npm run dev        # Uses simple-server.js
    # OR
    npm run dev:custom # Uses custom-server.js
    ```

    The `--watch` flag automatically restarts the server on code changes.

  - Run in Production Mode:

    ```shell
    npm start # Uses simple-server.js
    ```

  - Run Tests:

    ```shell
    npm test
    npm run test:watch    # Watch mode
    npm run test:coverage # Generate coverage report
    ```

  - Linting and Formatting:

    ```shell
    npm run lint
    npm run lint:fix     # Automatically fix linting errors
    npm run format
    npm run format:check
    ```
- Python Server (FastAPI):

  - Install Dependencies (from the `claude-mcp-server` directory):

    ```shell
    pip install -r config/requirements.txt
    ```

  - Run the Server:

    ```shell
    npm run start:python
    ```

  - Note: The Python server might be less actively maintained than the Node.js server.
Docker Usage
Docker is the recommended way to run the `claude-mcp-server`, especially for production deployments. It provides a consistent and isolated environment.
- Create a `.dockerignore` file (Recommended):

  Create a file named `.dockerignore` in the `claude-mcp-server` directory with the following content:

  ```
  node_modules
  .git
  .DS_Store
  npm-debug.log
  Dockerfile
  docker-compose.yaml
  .env
  tests/
  ```

- Build the Docker Image:

  ```shell
  npm run docker:build
  # or, equivalently:
  # docker-compose build
  ```

- Run the Container (Development Mode - with Hot Reloading):

  ```shell
  npm run docker:run:dev
  # or, equivalently:
  # docker-compose -f config/docker-compose.yaml up --build
  ```

  This command uses the `config/docker-compose.yaml` file to:

  - Build the image (if it hasn't been built already or if changes are detected).
  - Start a container named `claude-mcp-server`.
  - Map port 3000 on your host machine to port 3000 inside the container.
  - Mount the `src`, `data`, and `config` directories as volumes, so any changes you make to these directories on your host machine are immediately reflected inside the running container, allowing hot reloading during development.
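The volume mounts described above would look roughly like the following in the compose file. This is an illustrative sketch, not the repository's actual `config/docker-compose.yaml`; the service name and paths are assumptions:

```yaml
# Hypothetical fragment of config/docker-compose.yaml.
# Host paths on the left are mounted over the container paths on the right,
# so edits on the host appear inside the container immediately.
services:
  claude-mcp-server:
    build: .
    ports:
      - "3000:3000"
    volumes:
      - ./src:/app/src
      - ./data:/app/data
      - ./config:/app/config
```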
- Run the Container (Production Mode):

  ```shell
  npm run docker:run
  # or, equivalently:
  # docker-compose up
  ```

  This command starts the container without mounting the local directories as volumes. This is suitable for production because the container uses the code and configuration that were baked into the image during the build process.
- Running the Python Server Inside the Docker Container:

  Even though the Node.js server is the default entry point, you can still run the Python server within the running Docker container:

  - Get a Shell Inside the Container:

    ```shell
    docker exec -it claude-mcp-server /bin/bash
    ```

    This command opens an interactive bash shell inside the running `claude-mcp-server` container.

  - Run the Python Server:

    ```shell
    python src/api/server.py
    ```

    The Python server will run on port 8000 inside the container. To access it from your host, either adjust `docker-compose.yaml` to expose port 8000 or use `curl` from within the container.
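Exposing port 8000 amounts to adding a second port mapping in the compose file. A minimal sketch, assuming the service is named `claude-mcp-server` (check the actual `config/docker-compose.yaml` for the real service name and settings):

```yaml
# Hypothetical fragment of config/docker-compose.yaml; illustrative only.
services:
  claude-mcp-server:
    ports:
      - "3000:3000"  # Node.js server (already mapped)
      - "8000:8000"  # FastAPI server, if you start it inside the container
```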
- Stopping the Container:

  ```shell
  docker-compose down
  ```
Usage
Once the server is running (either Node.js or Python), you can interact with it via MCP-compliant clients. The server exposes tools that can be invoked using the JSON-RPC 2.0 protocol. The specific tool names and parameters are defined in the `src/tools` directory (for the Node.js server) and in `src/core/mcp_tools.py` (for the Python server). Refer to `howTO.md` for the available tools.
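As a sketch of what a JSON-RPC 2.0 tool invocation could look like, the snippet below builds a request body for the `web_scrape` tool and validates it locally before sending. The method name, endpoint path, and argument shape are assumptions, not the server's documented interface; consult `howTO.md` for the actual contract.

```shell
# Hypothetical JSON-RPC 2.0 request for the web_scrape tool.
# "tools/call" and the argument layout are illustrative guesses.
REQUEST='{"jsonrpc": "2.0", "id": 1, "method": "tools/call", "params": {"name": "web_scrape", "arguments": {"url": "https://example.com"}}}'

# Validate the payload is well-formed JSON before sending it:
echo "$REQUEST" | python3 -m json.tool > /dev/null && echo "payload OK"

# Send it to the running server (uncomment once the server is up; the
# endpoint path may differ):
# curl -s -X POST http://localhost:3000/ \
#   -H 'Content-Type: application/json' \
#   -d "$REQUEST"
```

If the request succeeds, the server replies with a JSON-RPC response object carrying the same `id` and either a `result` or an `error` member.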
Contributing
See the main `mcp-projects/README.md` for general contributing guidelines.
License
MIT License