
Athena MCP Server

A comprehensive Model Context Protocol (MCP) server that provides AI-powered tools and system utilities. This server integrates with OpenAI GPT models to deliver intelligent responses and analysis capabilities.

Features

Core AI Tools (OpenAI GPT-powered)

  • ask_athena: Intelligent AI assistant for general queries and problem-solving
  • analyze_code: Advanced code analysis with optimization suggestions
  • generate_code: Intelligent code generation based on requirements
  • text_summarize: AI-powered text summarization with customizable length and style
  • translate_text: Multi-language translation using OpenAI models
  • image_generate: DALL-E powered image generation

System & Development Tools

  • get_system_stats: Real-time system monitoring (CPU, memory, disk usage)
  • file_operations: Comprehensive file and directory management
  • process_monitor: System process monitoring and management
  • docker_manage: Docker container and image management
  • network_tools: Network diagnostics (ping, port scan, DNS lookup, traceroute)

Web & API Tools

  • web_request: HTTP client for API testing and web scraping
  • weather_info: Real-time weather information using OpenWeatherMap API
  • github_operations: GitHub repository management and code search

📁 Project Structure

Athena MCP/
├── app.js                 # Backend entry point
├── mcp-server.js          # MCP server for Trae integration
├── mcp-config.json        # MCP configuration file
├── package.json           # Backend dependencies
├── .env                   # Environment variables
├── tools/                 # Custom tools directory
│   └── get_cpu_stats.js   # CPU statistics tool
├── frontend/              # React frontend
│   ├── package.json       # Frontend dependencies
│   ├── public/
│   └── src/
│       ├── App.js         # Main React component
│       ├── App.css        # Component styles
│       ├── index.js       # React entry point
│       └── index.css      # Global styles
├── docker-compose.yml     # Docker orchestration
├── Dockerfile.backend     # Backend Docker image
└── README.md             # This file

🔌 MCP Integration with Trae

Quick Setup for Trae

  1. Install dependencies:

    npm install
    
  2. Start MCP server:

    npm run mcp
    
  3. Add to Trae configuration: add the following to your Trae MCP configuration (update cwd to match your local project path):

    {
      "mcpServers": {
        "athena": {
          "command": "node",
          "args": ["mcp-server.js"],
          "cwd": "d:\\Projects\\Athena MCP"
        }
      }
    }
    

Available MCP Tools

Tool Name        | Description
ask_athena       | Ask Athena AI assistant questions and get intelligent responses powered by OpenAI GPT
get_system_stats | Get detailed system CPU, memory, and performance statistics
analyze_code     | Analyze code snippets with AI-powered review, explain, optimize, or debug modes
generate_code    | Generate code based on requirements and specifications using OpenAI

MCP Tool Examples

Ask Athena:

{
  "name": "ask_athena",
  "arguments": {
    "prompt": "How do I optimize React performance?",
    "context": "Working on a large React application with performance issues"
  }
}

Get System Stats:

{
  "name": "get_system_stats",
  "arguments": {
    "detailed": true
  }
}

Analyze Code:

{
  "name": "analyze_code",
  "arguments": {
    "code": "function fibonacci(n) { return n <= 1 ? n : fibonacci(n-1) + fibonacci(n-2); }",
    "language": "javascript",
    "analysis_type": "optimize"
  }
}
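When a client such as Trae invokes these tools over the MCP stdio transport, the argument payloads above travel inside a standard JSON-RPC tools/call request. A rough illustration (the envelope follows the MCP specification; the id value is arbitrary):

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_system_stats",
    "arguments": {
      "detailed": true
    }
  }
}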

🛠️ Setup & Installation

Prerequisites

  • Node.js 18+ and npm
  • (Optional) Docker and Docker Compose

Method 1: Local Development

  1. Clone the repository and set up the backend:

    cd "d:\Projects\Athena MCP"
    npm install
    
  2. Set up the frontend:

    cd frontend
    npm install
    
  3. Configure environment:

    • Edit the .env file and add your OpenAI API key:
    PORT=4000
    OPENAI_API_KEY=your_actual_api_key_here
    
  4. Run the applications:

    Terminal 1 (Backend):

    npm start
    # Backend runs on http://localhost:4000
    

    Terminal 2 (Frontend):

    cd frontend
    npm start
    # Frontend runs on http://localhost:3000
    

Method 2: Docker Compose

  1. Set environment variables:

    # Create .env file with your API key
    echo "OPENAI_API_KEY=your_actual_api_key_here" > .env
    
  2. Run with Docker:

    docker-compose up --build
    

    This will start both the backend (http://localhost:4000) and the frontend (http://localhost:3000) as the services defined in docker-compose.yml.

🔌 API Endpoints

Backend API (Port 4000)

Method | Endpoint | Description
GET    | /        | API information and available endpoints
POST   | /ask     | Send a prompt to Athena AI
GET    | /cpu     | Get system CPU and memory statistics
GET    | /health  | Health check endpoint

Example API Usage

Ask Athena a question:

curl -X POST http://localhost:4000/ask \
  -H "Content-Type: application/json" \
  -d '{"prompt": "What is artificial intelligence?"}'

Get CPU statistics:

curl http://localhost:4000/cpu
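The same endpoints can also be called programmatically. A minimal sketch using the global fetch API available in Node.js 18+ (the response shape is whatever the backend returns and is not documented here):

// ask.mjs – send a prompt to the Athena backend and print the raw JSON reply
// (top-level await requires an ES module, hence the .mjs extension)
const response = await fetch('http://localhost:4000/ask', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ prompt: 'What is artificial intelligence?' }),
});
console.log(await response.json());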

🎨 Frontend Features

  • Modern UI: Clean, responsive design with gradient backgrounds
  • Real-time Interaction: Instant feedback and loading states
  • Error Handling: User-friendly error messages
  • Mobile Responsive: Works on all device sizes
  • System Monitoring: Visual display of CPU and memory stats

🔧 Development

Adding New Tools

  1. Create a new file in the tools/ directory:

    // tools/my_new_tool.js
    function myNewTool() {
      // Your tool logic here
      return { result: "Tool output" };
    }
    
    module.exports = { myNewTool };
    
  2. Import and use in app.js:

    const { myNewTool } = require('./tools/my_new_tool');
    
    app.get('/my-endpoint', (req, res) => {
      const result = myNewTool();
      res.json(result);
    });
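For example, a hypothetical memory-stats tool built on Node's built-in os module (this file is not part of the repo; it simply follows the same pattern as tools/get_cpu_stats.js):

// tools/get_memory_stats.js (hypothetical example)
const os = require('os');

function getMemoryStats() {
  const total = os.totalmem();  // total system memory in bytes
  const free = os.freemem();    // free system memory in bytes
  return {
    totalBytes: total,
    freeBytes: free,
    usedBytes: total - free,
    usedPercent: Number((((total - free) / total) * 100).toFixed(1)),
  };
}

module.exports = { getMemoryStats };

It can then be exposed exactly like step 2 above, for example via an app.get('/memory', ...) route.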
    

Environment Variables

Variable       | Description                    | Default
PORT           | Backend server port            | 4000
OPENAI_API_KEY | OpenAI API key for AI features | Required
NODE_ENV       | Environment mode               | development
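
These values are read from process.env at startup. A minimal sketch, assuming the dotenv package is used to load the .env file (the README does not show the actual loading code):

// hypothetical excerpt from app.js
require('dotenv').config();                 // load variables from .env into process.env

const PORT = process.env.PORT || 4000;      // fall back to the documented default
const OPENAI_API_KEY = process.env.OPENAI_API_KEY;

if (!OPENAI_API_KEY) {
  console.warn('OPENAI_API_KEY is not set; AI-powered endpoints will fail.');
}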

🐳 Docker Commands

# Build and run
docker-compose up --build

# Run in background
docker-compose up -d

# Stop services
docker-compose down

# View logs
docker-compose logs -f

# Rebuild specific service
docker-compose build backend
docker-compose build frontend

🚀 Production Deployment

  1. Set production environment variables
  2. Build optimized frontend:
    cd frontend
    npm run build
    
  3. Use a process manager such as PM2:
    npm install -g pm2
    pm2 start app.js --name athena-backend
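    PM2 can also be driven from an ecosystem file instead of CLI flags; a minimal sketch (hypothetical file, not included in this repo):

    // ecosystem.config.js (hypothetical)
    module.exports = {
      apps: [
        {
          name: 'athena-backend',
          script: 'app.js',
          env: {
            NODE_ENV: 'production',
            PORT: 4000,
          },
        },
      ],
    };

    Start it with pm2 start ecosystem.config.js.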
    

🤝 Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Test thoroughly
  5. Submit a pull request

📝 License

MIT License - feel free to use this project for your own purposes.

🆘 Troubleshooting

Backend won't start:

  • Check if port 4000 is available
  • Verify Node.js version (18+)
  • Check .env file configuration

Frontend can't connect to backend:

  • Ensure backend is running on port 4000
  • Check CORS configuration (see the sketch below)
  • Verify API_BASE_URL in frontend
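
If CORS is the cause, the backend needs to allow the frontend origin explicitly. A minimal sketch, assuming the backend is an Express app (as the examples above suggest) and the cors middleware package is installed (an assumption, not confirmed by this README):

// hypothetical excerpt from app.js
const express = require('express');
const cors = require('cors');

const app = express();
app.use(express.json());
// allow the React dev server origin; adjust for production deployments
app.use(cors({ origin: 'http://localhost:3000' }));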

Docker issues:

  • Ensure Docker is running
  • Check port conflicts
  • Verify environment variables in docker-compose.yml

Happy coding! 🎉