Skills MCP Server

A Model Context Protocol (MCP) server for managing AI skills with LLM-generated navigation indexes.

Features

  • 🎯 Multi-skill support - Manage multiple skill packages
  • 🧠 LLM-generated indexes - Smart navigation maps created by any LLM via LiteLLM
  • 📖 Progressive disclosure - Load only what you need (90% token savings)
  • 🔧 Unix-based search - Fast grep/sed/awk tools (5-20ms latency)
  • 🌐 Model-agnostic - Works with Claude, GPT-4, Llama, or any LLM
  • 📦 Zero ML dependencies - No embeddings, no vector DBs needed

Quick Start with uvx

# Run directly from GitHub (no installation!)
uvx --from git+https://github.com/odewahn/skills-mcp-server skills-server

# Or install permanently
uv tool install git+https://github.com/odewahn/skills-mcp-server
skills-server

Installation

Option 1: uvx (Recommended)

# Run directly
uvx --from git+https://github.com/odewahn/skills-mcp-server skills-server

# Install as tool
uv tool install git+https://github.com/odewahn/skills-mcp-server

Option 2: pip

pip install git+https://github.com/odewahn/skills-mcp-server
skills-server

Option 3: From Source

git clone https://github.com/odewahn/skills-mcp-server
cd skills-mcp-server
uv pip install -e .
skills-server

Configuration

Set your API keys as environment variables:

export ANTHROPIC_API_KEY="your-key-here"
export OPENAI_API_KEY="your-key-here"
# etc.

Optional environment variables:

export SKILLS_DIR="./my-skills"  # Custom skills directory
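
For reference, resolving that variable from Python looks like this (the `./skills` fallback here is an assumption for illustration, not the server's documented default):

```python
import os
from pathlib import Path

# Use SKILLS_DIR if set; otherwise fall back to a default directory
# (the "./skills" default is an assumption, not the documented value)
skills_dir = Path(os.environ.get("SKILLS_DIR", "./skills"))
print(skills_dir)
```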

Usage

Start the Server

skills-server

Use from Python

import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def demo():
    server = StdioServerParameters(command="skills-server")

    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Install a skill with an LLM-generated index
            result = await session.call_tool("install_skill", {
                "skill_file_path": "./python-learning.skill",
                "generate_index": True,
                "model": "claude-3-5-sonnet-20241022",
                "provider": "anthropic"
            })
            print(f"Installed: {result}")

            # Get the navigation index
            index = await session.call_tool("get_skill_index", {
                "skill_name": "python-learning"
            })
            print(f"Index: {index.content[0].text[:500]}...")

            # Fetch specific content based on the index
            content = await session.call_tool("fetch_section", {
                "skill_name": "python-learning",
                "file_name": "references/topic_catalog.md",
                "section_title": "Lists"
            })
            print(f"Content: {content}")

asyncio.run(demo())

Available Tools

The server provides the following MCP tools:

Installation & Management

  • install_skill - Install .skill file and generate index
  • list_skills - Show all installed skills
  • regenerate_index - Regenerate index with different model

Discovery

  • search_skills - Find skills matching query
  • get_skill_index - Load the navigation index (auto-generated if missing)
  • search_index - Search within the navigation index

Content Access

  • load_skill - Get SKILL.md instructions
  • list_sections - Table of contents
  • fetch_section - Extract specific section
  • search_content - Search with context
  • grep_skill - Raw grep search
  • find_code_blocks - Find code examples
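
At their core, the content-access tools pull a named section out of a markdown file. A minimal sketch of that idea in plain Python (the server's actual fetch_section uses Unix tools and may behave differently):

```python
import re

def fetch_section(markdown: str, section_title: str) -> str:
    """Return the body under a heading, up to the next heading of any level."""
    # Match the heading line, then lazily capture until the next heading or EOF
    pattern = rf"^#+\s+{re.escape(section_title)}\s*$\n(.*?)(?=^#|\Z)"
    m = re.search(pattern, markdown, re.MULTILINE | re.DOTALL)
    return m.group(1).strip() if m else ""

doc = "## Lists\nLists are ordered.\n\n## Dictionaries\nMaps keys to values.\n"
print(fetch_section(doc, "Lists"))  # Lists are ordered.
```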

How It Works

1. Install Skill

await install_skill("./python-learning.skill", model="claude-3-5-sonnet-20241022")

The server:

  • Extracts the .skill file (zip)
  • Uses Unix tools (grep/sed) to analyze structure
  • Calls LLM to generate smart INDEX.md
  • Caches index for future use
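
The extraction step can be sketched with the standard library alone, since a .skill file is an ordinary zip archive (the file names below are made up for illustration):

```python
import io
import zipfile

# Build a tiny in-memory .skill archive (a .skill file is just a zip)
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("SKILL.md", "# Python Learning\nInstructions...")
    zf.writestr("references/topic_catalog.md", "## Lists\n...")

# Extraction step: list the markdown files the indexer would analyze
with zipfile.ZipFile(buf) as zf:
    md_files = [n for n in zf.namelist() if n.endswith(".md")]

print(md_files)  # ['SKILL.md', 'references/topic_catalog.md']
```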

2. Agent Uses Index

# Agent loads small index first (500 tokens)
index = await get_skill_index("python-learning")

# Index tells agent: "For lists, see topic_catalog.md lines 100-155"

# Agent fetches only what it needs (250 tokens)
content = await fetch_section("python-learning", "topic_catalog.md", "Lists")

# Total: 750 tokens vs. 7,500 tokens without index (90% savings!)

3. Progressive Disclosure

Without index:
  Load everything → 30KB → 7,500 tokens → 95% irrelevant

With index:
  INDEX.md → 2KB → 500 tokens
  Specific section → 1KB → 250 tokens
  Total → 3KB → 750 tokens → 100% relevant
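
The arithmetic above, spelled out:

```python
# Token budget comparison using the figures quoted above
full_load = 7_500   # tokens to load the whole skill up front
index = 500         # tokens for INDEX.md
section = 250       # tokens for one fetched section

with_index = index + section
savings = 1 - with_index / full_load
print(f"{with_index} tokens vs {full_load} ({savings:.0%} savings)")
```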

Example Index

The LLM generates navigation maps like this:

# Python Learning Skill - Navigation Index

## Skill Overview
Interactive Python education for beginners to intermediate level...

## Topic Index
### Lists
- Concepts: references/topic_catalog.md, section "Lists", lines 100-130
- Exercises: references/topic_catalog.md, lines 131-145
- Challenges: references/topic_catalog.md, lines 146-155

### Dictionaries  
- Concepts: references/topic_catalog.md, section "Dictionaries", lines 160-185
...

## Navigation Hints
For teaching: Load SKILL.md → fetch relevant topic section
For practice: Get challenge ideas from topic_catalog.md
...

Model Support

Works with any LLM via LiteLLM:

# Anthropic Claude (best for complex skills)
model="claude-3-5-sonnet-20241022", provider="anthropic"

# OpenAI GPT (fast and effective)
model="gpt-4", provider="openai"

# Local with Ollama (free!)
model="llama2", provider="ollama"

# 50+ providers supported

Cost Analysis

Index Generation (one-time per skill)

  • Claude Sonnet: ~$0.015/skill
  • GPT-4: ~$0.030/skill
  • GPT-3.5: ~$0.003/skill
  • Ollama: $0/skill

Runtime (every query)

  • Index cached (prompt caching → $0)
  • Only pay for fetched sections
  • 90% token savings
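
A back-of-envelope check on when the one-time index cost pays for itself, using the per-skill estimate above. The per-token price is an assumed illustrative rate, not a quoted one:

```python
# One-time cost vs per-query savings (illustrative numbers)
index_cost = 0.015              # Claude Sonnet, one-time per skill (from above)
tokens_saved_per_query = 6_750  # 7,500 - 750 from the earlier example
price_per_token = 3e-6          # assumed $3 per million input tokens

savings_per_query = tokens_saved_per_query * price_per_token
break_even = index_cost / savings_per_query
print(f"${savings_per_query:.4f} saved per query; "
      f"index cost recovered within {break_even:.1f} queries")
```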

Requirements

  • Python 3.10+
  • Unix-like environment (macOS, Linux, WSL)
  • API keys for LLM providers (Anthropic, OpenAI, etc.)

Required:

  • ripgrep (rg) for search and indexing — no fallback

Architecture

Your Agent (any LLM)
  ↓ MCP Protocol
Skills Server
  ↓ LiteLLM (index generation)
  ↓ Unix tools (search/fetch)
Skills Directory
  ├── skill-1/
  │   ├── INDEX.md (LLM-generated)
  │   ├── SKILL.md
  │   └── references/
  └── skill-2/
      ├── INDEX.md
      ├── SKILL.md
      └── references/

Development

# Clone
git clone https://github.com/odewahn/skills-mcp-server
cd skills-mcp-server

# Install dependencies with uv
uv sync

# Run locally
uv run skills-server

# Run tests
uv run pytest

# Format code
uv run black src/
uv run ruff check src/

License

MIT

Contributing

Contributions welcome! Please:

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Submit a pull request

Acknowledgments

Built with:

  • FastMCP - MCP server framework
  • LiteLLM - Unified LLM API
  • Unix tools (grep, sed, awk, ripgrep)

Example Skills

This server is designed to work with Anthropic-style .skill files. Example skills:

  • python-learning.skill - Interactive Python teaching
  • python-assessment.skill - Python proficiency testing

See the examples/ directory.