Skills MCP Server
A Model Context Protocol (MCP) server for managing AI skills with LLM-generated navigation indexes.
Features
- 🎯 Multi-skill support - Manage multiple skill packages
- 🧠 LLM-generated indexes - Smart navigation maps created by any LLM via LiteLLM
- ⚡ Progressive disclosure - Load only what you need (90% token savings)
- 🔧 Unix-based search - Fast grep/sed/awk tools (5-20ms latency)
- 🌐 Model-agnostic - Works with Claude, GPT-4, Llama, or any LLM
- 📦 Zero ML dependencies - No embeddings, no vector DBs needed
Quick Start with uvx
# Run directly from GitHub (no installation!)
uvx --from git+https://github.com/odewahn/skills-mcp-server skills-server
# Or install permanently
uv tool install git+https://github.com/odewahn/skills-mcp-server
skills-server
Installation
Option 1: uvx (Recommended)
# Run directly
uvx --from git+https://github.com/odewahn/skills-mcp-server skills-server
# Install as tool
uv tool install git+https://github.com/odewahn/skills-mcp-server
Option 2: pip
pip install git+https://github.com/odewahn/skills-mcp-server
skills-server
Option 3: From Source
git clone https://github.com/odewahn/skills-mcp-server
cd skills-mcp-server
uv pip install -e .
skills-server
Configuration
Set your API keys as environment variables:
export ANTHROPIC_API_KEY="your-key-here"
export OPENAI_API_KEY="your-key-here"
# etc.
Optional environment variables:
export SKILLS_DIR="./my-skills" # Custom skills directory
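As a minimal sketch, this is how the server might resolve that setting; the `./skills` default and the function name are assumptions, not confirmed by the project:

```python
import os
from pathlib import Path

# Resolve the skills directory from SKILLS_DIR.
# The "./skills" fallback is an assumption for illustration.
def resolve_skills_dir() -> Path:
    return Path(os.environ.get("SKILLS_DIR", "./skills")).expanduser().resolve()
```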
Usage
Start the Server
skills-server
Use from Python
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def demo():
    server = StdioServerParameters(command="skills-server")

    # ClientSession takes the read/write streams from stdio_client
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Install a skill with an LLM-generated index
            result = await session.call_tool("install_skill", {
                "skill_file_path": "./python-learning.skill",
                "generate_index": True,
                "model": "claude-3-5-sonnet-20241022",
                "provider": "anthropic"
            })
            print(f"Installed: {result}")

            # Get the navigation index
            index = await session.call_tool("get_skill_index", {
                "skill_name": "python-learning"
            })
            print(f"Index: {str(index)[:500]}...")

            # Fetch specific content based on the index
            content = await session.call_tool("fetch_section", {
                "skill_name": "python-learning",
                "file_name": "references/topic_catalog.md",
                "section_title": "Lists"
            })
            print(f"Content: {content}")

asyncio.run(demo())
Available Tools
The server provides 11 MCP tools:
Installation & Management
- install_skill - Install a .skill file and generate its index
- list_skills - Show all installed skills
- regenerate_index - Regenerate an index with a different model
Discovery
- search_skills - Find skills matching a query
- get_skill_index - Load the navigation index (auto-generated if missing)
- search_index - Search within the navigation index
Content Access
- load_skill - Get SKILL.md instructions
- list_sections - Table of contents
- fetch_section - Extract a specific section
- search_content - Search with context
- grep_skill - Raw grep search
- find_code_blocks - Find code examples
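To illustrate the Unix-style extraction behind fetch_section, here is a hedged Python sketch (not the server's actual implementation, which shells out to grep/sed): it returns one markdown section, from its heading to the next heading of equal or higher level.

```python
import re

def fetch_section(text: str, title: str) -> str:
    """Return the markdown section with the given heading title,
    up to (not including) the next heading of equal or higher level."""
    out, level = [], None
    for line in text.splitlines():
        m = re.match(r"(#{1,6})\s+(.*)", line)
        if m:
            # A new heading at the same or higher level ends the section
            if level is not None and len(m.group(1)) <= level:
                break
            if m.group(2).strip() == title:
                level = len(m.group(1))
        if level is not None:
            out.append(line)
    return "\n".join(out)
```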
How It Works
1. Install Skill
await install_skill("./python-learning.skill", model="claude-3-5-sonnet-20241022")
The server:
- Extracts the .skill file (zip)
- Uses Unix tools (grep/sed) to analyze structure
- Calls LLM to generate smart INDEX.md
- Caches index for future use
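The extract-and-analyze steps above might look roughly like this sketch (the function name and the heading scan are assumptions; the real server uses Unix tools rather than pure Python):

```python
import re
import zipfile
from pathlib import Path

def extract_and_scan(skill_file: str, dest: str) -> list[str]:
    """Unpack a .skill archive (a zip) and collect markdown headings
    as raw material for the LLM index-generation prompt."""
    with zipfile.ZipFile(skill_file) as zf:
        zf.extractall(dest)
    headings = []
    for md in sorted(Path(dest).rglob("*.md")):
        for line in md.read_text().splitlines():
            if re.match(r"#{1,6}\s", line):
                headings.append(f"{md.relative_to(dest)}: {line.lstrip('# ').rstrip()}")
    return headings
```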
2. Agent Uses Index
# Agent loads small index first (500 tokens)
index = await get_skill_index("python-learning")
# Index tells agent: "For lists, see topic_catalog.md lines 100-155"
# Agent fetches only what it needs (250 tokens)
content = await fetch_section("python-learning", "topic_catalog.md", "Lists")
# Total: 750 tokens vs. 7,500 tokens without index (90% savings!)
3. Progressive Disclosure
Without index:
Load everything → 30KB → 7,500 tokens → 95% irrelevant
With index:
INDEX.md → 2KB → 500 tokens
Specific section → 1KB → 250 tokens
Total → 3KB → 750 tokens → 100% relevant
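The arithmetic behind those totals, using the figures above:

```python
# Token accounting for progressive disclosure, using the README's figures.
full_load = 7_500       # loading the whole 30KB skill package
index_load = 500        # INDEX.md only
section_load = 250      # one targeted section
with_index = index_load + section_load
savings = 1 - with_index / full_load
print(f"{with_index} vs {full_load} tokens -> {savings:.0%} saved")
```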
Example Index
The LLM generates navigation maps like this:
# Python Learning Skill - Navigation Index
## Skill Overview
Interactive Python education for beginners to intermediate level...
## Topic Index
### Lists
- Concepts: references/topic_catalog.md, section "Lists", lines 100-130
- Exercises: references/topic_catalog.md, lines 131-145
- Challenges: references/topic_catalog.md, lines 146-155
### Dictionaries
- Concepts: references/topic_catalog.md, section "Dictionaries", lines 160-185
...
## Navigation Hints
For teaching: Load SKILL.md → fetch relevant topic section
For practice: Get challenge ideas from topic_catalog.md
...
Model Support
Works with any LLM via LiteLLM:
# Anthropic Claude (best for complex skills)
model="claude-3-5-sonnet-20241022", provider="anthropic"
# OpenAI GPT (fast and effective)
model="gpt-4", provider="openai"
# Local with Ollama (free!)
model="llama2", provider="ollama"
# 50+ providers supported
Cost Analysis
Index Generation (one-time per skill)
- Claude Sonnet: ~$0.015/skill
- GPT-4: ~$0.030/skill
- GPT-3.5: ~$0.003/skill
- Ollama: $0/skill
Runtime (every query)
- Index cached (prompt caching → $0)
- Only pay for fetched sections
- 90% token savings
Requirements
- Python 3.10+
- Unix-like environment (macOS, Linux, WSL)
- API keys for LLM providers (Anthropic, OpenAI, etc.)
Required:
- ripgrep (rg) for search and indexing (no fallback)
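A startup check for this hard dependency could look like the following sketch (the function name and error message are illustrative):

```python
import shutil

# Fail fast if ripgrep is missing, since the server has no fallback.
def require_ripgrep() -> str:
    rg = shutil.which("rg")
    if rg is None:
        raise RuntimeError("ripgrep (rg) not found on PATH; install it first")
    return rg
```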
Architecture
Your Agent (any LLM)
↓ MCP Protocol
Skills Server
↓ LiteLLM (index generation)
↓ Unix tools (search/fetch)
Skills Directory
├── skill-1/
│ ├── INDEX.md (LLM-generated)
│ ├── SKILL.md
│ └── references/
└── skill-2/
├── INDEX.md
├── SKILL.md
└── references/
Development
# Clone
git clone https://github.com/odewahn/skills-mcp-server
cd skills-mcp-server
# Install dependencies with uv
uv sync
# Run locally
uv run skills-server
# Run tests
uv run pytest
# Format code
uv run black src/
uv run ruff check src/
License
MIT
Contributing
Contributions welcome! Please:
- Fork the repository
- Create a feature branch
- Make your changes
- Submit a pull request
Links
- Documentation: Full docs
- Issues: GitHub Issues
- MCP Specification: Model Context Protocol
- LiteLLM: LiteLLM Docs
Acknowledgments
Built with the Model Context Protocol Python SDK, LiteLLM, and ripgrep.
Example Skills
This server is designed to work with Anthropic-style .skill files. Example skills:
- python-learning.skill - Interactive Python teaching
- python-assessment.skill - Python proficiency testing
See the examples/ directory.