mtib/flash-cards-mcp
FlashCardsMCP
This is a dockerized Python Model Context Protocol (MCP) server for managing flash card projects. It uses OpenAI embeddings and SQLite for semantic search and storage.
Features
- List all project names and ids
- Semantic search for project by name (using OpenAI embeddings)
- Get random flash card by project id
- Add flash card to project (with question, answer, optional hint, optional description)
- List all flash cards by project
- Semantic search for flash cards by query (using OpenAI embeddings)
- Global semantic search for cards across all projects
- Retrieve a card by its id
- All API/tool responses include a `type` field: `project` or `card` (example shapes below)
- No binary embedding data is ever returned in API responses
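
For orientation, a project and a card response might look roughly like the following sketch; aside from `type` and the card fields listed above, the exact key names and id types are assumptions, not the repository's documented schema:

```python
# Illustrative response shapes only -- key names other than "type",
# "question", "answer", "hint", and "description" are assumptions.
example_project = {"type": "project", "id": 1, "name": "Spanish vocabulary"}

example_card = {
    "type": "card",
    "id": 42,
    "project_id": 1,
    "question": "How do you say 'apple' in Spanish?",
    "answer": "manzana",
    "hint": None,         # optional
    "description": None,  # optional
    # no binary embedding data is ever included
}
```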
API/Tool Design
- All tools raise `ValueError` for not-found or empty results (see the sketch below)
- Project and card creation tools return the full object, not just the id
- See `.github/copilot-instructions.md` for code generation rules
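
As a rough illustration of these conventions (not the repository's actual code: the tool body and the `db.fetch_card` helper are hypothetical stand-ins for whatever `main.py` and `db.py` provide), a tool could look like this:

```python
from mcp.server.fastmcp import FastMCP

import db  # the repo's db.py; fetch_card below is a hypothetical helper name

mcp = FastMCP("FlashCardsMCP")

@mcp.tool()
def get_card_by_id(card_id: int) -> dict:
    """Retrieve a card by its id; raises ValueError if it does not exist."""
    card = db.fetch_card(card_id)
    if card is None:
        raise ValueError(f"Card {card_id} not found")
    card.pop("embedding", None)      # binary embedding data is never returned
    return {"type": "card", **card}  # full object, not just the id
```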
Getting Started
- Install dependencies: `pip install -r requirements.txt`
- Run the server: `python main.py`
- Build the Docker image: `docker build -t flash-card-mcp .`
- Run with database persistence (recommended): `docker run -v $(pwd)/storage:/app/storage flash-card-mcp`
Environment Variables
- `OPENAI_API_KEY`: Required. Set this environment variable to your OpenAI API key to enable embedding generation. You must set it before running the server or the Docker container, for example: `export OPENAI_API_KEY=sk-...your-key...` (a minimal startup check is sketched below)
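
Inside the server the key only needs to be visible to the OpenAI client, so a fail-fast check like the following is enough (an illustration; `main.py` may handle this differently):

```python
import os

# Abort early with a clear message if the key is missing (illustrative check).
if not os.environ.get("OPENAI_API_KEY"):
    raise RuntimeError("OPENAI_API_KEY is not set; embeddings cannot be generated")
```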
Usage
This server exposes its API via the Model Context Protocol (MCP) using FastMCP. You can call the following tools:
- `get_all_projects()` → List all projects
- `add_project(name)` → Create a new project (returns full project dict)
- `search_project_by_name(name)` → Semantic search for a project (returns full project dict)
- `get_random_card_by_project(project_id)` → Get a random card from a project
- `add_card(project_id, question, answer, hint=None, description=None)` → Add a card (returns full card dict)
- `get_all_cards_by_project(project_id)` → List all cards in a project
- `search_cards_by_embedding(project_id, query)` → Semantic search for cards in a project
- `global_search_cards_by_embedding(query)` → Semantic search for cards across all projects
- `get_card_by_id(card_id)` → Retrieve a card by its id
All returned objects include a `type` field and never include binary embedding data. A minimal client example follows.
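
Over MCP these tools are called by a client rather than imported directly. Below is a minimal sketch using the MCP Python SDK's stdio client; the launch command, project id, and query strings are placeholder assumptions:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch the server over stdio (swap in your `docker run ...` command if preferred).
    params = StdioServerParameters(command="python", args=["main.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            project = await session.call_tool("add_project", {"name": "Spanish"})
            # project_id=1 is a placeholder; read the real id from the add_project result
            cards = await session.call_tool(
                "search_cards_by_embedding",
                {"project_id": 1, "query": "fruit vocabulary"},
            )
            print(project, cards)

asyncio.run(main())
```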
Development
- All project and card data is stored in SQLite (`database.db`)
- Embeddings are generated using OpenAI's `text-embedding-ada-002` model (see the sketch below)
- The server is implemented in `main.py` and `db.py`
- See `.github/copilot-instructions.md` for code and API rules
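
For reference, generating an embedding with `text-embedding-ada-002` via the current OpenAI Python client looks like this (a sketch; the wrapper in `main.py`/`db.py` and the client version it targets may differ):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def embed(text: str) -> list[float]:
    """Return the embedding vector for a piece of text."""
    response = client.embeddings.create(model="text-embedding-ada-002", input=text)
    return response.data[0].embedding
```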
Inspector
npx @modelcontextprotocol/inspector docker run -e OPENAI_API_KEY=sk-...your-key... -v /<path>/storage:/app/storage --rm -i flash-card-mcp
For more details, see the code and docstrings in `main.py` and `db.py`.