Cursor-history-MCP

Vectorize your Cursor chat history and serve it via a simple search API.

The Cursor Chat History Vectorizer & Dockerized Search MCP makes your Cursor chat history searchable and usable for Retrieval-Augmented Generation (RAG) or other LLM-based analysis. It extracts chat history from local Cursor IDE data, generates text embeddings for user prompts with a locally running Ollama instance, and stores the embeddings in a LanceDB vector database. A Dockerized FastAPI application, referred to as the MCP server, exposes a simple API endpoint for searching that database. The goal is twofold: store user prompts as vector embeddings efficiently in LanceDB, and provide a simple, accessible server for vector similarity searches against your vectorized history.
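The extraction step can be sketched with the standard library alone. This is an illustrative sketch, not the project's actual code: it assumes the common VS Code layout in which `state.vscdb` contains an `ItemTable` with `key`/`value` columns, and that the value stored under `aiService.prompts` is a JSON array of objects with a `text` field.

```python
import json
import sqlite3
from pathlib import Path


def extract_prompts(db_path):
    """Read user prompts from one state.vscdb file.

    Assumption: an ItemTable(key, value) schema where the value for
    'aiService.prompts' is a JSON array of objects carrying a 'text' field.
    """
    conn = sqlite3.connect(db_path)
    try:
        row = conn.execute(
            "SELECT value FROM ItemTable WHERE key = 'aiService.prompts'"
        ).fetchone()
    finally:
        conn.close()
    if row is None:
        return []
    return [p["text"] for p in json.loads(row[0]) if "text" in p]


def scan_workspace_storage(root):
    """Yield a record per prompt from every state.vscdb under a storage root."""
    for db_file in Path(root).rglob("state.vscdb"):
        for text in extract_prompts(db_file):
            # Fields mirror what the project stores alongside each embedding.
            yield {"text": text, "source_file": str(db_file), "role": "user"}
```

Each yielded record would then be passed to Ollama for embedding before being written to LanceDB.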

Features

  • Data Extraction: Scans specified Cursor workspace storage paths for `state.vscdb` SQLite files.
  • Prompt Extraction: Extracts user prompts from the `aiService.prompts` key within the database files.
  • Embedding Generation: Uses a locally running Ollama instance to generate embeddings for extracted prompts.
  • Vector Database Storage: Stores original text, source file, role, and vector embeddings in a LanceDB database.
  • Dockerized Search: Includes a `Dockerfile` to build a container for the FastAPI search server.