manukall/langoustine_mcp

Langoustine MCP Server

⚠️ Work in Progress:
Langoustine MCP Server is under active development. Features, APIs, and behaviors may change.

An intelligent Model Context Protocol (MCP) server that automatically learns and applies developer coding preferences to enhance LLM interactions in coding environments like Cursor, Claude Code, and other AI assistants.

What is Langoustine?

Langoustine is an MCP server that seamlessly integrates with coding agents to automatically track, learn, and apply developer preferences without requiring explicit instruction management. It helps AI assistants remember your coding style, patterns, and preferences across sessions.

Key Benefits

🧠 Automatic Learning: Recognizes when you give generalizable instructions and stores them for future use
🎯 Context-Aware: Retrieves relevant coding rules based on your current development task
🔄 Continuous Improvement: Tracks how often you need to repeat instructions and prioritizes frequently needed rules
📊 Vector-Based Matching: Uses semantic similarity to find the most relevant guidelines for your current context
🗄️ Persistent Memory: Stores your preferences in a local SQLite database that persists across sessions

Features

Core Capabilities

  • Smart Instruction Recognition: Automatically detects generalizable developer instructions
  • Rule Generation: Transforms specific instructions into reusable, abstract rules
  • Context-Aware Retrieval: Finds relevant rules based on semantic similarity to current tasks
  • Usage Tracking: Monitors how often rules are applied to improve relevance scoring
  • Vector Embeddings: Uses OpenAI embeddings for intelligent rule matching
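
The last two capabilities can be sketched as cosine similarity over embedding vectors: rank stored rules by how close their embeddings are to the embedding of the current task. This is an illustrative sketch, not Langoustine's actual internals; the `StoredRule` shape and function names are assumptions.

```typescript
// Illustrative sketch of embedding-based rule retrieval (not Langoustine's
// actual implementation). Rules are ranked by cosine similarity between the
// current task's embedding and each rule's stored embedding.

interface StoredRule {
  text: string;
  embedding: number[];
}

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Return rules sorted from most to least relevant to the task embedding.
function rankRules(taskEmbedding: number[], rules: StoredRule[]): StoredRule[] {
  return [...rules].sort(
    (r1, r2) =>
      cosineSimilarity(taskEmbedding, r2.embedding) -
      cosineSimilarity(taskEmbedding, r1.embedding)
  );
}
```

In practice the embeddings would come from the configured OpenAI embedding model, and usage-frequency tracking could further reweight the similarity scores.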

MCP Tools

Langoustine provides two main tools for AI assistants:

  1. rememberDeveloperInstruction: Stores new developer instructions and generates corresponding rules
  2. getRelevantRules: Retrieves relevant coding rules based on the current development context
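
Since MCP tool calls are JSON-RPC 2.0 messages, a client invokes these tools with a `tools/call` request. The shape below follows the MCP protocol; the argument name (`context`) is illustrative, as the exact input schema is defined by the server.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "getRelevantRules",
    "arguments": {
      "context": "writing unit tests for the payments module"
    }
  }
}
```

AI assistants that support MCP (such as Cursor) issue these requests automatically; you normally never write them by hand.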

Example Use Cases

  • "Always use TypeScript for new files" → Applied when creating new files
  • "Don't mock internal modules" → Applied when writing unit tests
  • "Use PascalCase for components" → Applied when working on React components
  • "Add error handling to API calls" → Applied when implementing API integrations
  • "Follow DRY principles" → Applied across all development contexts

Installation

Prerequisites

  • Node.js: Version 16 or higher
  • OpenAI API Key: Required for rule generation and embeddings

Setup

  1. Clone the repository:

    git clone <repository-url>
    cd langoustine-mcp
    
  2. Install dependencies:

    npm install
    
  3. Build the project:

    npm run build
    
  4. Set up your OpenAI API key:

    export LANGOUSTINE_MCP_OPENAI_API_KEY=your-api-key-here
    

Usage

Basic Usage

Start the MCP server:

# Using npm scripts (recommended)
npm start

Configuration Options

Command Line Arguments

# Specify custom database path
node build/index.js --db /path/to/custom/database.db

# Use different OpenAI models
node build/index.js --llm-model gpt-4 --embedding-model text-embedding-ada-002

# Adjust retry settings
node build/index.js --llm-max-retries 5 --embedding-max-retries 3

Environment Variables

# Database location
export LANGOUSTINE_DB_PATH="/path/to/database.db"

# OpenAI configuration
export LANGOUSTINE_MCP_OPENAI_API_KEY="your-api-key"
export LLM_MODEL="gpt-4"
export OPENAI_EMBEDDING_MODEL="text-embedding-3-small"

# Retry configuration
export LLM_MAX_RETRIES=3
export EMBEDDING_MAX_RETRIES=3

Available Options
| Option | Environment Variable | Default | Description |
| --- | --- | --- | --- |
| `--db`, `--database` | `LANGOUSTINE_DB_PATH` | `./.langoustine/langoustine.db` | Database file path |
| `--openai-api-key` | `LANGOUSTINE_MCP_OPENAI_API_KEY` | - | OpenAI API key (required) |
| `--llm-model` | `LLM_MODEL` | `gpt-5-mini-2025-08-07` | LLM model for rule generation |
| `--embedding-model` | `OPENAI_EMBEDDING_MODEL` | `text-embedding-3-small` | Embedding model for similarity |
| `--llm-max-retries` | `LLM_MAX_RETRIES` | `3` | Maximum LLM retry attempts |
| `--embedding-max-retries` | `EMBEDDING_MAX_RETRIES` | `3` | Maximum embedding retry attempts |

Integration with Cursor

To use Langoustine with Cursor, you'll need to configure it as an MCP server in your Cursor settings. The server communicates via stdio and provides the tools mentioned above to enhance your coding experience. Here is an example configuration:

  "mcpServers": {
    "langoustine": {
      "command": "npx",
      "args": [
        "langoustine-mcp",
        "--db",
        "/path/to/your/project/.langoustine.db"
      ],
      "env": {
        "LANGOUSTINE_MCP_OPENAI_API_KEY": "your-api-key"
      }
    },
    ...
  }
}

It also helps to create a Cursor rule that instructs the assistant to use the Langoustine tools, so rules are stored and retrieved without you having to ask each time.
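
A minimal example of such a rule (the wording is illustrative, not prescribed by Langoustine):

```text
Before starting a coding task, call the langoustine getRelevantRules tool
with a short description of the task and apply any rules it returns.
When the user gives a generalizable instruction about coding style,
tooling, or process, call rememberDeveloperInstruction to store it.
```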

Help

Display help information:

node build/index.js --help

Development

Development Setup

# Install dependencies
npm install

# Run in development mode (builds and starts)
npm run dev

# Or run with tsx for faster development
npm run tsx

Testing

# Run unit tests
npm test

# Run integration tests (requires LANGOUSTINE_MCP_OPENAI_API_KEY)
npm run test:integration

# Run all tests
npm run test:all

Code Quality

# Lint code
npm run lint

# Format code
npm run format

# Check formatting
npm run format:check

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests for new functionality
  5. Ensure all tests pass
  6. Submit a pull request

License