Hypermodel

Hypermodel provides a seamless way to enhance coding agents with contextually relevant and auto-updated documentation.

Overview

Context is everything for coding agents.

Make your coding agents better with the right documentation, auto-updated for you and in context, all the time.

Auto-install in one command

npx -y -p @hypermodel/cli add-docs claude

Usage with an AI coding agent like Claude Code or Amp

Use the @hypermodel/docs MCP server with your favourite AI coding agent.
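If you prefer to register the server manually instead of using the one-command installer above, something along these lines may work with Claude Code's MCP CLI. Treat it as a sketch: claude mcp add is Claude Code's standard way to register an MCP server, but the exact launch command for @hypermodel/docs (assumed here to be npx -y @hypermodel/docs) is an assumption; the add-docs installer above remains the supported path.

# Hypothetical manual registration with Claude Code (server launch command is an assumption)
claude mcp add hypermodel-docs -- npx -y @hypermodel/docs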

Common Usage Patterns

Examples:

  1. Can you search the docs for how to use "contact" objects?

  2. Explain amp.tools.stopTimeout and its default value from the ampcode docs. Use the docs tool.

Quick Start Flow

  1. Link to your scope (optional): Use the link tool to associate your session with a user or team. The default scope is 'user'.
  2. Check available docs: Use list-docs to see what documentation is available in your current scope.
  3. Search documentation: Use search-docs with the index name, your query, and an optional result count.
  4. Create new indices (if needed): Use the index tool to index a new documentation source if it is not already present.

Tip: When indexing a new source, use the base documentation URL (https://supabase.com/docs) rather than a deep link (https://supabase.com/docs/guides/functions/dependencies).

Scope and Access Management

Scope Types

  • user (default): Personal documentation access
  • team: Shared team documentation access

Linking Workflow

  1. Link to a user: Use the link tool with a user identifier
  2. Link to a team: Use the link tool with a team identifier and scope='team'
  3. Once linked, all list-docs, search-docs, and index operations work within that scope
  4. Each user has access to docs based on their permissions and scope context

Embedding Provider Configuration

The system supports both OpenAI and Google Gemini for generating embeddings. By default, it uses OpenAI, but you can configure it to use Gemini instead.

Environment Variables

  • EMBEDDING_PROVIDER: Embedding provider to use, openai or gemini. Default: openai
  • OPENAI_API_KEY: OpenAI API key (required when using OpenAI). Default: none
  • OPENAI_EMBEDDING_MODEL: OpenAI embedding model. Default: text-embedding-3-small
  • OPENAI_EMBEDDING_DIMENSIONS: OpenAI embedding dimensions. Default: 512
  • GEMINI_API_KEY: Google Gemini API key (required when using Gemini). Default: none
  • GEMINI_EMBEDDING_MODEL: Gemini embedding model. Default: gemini-embedding-001
  • GEMINI_EMBEDDING_DIMENSIONS: Gemini embedding dimensions. Default: 768

Examples

To use OpenAI (default):

export OPENAI_API_KEY="your-openai-key"
# EMBEDDING_PROVIDER defaults to 'openai'

To use Google Gemini:

export EMBEDDING_PROVIDER="gemini"
export GEMINI_API_KEY="your-gemini-key"

Note: When switching embedding providers, ensure that existing vector stores are compatible with the new embedding dimensions, or re-index your documentation with the new provider.
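For example, a fuller Gemini setup that pins the model and dimensions to the defaults listed in the table above (useful for being explicit about which dimensions your vector store was built with) could look like this:

export EMBEDDING_PROVIDER="gemini"
export GEMINI_API_KEY="your-gemini-key"
# Optional: pin the model and dimensions explicitly (defaults shown)
export GEMINI_EMBEDDING_MODEL="gemini-embedding-001"
export GEMINI_EMBEDDING_DIMENSIONS="768"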

Tools exposed for your coding agents

  • link: Link to a user or team to set your scope context. Result: links your session to a user/team for scoped documentation access.
  • list-docs: Check what documentation is available in your current scope. Result: { "indexes": ["ampcode-com", "developer-salesforce-com"] }
  • search-docs: Search documentation for answers in your current scope. Result: ranked results with URLs, titles, snippets, and relevance scores.
  • index: Add a new documentation source if not already present. Result: creates a searchable index from the documentation site.
  • index-status: Check detailed status and progress of indexing jobs. Result: real-time progress, duration, and error details for indexing workflows.
  • list-indexing-jobs: List recent indexing jobs for your current scope. Result: history of indexing jobs with status, progress, and timing info.

Tips for Best Results

  • Use natural-language queries and include the phrase "search docs" anywhere in your prompt so the MCP tool is invoked.
  • Start with broader queries, then narrow down based on the results.
  • Results include relevance scores to help identify the most useful content.

All IDEs supported
  • Cursor (cursor)
    • Example:
      npx -y -p @hypermodel/cli add-docs cursor
      
  • VS Code (vscode)
    • Example:
      npx -y -p @hypermodel/cli add-docs vscode
      
  • Amp (amp)
    • Example:
      npx -y -p @hypermodel/cli add-docs amp
      
Don't see your IDE?

Docker Compose Setup Guide

This guide explains how to run all docs-mcp services using Docker Compose.

Services

The docker-compose configuration includes:

  1. mcp-server - Main MCP server (port 3001)
  2. api-server - PDF extraction API (port 3000)
  3. temporal-worker - Temporal workflow worker scaled to 5 instances (no exposed port)

Note: The PostgreSQL database is expected to be hosted externally (e.g., Amazon RDS with the pgvector extension).
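Once you have the repository checked out, you can list the services Docker Compose knows about to confirm they match the list above (works with both docker-compose v1 and the docker compose v2 plugin):

# List the services defined in docker-compose.yml
docker-compose config --services
# Expected (per the list above): mcp-server, api-server, temporal-worker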

Prerequisites
  • Docker and Docker Compose installed
  • Environment variables configured (see below)

Environment Variables

Create a .env file in the project root with the following variables:

Required Variables
# OpenAI API Key (required)
OPENAI_API_KEY=your-openai-api-key-here

# PostgreSQL Connection String (required - should point to your Amazon RDS instance with pgvector)
POSTGRES_CONNECTION_STRING=postgresql://username:password@your-rds-endpoint:5432/database_name

# Temporal Configuration (required for temporal-worker)
TEMPORAL_ADDRESS=your-temporal-address:7233

Optional Variables
# Embedding Provider Configuration
EMBEDDING_PROVIDER=openai  # or 'gemini'
OPENAI_EMBEDDING_MODEL=text-embedding-3-small
OPENAI_EMBEDDING_DIMENSIONS=512

# Google Gemini (if using EMBEDDING_PROVIDER=gemini)
GEMINI_API_KEY=your-gemini-api-key-here
GEMINI_EMBEDDING_MODEL=gemini-embedding-001
GEMINI_EMBEDDING_DIMENSIONS=768

# WorkOS (for authentication)
WORKOS_API_KEY=your-workos-api-key-here

# Temporal Additional Configuration
TEMPORAL_NAMESPACE=default
TEMPORAL_TLS_CERT=
TEMPORAL_TLS_KEY=
TEMPORAL_API_KEY=

# Documentation Crawler Configuration
DOCS_MAX_PAGES=1000
DOCS_CONCURRENCY=5
DOCS_TIMEOUT_MS=30000
DOCS_USER_AGENT=HypermodelDocsBot/1.0
DOCS_INCLUDE_REGEX=
DOCS_EXCLUDE_REGEX=

Database Setup

Make sure your Amazon RDS PostgreSQL instance has:

  • pgvector extension enabled: Run CREATE EXTENSION IF NOT EXISTS vector; on your database (see the sketch after this list)
  • Proper security group rules: Allow connections from your Docker host
  • Connection string in .env: Set POSTGRES_CONNECTION_STRING with your RDS credentials
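If psql is available on your Docker host, enabling the extension is a one-liner; a minimal sketch, assuming POSTGRES_CONNECTION_STRING is already exported in your shell:

# Enable pgvector on the external database (no-op if it is already installed)
psql "$POSTGRES_CONNECTION_STRING" -c "CREATE EXTENSION IF NOT EXISTS vector;"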

Usage

Start All Services

docker-compose up -d

Start Specific Services
# Start only MCP server
docker-compose up -d mcp-server

# Start only API server
docker-compose up -d api-server

# Start only temporal workers (all 5 instances)
docker-compose up -d temporal-worker

Scale Workers Dynamically

The temporal worker is configured to run 5 instances by default, but you can adjust this:

# Scale to a different number of workers
docker-compose up -d --scale temporal-worker=10

# Scale back down
docker-compose up -d --scale temporal-worker=3

View Logs
# All services
docker-compose logs -f

# Specific service
docker-compose logs -f mcp-server
docker-compose logs -f api-server

# All temporal worker instances
docker-compose logs -f temporal-worker
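To see the status of everything in the compose project, including how many temporal-worker replicas are currently running:

# Show container status for all services
docker-compose ps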

Stop Services
docker-compose down

Stop and Remove Volumes
docker-compose down -v

Service Access

Once running, you can access:

  • mcp-server: the MCP server on port 3001 of the Docker host
  • api-server: the PDF extraction API on port 3000 of the Docker host
  • temporal-worker: no exposed port (background workers only)
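As a quick reachability check from the Docker host, you can hit the two exposed ports. This is only a sketch: the root path and the HTTP status it returns are assumptions, so read the result as "the port answers", not as a health check.

# Confirm the exposed ports respond (status codes depend on the servers' routes)
curl -s -o /dev/null -w "mcp-server: HTTP %{http_code}\n" http://localhost:3001/
curl -s -o /dev/null -w "api-server: HTTP %{http_code}\n" http://localhost:3000/
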
Building Images

If you make changes to the code, rebuild the images:

# Rebuild all images
docker-compose build

# Rebuild specific service
docker-compose build mcp-server
docker-compose build api-server
docker-compose build temporal-worker

Troubleshooting

Services won't start

Check logs for specific error messages:

docker-compose logs -f [service-name]

Database connection issues

Verify your RDS connection string is correct and accessible:

# Test connection from your Docker host
psql "$POSTGRES_CONNECTION_STRING" -c "SELECT version();"

# Check if pgvector extension is installed
psql "$POSTGRES_CONNECTION_STRING" -c "SELECT * FROM pg_extension WHERE extname = 'vector';"

Common issues:

  • Security groups: Ensure your RDS security group allows inbound connections from your Docker host IP (see the connectivity check after this list)
  • VPC settings: If using VPC, ensure proper network configuration
  • Credentials: Verify username, password, and database name in connection string
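For the security group and VPC cases in particular, a plain TCP check from the Docker host tells you quickly whether the endpoint is reachable at all (replace the hostname with the one from your connection string):

# Test raw TCP connectivity to the RDS endpoint on the PostgreSQL port
nc -zv your-rds-endpoint 5432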

Environment variable issues

Verify your .env file exists and contains required variables:

cat .env
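If cat shows the file but something still seems off, a quick grep for the variable names from the Required Variables section confirms they are at least present; this sketch does not validate the values themselves:

# Check that the required variables are defined in .env
grep -E '^(OPENAI_API_KEY|POSTGRES_CONNECTION_STRING|TEMPORAL_ADDRESS)=' .env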

Reset Everything

To restart all services:

docker-compose down
docker-compose up -d

Note: This does not affect your Amazon RDS database. Database data persists independently.

Production Considerations

For production deployments, consider:

  1. Use secrets management for API keys instead of .env files (AWS Secrets Manager, HashiCorp Vault, etc.)
  2. Enable SSL/TLS for RDS connections and update the connection string accordingly (see the example after this list)
  3. Configure resource limits for each service in docker-compose.yml
  4. Set up RDS automated backups and point-in-time recovery
  5. Use a proper Temporal server (Temporal Cloud or self-hosted cluster)
  6. Implement monitoring and logging solutions (CloudWatch, Datadog, etc.)
  7. Use a reverse proxy (nginx/traefik/ALB) for HTTPS
  8. Scale workers dynamically based on Temporal queue depth
  9. Use container orchestration (ECS, Kubernetes) instead of plain docker-compose
  10. Implement health checks and auto-restart policies
  11. Set up VPC peering if services and RDS are in different VPCs
  12. Use RDS Proxy for connection pooling with multiple worker instances
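For item 2, the usual way to require TLS in a PostgreSQL connection string is the sslmode parameter; whether additional CA configuration is needed depends on your driver and RDS setup, so treat this as a starting point rather than a complete recipe:

# Example .env entry requiring SSL for the RDS connection (verify against your driver's TLS options)
POSTGRES_CONNECTION_STRING=postgresql://username:password@your-rds-endpoint:5432/database_name?sslmode=require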

Architecture

                         ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”
                         │   Temporal Cloud     │
                         │   (or self-hosted)   │
                         ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”¬ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜
                                    │
ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”                │
│   MCP Server    │ :3001          │
ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”¬ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜                │
         │                          │
         │                ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā–¼ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”
ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā–¼ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”       │  Temporal Workers  │
│   API Server    │       │   (5 instances)    │
ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”¬ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜       ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”¬ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜
         │ :3000                     │
         │                           │
         │                           │
         │         ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā–¼ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”
         └────────►│  Amazon RDS PostgreSQL       │
                   │  with pgvector extension     │
                   ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜

  • Services communicate over the docs-mcp-network Docker network (see the inspection commands after this list)
  • All services connect to external Amazon RDS database
  • Temporal workers (5 instances) process indexing jobs in parallel
  • Workers scale independently from API/MCP servers
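To see which containers ended up on that network once the stack is up (Compose usually prefixes the network name with the project directory, so list it first):

# Find the network (Compose may prefix the name with the project directory)
docker network ls --filter name=docs-mcp-network
# Inspect it to see which containers are attached (adjust the name to match the output above)
docker network inspect docs-mcp-network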

Contributing

Looking to contribute? All kinds of help are highly appreciated.

Check out our contribution guidelines for more.