conjurdemos/Accelerator-AI-agent-demos
The Model Context Protocol (MCP) server in this repo connects AI agents to databases so they can query and manage data through natural language.
AI Agent Demos
These demos show various ways of connecting to databases from Python. The setup scripts install a local Psql database server for testing. The demos can also use a proxied connection through CyberArk Secure Infrastructure Access (SIA) if you have one configured; by default they use the local Psql server. All configurations are governed by the main config file for each database:
- psql-mcp.env - PostgreSQL (local & remote DBs)
- mssql-mcp.env - Microsoft SQL Server (remote DB only)
- orcl-mcp.env - Oracle (not currently working)
You can edit these files to change defaults and to add your CyberArk tenant values for SIA connections.
For PostgreSQL databases, the MCP server will connect to either a local or remote DB server as determined by the LOCAL_DB boolean variable in psql-mcp.env:
- true -> local DB
- false -> remote DB using SIA configuration
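For example, a minimal psql-mcp.env for the default local setup might look like the sketch below; the only variable name taken from this README is LOCAL_DB, so check the shipped psql-mcp.env for the actual SIA/tenant settings and their names.

    # Use the local Docker-hosted Psql server; set to false to route through SIA
    LOCAL_DB=true
    # When LOCAL_DB=false, your CyberArk tenant/SIA values go in this same file
    # (variable names as defined in the shipped psql-mcp.env)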
The Claude Code demo requires an Anthropic subscription.
The LangGraph demo does not use any MCP server and connects directly to a local or remote PostgreSQL database. It requires OpenAI and Tavily accounts and API keys. The Tavily account is free.
The other demos do not require accounts or API keys; however, Claude Desktop free accounts have token limits that, once reached, reset after a few hours.
0-setup
Scripts to install common demo dependencies and start a local Psql DB server.
Directory contents
- 0-setup.sh - Installs common system dependencies:
  - psql CLI
  - poetry package manager
  - Docker engine (not Desktop)
  - Pulls the Psql server Docker image
  - Node.js
- 1-start-local-db.sh - Starts and initializes local petclinic DB
- _stop-db.sh - Stops and removes server container
- db_*.sql - SQL scripts for initializing the local DB
You can use the psql CLI and the db_*.sql scripts to initialize cloud DBs.
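For example, to load the same schema into a cloud-hosted PostgreSQL instance, something along these lines should work; the host, user and database name are placeholders, and you may need to adjust the script order to match what 1-start-local-db.sh does.

    # Run the init scripts against a remote PostgreSQL DB (psql prompts for the password)
    for f in db_*.sql; do
        psql -h <db-host> -U <db-user> -d <db-name> -f "$f"
    done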
The local DB is named petclinic and initialized with a simple 3-table schema:
- pets: name, birth_date
- type: type (dog, cat, bird, ...)
- owners: first_name, last_name, address, city, telephone
Procedure
- Run: 0-setup.sh
- Run: 1-start-local-db.sh
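Once both scripts finish, you can optionally sanity-check the local DB with the psql CLI; the user and port below are assumptions (common PostgreSQL defaults), so use whatever 1-start-local-db.sh actually configures.

    # Query the local petclinic DB directly (assumes default port 5432 and user postgres)
    psql -h localhost -p 5432 -U postgres -d petclinic -c "select * from pets;"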
1-mcp-server
Scripts to start/stop the MCP HTTP (remote) server (and the MCP Inspector on macOS).
Once the MCP server and DB are started you can run any of the other demos. The only local DB implemented is the PostgreSQL DB in Docker.
The MCP server exposes one tool named run_sql_query, which is database independent. It works with the local PostgreSQL DB, or with PostgreSQL and SQL Server DBs through SIA.
Directory contents
- 0-run-server.sh - Starts the MCP Server with http access
- _stop-server.sh - Kills the MCP server
- 1-start-mcp-inspector.sh - Starts the MCP Inspector on macOS
- _stop-inspector - Kills the MCP Inspector
Procedure
- Run: 0-run-server.sh
- Run: 1-start-mcp-inspector.sh
This installs and executes the mcp-inspector, which may take a little while.
When started, use these values to connect to the MCP server:
- Transport type: Streamable HTTP
- URL: http://localhost:<port>/mcp (the port is displayed on server startup)
Click the Connect button, then the Tools tab (top middle), then List Tools.
Click on run_sql_query tool and you can enter SQL queries in the right window.
- For example: "select * from pets;"
The Inspector is not attached to any LLM so it cannot process natural language prompts.
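If you want to probe the raw endpoint without the Inspector, a spec-compliant Streamable HTTP server should answer a JSON-RPC initialize request roughly like the sketch below; the exact headers, protocol version and session handling can vary with the server version, so treat this as illustrative only.

    # Minimal MCP Streamable HTTP handshake sketch (port as printed at server startup)
    curl -s -X POST "http://localhost:<port>/mcp" \
      -H "Content-Type: application/json" \
      -H "Accept: application/json, text/event-stream" \
      -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-03-26","capabilities":{},"clientInfo":{"name":"curl","version":"0.0.0"}}}'

Subsequent calls such as tools/list normally require the mcp-session-id header returned by this request, which is why the Inspector is the more convenient way to exercise run_sql_query interactively.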
2-self-hosted
Runs a self-hosted LLM in Ollama as an agent for the MCP server.
Directory contents
- 0-setup.sh - Installs Ollama and Go, and builds mcphost
- 1-run-mcphost.sh
  - Starts the Ollama server, downloads the llama3.2:1b model, and runs the mcphost client using the local LLM
  - Connects to the MCP server over HTTP using the config in mcphost-mcp-psql.json
Procedure
- Run: 0-setup.sh
- Run: 1-run-mcphost.sh
- Enter: natural language prompts. A simple prompt to get started is "list all pets in the db"
You can try to get the model to generate a multi-table join with a prompt like: "list all pet info in the db including name, type, birthdate and all owner information for each."
The llama3.2:1b model fails miserably at generating multi-table joins, whereas the hosted Claude Sonnet 4 model in the Claude Desktop demo succeeds impressively at discovering the database schema, generating multi-table joins, and formatting the output.
You can browse for other Ollama LLMs at: https://ollama.com/search and download models with:
ollama pull model-name
Some models are tuned for specific purposes, e.g. code generation. Larger models tend to work better in general. At least one model is trained to respond as the character Lt. Worf, the Klingon officer in Star Trek:TNG. Ollama is a great way to evaluate self-hosted LLM capabilities.
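For example, to try a larger general-purpose model, you could pull one of the tags listed on ollama.com/search and point the mcphost invocation in 1-run-mcphost.sh at it; the tag below is just one such option.

    # Pull a larger model and confirm it responds locally before wiring it into mcphost
    ollama pull llama3.1:8b
    ollama run llama3.1:8b "Write one sentence about pets."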
3-claude-desktop
Implements a Claude Desktop Extension (macOS and PostgreSQL support only).
Assumes you have Claude Desktop installed. If not, go here to install it: https://claude.ai/download
Directory contents
- 0-config-mcp-ext-for-claude-desktop.sh
  - Modifies sql-dxt/manifest.json to run _run-server.sh
- _run-server.sh - installs dependencies and runs mcp-psql.py with stdio connection
- mcp-psql.py - the local MCP Psql server
- sql-dxt - directory containing manifest template and icon image for the extension
- pyproject.toml - Python dependency manifest
Procedure
- Run: 0-config-mcp-ext-for-claude-desktop.sh
- Import the extension into Claude Desktop:
  - Navigate in the menu: Claude -> Settings -> Extensions -> Advanced Settings
  - Under "Extension Developer" click "Install unpacked extension..."
  - Navigate to the sql-dxt directory, click Open, and then Install when asked.
- Click on Extensions in the left menu. The extension takes a few seconds to load. The screen will change to show the sql-mcp extension is enabled. It will initially show this error for ~15 seconds while the server connects to the DB:
  Permissions
  Error: Unable to connect to extension server. Please try disabling and re-enabling the extension.
Just wait and it should change to show the run_sql_query tool is enabled.
Once the extension is loaded you don't need to load it again unless you move or rename something that it references (see manifest.json and _run-server.sh). The extension runs its own MCP server connected via stdio. So it will start even if the HTTP MCP server is not running. You can start a new conversation (File -> New Conversation) and use natural language to query the database. The model is strikingly good at learning the DB schema and generating complex SQL queries including multi-table joins.
4-claude-code
Installs Claude Code and configures it to connect to the running SQL MCP server over http.
Directory contents
- 0-setup.sh - Adds the MCP server config to Claude Code (requires an Anthropic/Claude account); see the sketch below
- 1-start-claude-code.sh - just runs claude. It will prompt you for your account info.
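Under the hood, registering an HTTP MCP server with Claude Code is typically a one-liner like the sketch below; the server name and port are placeholders, so check 0-setup.sh for the exact command and values it uses.

    # Register the running SQL MCP server with Claude Code over Streamable HTTP
    claude mcp add --transport http sql-mcp "http://localhost:<port>/mcp"

Inside a Claude Code session you can then use the /mcp command to confirm the server is connected and that run_sql_query is available.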
Procedure
- Run: 0-setup.sh
- Run: 1-start-claude-code.sh
5-langgraph
Runs a LangGraph process with two agents that use native LangChain classes (no MCP server) to connect to Psql DBs. Requires OpenAI and Tavily accounts with API keys (see the export sketch below). ONLY WORKS FOR PostgreSQL DBs.
Directory contents
- 0-run-agent.sh - installs dependencies and runs lg-psql.py
- lg-psql.py - the monolithic process that runs two agents, one for the DB, one for the search engine.
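The OpenAI and Tavily keys are usually read from environment variables; the names below are the conventional ones for those SDKs and are an assumption here, so check 0-run-agent.sh and lg-psql.py for the names this demo actually expects.

    # Export API keys before starting the agent (variable names assumed, not confirmed by this repo)
    export OPENAI_API_KEY="sk-..."
    export TAVILY_API_KEY="tvly-..."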
Procedure
- Run: 0-run-agent.sh
Once the process finishes starting up, it will present a "User:" prompt. You can enter natural language prompts and the LangGraph orchestrator will decide how to handle them. Depending on how a prompt is worded, it may choose to use the Tavily search engine or the SQL agent. Sometimes you have to include a reference to the database to guide it to use the SQL agent.