# dev-johnny-gh/mcp-server-demo
This document provides a comprehensive guide on setting up and using a Model Context Protocol (MCP) server with LibreChat and Ollama.
## Installation
- `cd IpServer && npm install && npm run build && npm run start`
- Install a local MongoDB server and serve it on `mongodb://127.0.0.1:27017`.
- `git clone https://github.com/danny-avila/LibreChat.git && cd LibreChat && mv .env.example .env && npm install && npm run frontend && npm run backend`
- Add the following configuration to your `librechat.yaml` file:
```yaml
mcpServers:
  ipServer:
    # type: sse # type can optionally be omitted
    url: http://localhost:3000/sse
    timeout: 60000 # 1 minute timeout for this server, this is the default timeout for MCP servers.

endpoints:
  custom:
    - name: "Ollama"
      apiKey: "ollama"
      # use 'host.docker.internal' instead of localhost if running LibreChat in a docker container
      baseURL: "http://localhost:11434/v1/chat/completions"
      models:
        default:
          [
            "qwen2.5:3b-instruct-q4_K_M",
            "mistral:7b-instruct-q4_K_M",
            "gemma:7b-instruct-q4_K_M",
          ]
        # fetching list of models is supported but the `name` field must start
        # with `ollama` (case-insensitive), as it does in this example.
        fetch: true
      titleConvo: true
      titleModel: "current_model"
      summarize: false
      summaryModel: "current_model"
      forcePrompt: false
      modelDisplayLabel: "Ollama"
```
- Download and run Ollama, download a model from https://ollama.ai/models/, and serve Ollama on http://localhost:11434/.
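Before moving on to LibreChat, it can help to confirm that each local service is actually listening. The sketch below (the `isListening` helper is ours, not part of this demo; ports are taken from the steps above) probes each port over TCP:

```typescript
import * as net from "node:net";

// Try to open a TCP connection to a local port; resolve true if something is listening.
function isListening(port: number, host = "127.0.0.1"): Promise<boolean> {
  return new Promise((resolve) => {
    const sock = net.connect({ port, host });
    sock.setTimeout(1000);
    sock.once("connect", () => { sock.destroy(); resolve(true); });
    sock.once("timeout", () => { sock.destroy(); resolve(false); });
    sock.once("error", () => resolve(false));
  });
}

// Ports used in this setup: MCP IpServer (3000), MongoDB (27017), Ollama (11434).
async function main() {
  const services: Record<string, number> = { ipServer: 3000, mongodb: 27017, ollama: 11434 };
  for (const [name, port] of Object.entries(services)) {
    console.log(`${name} on port ${port}: ${(await isListening(port)) ? "up" : "down"}`);
  }
}

main();
```

If any service reports "down", revisit the corresponding installation step before starting LibreChat.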
## Usage
- Visit http://localhost:3080/ to see the LibreChat UI.
- Create a new agent named "Ollama", select Ollama as the model provider, and select a model.
- Click the Add Tools button below and add the `get-external-ip`, `get-local-ip-v6`, `get-external-ip-v6`, and `get-local-ip` tools.
- Ask the agent "what's my local ip address?", "what's my external ip address?", "what's my external ipv6 address?", or "what's my internal ipv6 address?"
- The agent should invoke your tools and return the results.
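The IpServer source isn't reproduced here, but the `get-local-ip` tool presumably boils down to scanning the machine's network interfaces. A minimal sketch of that logic using only the Node standard library (the `getLocalIp` helper is hypothetical, not the demo's actual code):

```typescript
import * as os from "node:os";

// Return the first non-internal IPv4 address, or undefined if none is found.
// An MCP tool handler would wrap this value in a text content block for the agent.
function getLocalIp(): string | undefined {
  for (const addrs of Object.values(os.networkInterfaces())) {
    for (const addr of addrs ?? []) {
      if (addr.family === "IPv4" && !addr.internal) {
        return addr.address;
      }
    }
  }
  return undefined;
}

console.log(getLocalIp());
```

The IPv6 variants would filter on `family === "IPv6"` instead, while the external-IP tools would need to query an outside service, since the external address is not visible from the local interfaces.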