mbroz2/liberty-mcp-server
Liberty MCP Server
This project demonstrates Liberty serving as a Model Context Protocol (MCP) server, with Quarkus as the MCP client. The demo showcases how AI applications can leverage external business logic (new or existing) through the use of MCP tools.
Project Overview
The demo consists of two main components:
- Liberty MCP Server: Provides a weather forecast tool that can be called by MCP clients
- Quarkus MCP Client: A chatbot application that uses the weather forecast tool through MCP
The client application allows users to ask weather-related questions in natural language. The AI model processes these questions and uses the MCP tool provided by the Liberty server to retrieve weather forecasts.
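On the wire, MCP is JSON-RPC 2.0, so when the model decides it needs a forecast, the client issues a tools/call request to the Liberty server. A minimal sketch of such a request is below; the tool name getForecast matches the architecture diagram, but the argument names are illustrative, so check the server code for the actual input schema.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "getForecast",
    "arguments": { "latitude": 20.79, "longitude": -156.33 }
  }
}
```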
Prerequisites
- Java 17+
- Ollama or OpenAI API Key
- (Optional) Maven 3.8.1+
  - Alternatively, use the provided Maven wrapper via ./mvnw (Linux/macOS) or mvnw.cmd (Windows)
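If you use OpenAI rather than a local Ollama install, the API key is typically supplied through the environment before starting the client. The variable name below follows the standard Quarkus LangChain4j property-to-env-var mapping and is an assumption about this project's configuration:

```shell
# Assumption: the client reads the key via Quarkus LangChain4j's standard
# env-var mapping; confirm the exact property name in
# mcp-client/src/main/resources/application.properties.
export QUARKUS_LANGCHAIN4J_OPENAI_API_KEY=sk-your-key-here
```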
Getting Started
1. Start the Quarkus Client
cd mcp-client
./mvnw quarkus:dev
The Quarkus application will start on port 8080.
2. Start the Liberty Server
cd mcp-liberty-server
./mvnw liberty:dev
The Liberty server will start on port 9080.
3. Access the Application
- Open http://localhost:8080/ in your browser
- Click the chat icon in the bottom right corner to start a conversation
- Ask weather-related questions like:
- What's the 3 day weather forecast for Maui, Hawaii?
- Will I need an umbrella this week in Austin, TX?
- Will it snow in the next 4 days in Toronto, Canada?
- Who's going to see more rainfall this week, Maui, Hawaii or Seattle, Washington?
How It Works
- The user sends a weather-related query to the Quarkus client
- The client uses an LLM (via Ollama or OpenAI) to process the query
- The LLM determines whether it needs weather data and, if so, calls the MCP tool
- The Liberty server receives the tool request and calls the Open-Meteo API
- The weather data is returned to the MCP client
- The LLM formats the response and presents it to the user
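The server-side lookup in step 4 can be sketched in plain Java. Open-Meteo's free forecast endpoint takes latitude/longitude query parameters, but the class and method names below are illustrative, not this project's actual code:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Hypothetical sketch of the lookup behind the getForecast tool.
public class OpenMeteoSketch {

    // Build the Open-Meteo forecast URL for the given coordinates and day count.
    static String buildForecastUrl(double latitude, double longitude, int days) {
        return "https://api.open-meteo.com/v1/forecast"
                + "?latitude=" + latitude
                + "&longitude=" + longitude
                + "&daily=temperature_2m_max,precipitation_sum"
                + "&forecast_days=" + days;
    }

    // Perform the actual call (requires network access); returns the JSON body.
    static String fetchForecast(double latitude, double longitude, int days) throws Exception {
        HttpRequest request = HttpRequest
                .newBuilder(URI.create(buildForecastUrl(latitude, longitude, days)))
                .GET()
                .build();
        return HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString())
                .body();
    }

    public static void main(String[] args) {
        // Roughly Maui, Hawaii; 3-day forecast as in the first sample question.
        System.out.println(buildForecastUrl(20.79, -156.33, 3));
    }
}
```

The MCP tool itself would wrap a call like fetchForecast and return the result to the client as the tool response.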
Project Structure
- mcp-client/: Quarkus MCP client with an application providing an AI chatbot interface
- mcp-liberty-server/: Liberty MCP server providing the weather forecast tool
See the README files in each directory for more details about the specific components.
Architecture
---
config:
flowchart:
subGraphTitleMargin:
top: 10
bottom: 18
---
flowchart TD
LLM["**LLM**<br>(Ollama)"]
UI["**Chatbot UI**<br>(Browser)"]
Client["**MCP Client**<br>(Quarkus)"]
subgraph Server["**MCP Server**<br>(Liberty)"]
Tool["**MCP Tool**<br>(getForecast)"]
end
API["**Weather Data**<br>(Open-Meteo API)"]
LLM <--"REST"--> Client
UI <--"WS"--> Client
Client <--"MCP"--> Server
Server <--"REST"--> API
classDef component fill:#f9f9f9,stroke:#333,stroke-width:1px,color:black;
class LLM,UI,Client,API component;
classDef serverComponent fill:#f9f9f9,stroke:#333,stroke-width:1px,color:black;
class Server serverComponent;
classDef toolComponent fill:#f0f0f0,stroke:#666,stroke-width:1px,color:black;
class Tool toolComponent;