BillFarber/springboot-mcp-server
SpringBoot MCP Server
A Model Context Protocol (MCP) server built with SpringBoot and SpringAI, providing AI-powered tools and resources through a standardized protocol interface.
Features
- MCP Protocol Compliance: Full implementation of the Model Context Protocol specification
- AI-Powered Tools: Text generation and data analysis using SpringAI
- RESTful API: HTTP-based MCP communication
- Extensible Architecture: Easy to add new tools and resources
- Production Ready: Built with SpringBoot best practices
Available Tools
1. Text Generation (generate_text)
Generate text content using AI based on prompts.
Parameters:
- prompt (string, required): The text prompt to generate content from
- maxTokens (integer, optional): Maximum number of tokens to generate (default: 100)
2. Optic Code Generator (optic_code_generator)
Generate optic code snippets for data transformation - inspired by Rush's many talents.
Parameters:
- schema (string, optional): The schema name to use in the optic code (default: "schema")
- view (string, optional): The view name to use in the optic code (default: "view")
3. Optic Code Verifier (verify_optic_code)
Verify optic code for syntax and logical correctness (rebellious random verification).
Parameters:
- optic_code (string, required): The optic code to verify for syntax and validity
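For example, to verify a snippet against a running server (the Optic snippet below is a hypothetical example, not output from the generator tool):

```shell
curl -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","id":5,"method":"tools/call","params":{"name":"verify_optic_code","arguments":{"optic_code":"op.fromView(\"schema\", \"view\").result()"}}}' \
  http://localhost:8080/mcp
```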
4. MarkLogic Documentation Helper (marklogic_docs)
Help you out with MarkLogic documentation and guidance.
Parameters:
- prompt (string, required): The user prompt describing what MarkLogic help you need
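For example, to ask for documentation help against a running server (the question is an illustrative placeholder):

```shell
curl -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","id":6,"method":"tools/call","params":{"name":"marklogic_docs","arguments":{"prompt":"How do I create a range index?"}}}' \
  http://localhost:8080/mcp
```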
Available Resources
- mcp://server/info: Server information and capabilities
- mcp://tools/examples: Examples of how to use the available tools
Prerequisites
- Java 17 or higher
- Gradle 8.5 or higher
Getting Started
1. Clone and Build
./gradlew build
2. Configure AI (Optional)
To enable AI features, you have several options for configuring your Azure OpenAI credentials:
Option A: Using .env file (Recommended for development)
Create a .env file in the project root directory:
# Azure OpenAI Configuration - Keep this file secret!
AZURE_OPENAI_API_KEY=your-azure-openai-api-key
AZURE_OPENAI_ENDPOINT=https://your-resource-name.openai.azure.com/
AZURE_OPENAI_DEPLOYMENT_NAME=gpt-35-turbo
Important: The .env file is automatically ignored by git and won't be committed to version control.
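If another tool needs the same credentials in your shell session (effectively Option B without retyping each export), you can source the file with auto-export turned on. A minimal sketch, assuming simple KEY=value lines with no spaces around the equals sign; the demonstration writes a sample file named .env.example so it never clobbers a real .env:

```shell
# Demonstration only: write a sample env file with placeholder values.
# In practice, point the source line at your real .env instead.
cat > .env.example <<'EOF'
AZURE_OPENAI_API_KEY=your-azure-openai-api-key
AZURE_OPENAI_ENDPOINT=https://your-resource-name.openai.azure.com/
AZURE_OPENAI_DEPLOYMENT_NAME=gpt-35-turbo
EOF

set -a            # auto-export every variable defined while this flag is on
. ./.env.example  # read the KEY=value lines into the current shell
set +a            # stop auto-exporting

echo "$AZURE_OPENAI_DEPLOYMENT_NAME"
```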
Option B: Using environment variables
export AZURE_OPENAI_API_KEY=your-azure-openai-api-key
export AZURE_OPENAI_ENDPOINT=https://your-resource-name.openai.azure.com/
export AZURE_OPENAI_DEPLOYMENT_NAME=gpt-35-turbo
Option C: Direct configuration (Not recommended for production)
Set your credentials directly in src/main/resources/application.properties:
spring.ai.azure.openai.api-key=your-azure-openai-api-key
spring.ai.azure.openai.endpoint=https://your-resource-name.openai.azure.com/
spring.ai.azure.openai.chat.options.deployment-name=gpt-35-turbo
spring.ai.azure.openai.chat.options.model=gpt-3.5-turbo
3. Run the Server
./gradlew bootRun
The server will start on http://localhost:8080
4. Test the Server
Check server health:
curl -s http://localhost:8080/mcp/health | jq .
Get server capabilities:
curl -s http://localhost:8080/mcp/capabilities | jq .
Docker Deployment
The MCP server can be easily containerized and deployed using Docker. The build uses Spring Boot's Cloud Native Buildpacks for optimized, production-ready images.
1. Build Docker Image
Build the Docker image using Gradle's bootBuildImage task:
./gradlew clean bootBuildImage
This will create a Docker image named epic-mcp-server:latest (approximately 293MB) optimized for x86_64 architecture.
2. Start with Docker Compose
Start the containerized MCP server using Docker Compose:
docker compose up
Or run in the background:
docker compose up -d
The server will be available at http://localhost:8080
3. Test the Containerized Server
Once the container is running, test the MCP server with these curl commands:
Check Server Health
curl -s http://localhost:8080/mcp/health
List Available Tools
curl -H "Content-Type: application/json" \
-d '{"jsonrpc":"2.0","id":1,"method":"tools/list","params":{}}' \
http://localhost:8080/mcp
Test Text Generation Tool
curl -H "Content-Type: application/json" \
-d '{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"generate_text","arguments":{"prompt":"Write a haiku about Docker containers","maxTokens":50}}}' \
http://localhost:8080/mcp
Test Optic Code Generator
curl -H "Content-Type: application/json" \
-d '{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"optic_code_generator","arguments":{"schema":"users","view":"profiles"}}}' \
http://localhost:8080/mcp
4. Stop the Container
To stop the Docker container:
docker compose down
5. Alternative Docker Commands
You can also run the image directly with Docker:
# Run the container directly
docker run -p 8080:8080 epic-mcp-server:latest
# Run in the background
docker run -d -p 8080:8080 --name epic-mcp-server epic-mcp-server:latest
# Stop the container
docker stop epic-mcp-server
docker rm epic-mcp-server
Current Status
✅ MCP Server Working: The server implements the full MCP protocol and responds to all tool calls
✅ Mock AI Responses: Tools provide mock responses when AI is not configured
⚠️ Azure OpenAI Integration: Currently providing mock responses (SpringAI auto-configuration needs debugging)
Testing
The server is fully functional for MCP protocol testing. All tools work with mock responses.
MCP Tool Calls
List Available Tools
curl -s -X POST http://localhost:8080/mcp \
-H "Content-Type: application/json" \
-d '{
"jsonrpc": "2.0",
"id": "1",
"method": "tools/list",
"params": {}
}' | jq .
Test Text Generation Tool
curl -s -X POST http://localhost:8080/mcp \
-H "Content-Type: application/json" \
-d '{
"jsonrpc": "2.0",
"id": "2",
"method": "tools/call",
"params": {
"name": "generate_text",
"arguments": {
"prompt": "Write a haiku about coding",
"maxTokens": 50
}
}
}' | jq .
List Available Resources
curl -s -X POST http://localhost:8080/mcp \
-H "Content-Type: application/json" \
-d '{
"jsonrpc": "2.0",
"id": "3",
"method": "resources/list",
"params": {}
}' | jq .
Read a Resource
curl -s -X POST http://localhost:8080/mcp \
-H "Content-Type: application/json" \
-d '{
"jsonrpc": "2.0",
"id": "4",
"method": "resources/read",
"params": {
"uri": "mcp://tools/examples"
}
}' | jq .
Initialize MCP Connection
curl -s -X POST http://localhost:8080/mcp \
-H "Content-Type: application/json" \
-d '{
"jsonrpc": "2.0",
"id": "0",
"method": "initialize",
"params": {}
}' | jq .
MCP Client Integration
Initialize Connection
POST /mcp
{
"jsonrpc": "2.0",
"id": "1",
"method": "initialize",
"params": {}
}
List Available Tools
POST /mcp
{
"jsonrpc": "2.0",
"id": "2",
"method": "tools/list",
"params": {}
}
Call a Tool
POST /mcp
{
"jsonrpc": "2.0",
"id": "3",
"method": "tools/call",
"params": {
"name": "generate_text",
"arguments": {
"prompt": "Write a haiku about programming",
"maxTokens": 50
}
}
}
List Resources
POST /mcp
{
"jsonrpc": "2.0",
"id": "4",
"method": "resources/list",
"params": {}
}
Read a Resource
POST /mcp
{
"jsonrpc": "2.0",
"id": "5",
"method": "resources/read",
"params": {
"uri": "mcp://server/info"
}
}
Development
Adding New Tools
- Update the listTools() method in McpService to include your new tool
- Add a case for your tool in the callTool() method
- Implement the tool logic as a private method
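After wiring a new tool in, a quick way to confirm it is registered is to list the tool names from a running server (the jq filter assumes the standard MCP tools/list response shape, with tools under result.tools):

```shell
curl -s -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","id":10,"method":"tools/list","params":{}}' \
  http://localhost:8080/mcp | jq '.result.tools[].name'
```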
Adding New Resources
- Update the listResources() method in McpService
- Add a case for your resource in the readResource() method
- Implement the resource reading logic
Running Tests
./gradlew test
Project Structure
src/
├── main/
│ ├── java/com/example/mcpserver/
│ │ ├── controller/ # REST API controllers
│ │ ├── model/ # MCP protocol models
│ │ ├── service/ # Business logic
│ │ └── McpServerApplication.java
│ └── resources/
│ └── application.properties
└── test/
└── java/com/example/mcpserver/
└── McpServerApplicationTests.java
Configuration
Key configuration options in application.properties:
- server.port: Server port (default: 8080)
- spring.ai.azure.openai.api-key: Azure OpenAI API key for AI features
- spring.ai.azure.openai.endpoint: Azure OpenAI service endpoint
- spring.ai.azure.openai.chat.options.deployment-name: Your deployment name
- spring.ai.azure.openai.chat.options.model: AI model to use
- Logging levels for debugging
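Putting these together, a minimal application.properties might look like the following (placeholder values; the logging key is an assumption based on the com.example.mcpserver package shown in the project structure):

```properties
server.port=8080
spring.ai.azure.openai.api-key=your-azure-openai-api-key
spring.ai.azure.openai.endpoint=https://your-resource-name.openai.azure.com/
spring.ai.azure.openai.chat.options.deployment-name=gpt-35-turbo
spring.ai.azure.openai.chat.options.model=gpt-3.5-turbo
logging.level.com.example.mcpserver=DEBUG
```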
Contributing
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests for new functionality
- Submit a pull request
License
This project is licensed under the MIT License.