mrfelixwong/spotify_llm
Spotify LLM Agent with MCP Compliance
A fully MCP (Model Context Protocol) compliant music assistant that combines OpenAI's language models with Spotify's API through a proper client-server architecture.
Features
- Full MCP Compliance: Implements the actual MCP specification with JSON-RPC 2.0
- Natural Language Interface: Convert natural language requests to Spotify API calls
- Multiple Tools: Search artists, get top tracks, access playlists, and more
- Backward Compatibility: Legacy endpoint still works for existing clients
- Error Handling: Proper error responses and graceful failure handling
MCP Tools Available
| Tool | Description | Parameters |
| --- | --- | --- |
| search_artist | Search for an artist by name | artist (string) |
| get_artist_top_tracks | Get top tracks for an artist | artist_id (string) |
| search_artist_and_get_top_tracks | Search artist and get their top tracks | artist (string) |
| get_user_top_tracks | Get user's personal top tracks | limit (integer, optional) |
| get_playlist_tracks | Get tracks from a playlist | playlist_id (string) |
Architecture
MCP Server (mcp_server.py)
- FastAPI-based server implementing MCP specification
- JSON-RPC 2.0 protocol compliance
- Tool definitions with proper schemas
- Error handling with standard MCP error codes
MCP Client (spotify_llm_agent.py)
- OpenAI integration for natural language processing
- MCP client class for proper protocol communication
- Fallback support for legacy endpoint
- Rich output formatting for user-friendly display
Quick Start
1. Setup Environment
# Install dependencies
pip install openai spotipy fastapi uvicorn requests python-dotenv
# Set up environment variables
cp .env.example .env
# Edit .env with your API keys
2. Start the MCP Server
uvicorn mcp_server:app --reload
Server will start on http://localhost:8000
3. Run the Client
python spotify_llm_agent.py
4. Test the MCP Implementation
python test_mcp.py
Usage Examples
Natural Language Commands
> Find top tracks by Taylor Swift
> What are my most listened-to songs?
> Search for artist Drake
> Show me tracks in playlist 37i9dQZF1DXcBWIGoYBM5M
Direct MCP Tool Calls
# Initialize MCP connection
mcp_client = MCPClient()
mcp_client.initialize()
# List available tools
tools = mcp_client.list_tools()
# Call a tool
result = mcp_client.call_tool("search_artist", {"artist": "Taylor Swift"})
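The MCPClient used above lives in spotify_llm_agent.py. Its request construction boils down to standard JSON-RPC 2.0 envelopes over HTTP; the sketch below illustrates that shape, but the endpoint path and internals are assumptions, not the actual implementation:

```python
import itertools
import requests  # listed in the install step above

class MCPClient:
    """Minimal JSON-RPC 2.0 client sketch for the MCP server."""

    def __init__(self, url: str = "http://localhost:8000/mcp"):
        # The endpoint path is an assumption; check mcp_server.py for the real route.
        self.url = url
        self._ids = itertools.count(1)

    def _payload(self, method: str, params: dict) -> dict:
        # Every request carries the protocol version, a unique id,
        # a method name, and a params object.
        return {"jsonrpc": "2.0", "id": next(self._ids),
                "method": method, "params": params}

    def _send(self, method: str, params: dict) -> dict:
        resp = requests.post(self.url, json=self._payload(method, params))
        resp.raise_for_status()
        return resp.json()

    def initialize(self) -> dict:
        return self._send("initialize", {})

    def list_tools(self) -> list:
        return self._send("tools/list", {}).get("result", {}).get("tools", [])

    def call_tool(self, name: str, arguments: dict) -> dict:
        return self._send("tools/call", {"name": name, "arguments": arguments})
```

The monotonically increasing id lets the client match each response to its request, which is what JSON-RPC requires.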
MCP Protocol Implementation
JSON-RPC 2.0 Messages
Initialize Request:
{
"jsonrpc": "2.0",
"id": 1,
"method": "initialize",
"params": {}
}
Tool Call Request:
{
"jsonrpc": "2.0",
"id": 2,
"method": "tools/call",
"params": {
"name": "search_artist",
"arguments": {"artist": "Taylor Swift"}
}
}
Tool List Response:
{
"jsonrpc": "2.0",
"id": 3,
"result": {
"tools": [
{
"name": "search_artist",
"description": "Search for an artist by name",
"inputSchema": {...}
}
]
}
}
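On the server side, handling these messages reduces to a small JSON-RPC dispatcher. The sketch below shows that logic in plain Python, using the standard JSON-RPC code -32601 for unknown methods; the real server wires this into FastAPI, the protocol version string is an assumption, and the stub handler only echoes its input:

```python
MCP_TOOLS = [
    {"name": "search_artist",
     "description": "Search for an artist by name",
     "inputSchema": {"type": "object",
                     "properties": {"artist": {"type": "string"}},
                     "required": ["artist"]}},
]

TOOL_HANDLERS = {
    # The real handler calls the Spotify API; this stub just echoes its input.
    "search_artist": lambda args: {"query": args["artist"]},
}

def handle_rpc(request: dict) -> dict:
    """Dispatch one JSON-RPC 2.0 request to the matching MCP method."""
    rid, method = request.get("id"), request.get("method")
    params = request.get("params", {})
    if method == "initialize":
        result = {"protocolVersion": "2024-11-05"}  # assumed version string
    elif method == "tools/list":
        result = {"tools": MCP_TOOLS}
    elif method == "tools/call":
        handler = TOOL_HANDLERS.get(params.get("name"))
        if handler is None:
            return {"jsonrpc": "2.0", "id": rid,
                    "error": {"code": -32602, "message": "Unknown tool"}}
        result = handler(params.get("arguments", {}))
    else:
        return {"jsonrpc": "2.0", "id": rid,
                "error": {"code": -32601, "message": "Method not found"}}
    return {"jsonrpc": "2.0", "id": rid, "result": result}
```

Note that every response echoes the request's id and carries either a result or an error object, never both.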
Backward Compatibility
The server maintains backward compatibility with the old /mcp/invoke endpoint:
# Legacy format still works
response = requests.post("http://localhost:8000/mcp/invoke", json={
"action": "search_artist",
"params": {"artist": "Taylor Swift"}
})
Testing
Run the test script to verify MCP compliance:
python test_mcp.py
This will test:
- ✅ MCP initialization
- ✅ Tool discovery
- ✅ Tool execution
- ✅ Error handling
- ✅ Legacy endpoint compatibility
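Each of these checks boils down to an assertion on the shape of a JSON-RPC response. A sketch of that shape check (illustrative, not the actual contents of test_mcp.py):

```python
def check_jsonrpc_response(resp: dict, expected_id: int) -> bool:
    """A well-formed JSON-RPC 2.0 response carries the protocol version,
    the originating request's id, and exactly one of "result" or "error"."""
    return (resp.get("jsonrpc") == "2.0"
            and resp.get("id") == expected_id
            and ("result" in resp) != ("error" in resp))
```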
MCP vs Previous Implementation
| Feature | Previous | MCP Compliant |
| --- | --- | --- |
| Protocol | Custom HTTP | JSON-RPC 2.0 |
| Tool Discovery | None | tools/list method |
| Error Codes | Custom | Standard MCP codes |
| Tool Schemas | None | JSON Schema definitions |
| Client Integration | Simple HTTP | Full MCP client |
| Extensibility | Limited | High |
Development
Adding New Tools
- Define the tool schema in MCP_TOOLS
- Create a handler function in the server
- Register the handler in TOOL_HANDLERS
- Update the LLM prompt in the client
Example: Adding a new tool
# In mcp_server.py
MCP_TOOLS.append({
"name": "search_tracks",
"description": "Search for tracks by name",
"inputSchema": {
"type": "object",
"properties": {
"query": {"type": "string"}
},
"required": ["query"]
}
})
def handle_search_tracks(arguments: Dict[str, Any]) -> Dict[str, Any]:
# Implementation here
pass
TOOL_HANDLERS["search_tracks"] = handle_search_tracks
Environment Variables
Required environment variables in .env:
SPOTIFY_CLIENT_ID=your_spotify_client_id
SPOTIFY_CLIENT_SECRET=your_spotify_client_secret
SPOTIFY_REDIRECT_URI=http://localhost:8888/callback
OPENAI_API_KEY=your_openai_api_key
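python-dotenv (installed in the setup step) loads these keys from .env into the process environment. A small helper for failing fast when a key is unset; the variable names match the list above, but the helper itself is an illustration, not part of the project:

```python
import os

REQUIRED_VARS = [
    "SPOTIFY_CLIENT_ID",
    "SPOTIFY_CLIENT_SECRET",
    "SPOTIFY_REDIRECT_URI",
    "OPENAI_API_KEY",
]

def missing_env_vars(env=os.environ) -> list:
    """Report which required keys are unset or empty. The agent loads .env
    into os.environ via python-dotenv before a check like this runs."""
    return [name for name in REQUIRED_VARS if not env.get(name)]
```

Calling this at startup and printing the returned names gives a clearer error than a failed Spotify or OpenAI request later on.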
Contributing
- Fork the repository
- Create a feature branch
- Add tests for new functionality
- Ensure MCP compliance
- Submit a pull request
License
MIT License - see LICENSE file for details.