arxiv_mcp_server

Prajapdh/arxiv_mcp_server

This Model Context Protocol (MCP) server provides a structured interface for searching and exploring academic papers from the arXiv database, using a standardized protocol that connects large language models to external tools and resources.

Tools
  1. search_papers

    Search for papers on arXiv based on a topic

  2. extract_info

    Get detailed information about a specific paper

Academic Paper Search with MCP

This project demonstrates the Model Context Protocol (MCP) architecture with a server that provides academic paper search capabilities and a client that connects to multiple MCP servers in a chatbot interface.

Project Overview

This application consists of:

  1. MCP Server (ResearchServer.py): Provides tools, resources, and prompts for searching and exploring academic papers from arXiv.
  2. MCP Client (MCPChatbotWithMultipleServers.py): A chatbot interface that can connect to multiple MCP servers and use their tools, resources, and prompts.

Features

MCP Server

  • šŸ” Search Papers: Search for academic papers on arXiv by topic
  • šŸ“‹ Extract Info: Get detailed information about specific papers
  • šŸ“‚ Browse Resources: View paper topics and contents through URI-based resources
  • šŸ“ Prompts: Generate structured prompts for research tasks

MCP Client

  • 🤖 Chatbot Interface: Natural language interface to the MCP servers
  • 🌐 Multi-Server Support: Connect to multiple MCP servers simultaneously
  • 🧰 Resource Navigation: Browse paper topics with @ commands
  • ⚔ Prompt Execution: Run predefined prompts with / commands
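
The multi-server handling lives in MCPChatbotWithMultipleServers.py. As a rough sketch of what one such connection looks like with the official mcp Python SDK (the SDK names are real; launching the server with uv run ResearchServer.py mirrors server_config.json, but the actual client code may differ):

import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Launch the research server as a subprocess and talk to it over stdio.
    params = StdioServerParameters(command="uv", args=["run", "ResearchServer.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])

asyncio.run(main())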

Prerequisites

  • Python 3.9+
  • Node.js (for the MCP Inspector)

Installation

  1. Clone this repository:
git clone https://github.com/Prajapdh/arxiv_mcp_server.git
cd arxiv_mcp_server
  2. Install dependencies:
pip install -r requirements.txt
  3. Create a .env file with your Anthropic API key:
ANTHROPIC_API_KEY=your_api_key_here
  4. Configure your servers in server_config.json:
{
  "mcpServers": {
    "research": {
      "command": "uv",
      "args": ["run", "ResearchServer.py"],
      "env": null
    }
  }
}
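
At startup the client can combine the .env key with this configuration. A minimal sketch using python-dotenv and the standard library (the exact loading logic in MCPChatbotWithMultipleServers.py is an assumption):

import json
import os
from dotenv import load_dotenv

load_dotenv()  # loads ANTHROPIC_API_KEY from .env into the environment
api_key = os.environ["ANTHROPIC_API_KEY"]

with open("server_config.json") as f:
    config = json.load(f)

# Each entry under "mcpServers" describes one server process to launch.
for name, server in config["mcpServers"].items():
    print(f"{name}: {server['command']} {' '.join(server['args'])}")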

Running the MCP Server

Run with stdio transport (default)

uv run ResearchServer.py

Run with MCP Inspector for debugging

npx @modelcontextprotocol/inspector uv run ResearchServer.py

Running the MCP Client

uv run MCPChatbotWithMultipleServers.py

Chatbot Commands

  • Regular query: Type any text to chat with the assistant
  • @folders: List all available paper topic folders
  • @<topic>: Browse papers in a specific topic
  • /prompts: List all available prompts
  • /prompt <name> <arg1=value1>: Execute a specific prompt with arguments
  • exit: Exit the chatbot
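
These prefixes suggest a simple dispatch loop around an initialized MCP session. A plausible sketch, not the actual implementation (the regular-query and /prompt branches are simplified):

async def chat_loop(session):
    # `session` is an initialized mcp.ClientSession; the branches mirror the commands above.
    while True:
        query = input("> ").strip()
        if query == "exit":
            break
        elif query == "@folders":
            print(await session.read_resource("papers://folders"))
        elif query.startswith("@"):
            print(await session.read_resource(f"papers://{query[1:]}"))
        elif query == "/prompts":
            prompts = await session.list_prompts()
            print([p.name for p in prompts.prompts])
        elif query.startswith("/prompt "):
            name, *pairs = query[len("/prompt "):].split()
            args = dict(pair.split("=", 1) for pair in pairs)
            result = await session.get_prompt(name, arguments=args)
            print(result.messages)
        else:
            print("(a regular query would be sent to Claude here, with the MCP tools attached)")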

Server Features

Tools

  • search_papers: Search for papers on arXiv based on a topic
  • extract_info: Get detailed information about a specific paper
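
A sketch of how ResearchServer.py might define these two tools with the SDK's FastMCP decorators and the arxiv package; the caching layout under papers/ and the exact signatures are assumptions:

from mcp.server.fastmcp import FastMCP
import arxiv
import json
import os

mcp = FastMCP("research")

@mcp.tool()
def search_papers(topic: str, max_results: int = 5) -> list[str]:
    """Search arXiv for papers on a topic and cache their metadata."""
    client = arxiv.Client()
    search = arxiv.Search(query=topic, max_results=max_results)
    folder = os.path.join("papers", topic.lower().replace(" ", "_"))
    os.makedirs(folder, exist_ok=True)
    ids, info = [], {}
    for paper in client.results(search):
        pid = paper.get_short_id()
        ids.append(pid)
        info[pid] = {
            "title": paper.title,
            "authors": [a.name for a in paper.authors],
            "summary": paper.summary,
            "pdf_url": paper.pdf_url,
            "published": str(paper.published.date()),
        }
    with open(os.path.join(folder, "papers_info.json"), "w") as f:
        json.dump(info, f, indent=2)
    return ids

@mcp.tool()
def extract_info(paper_id: str) -> str:
    """Look up a cached paper ID across all topic folders."""
    if not os.path.isdir("papers"):
        return f"No saved information found for {paper_id}."
    for topic_dir in os.listdir("papers"):
        path = os.path.join("papers", topic_dir, "papers_info.json")
        if os.path.isfile(path):
            with open(path) as f:
                info = json.load(f)
            if paper_id in info:
                return json.dumps(info[paper_id], indent=2)
    return f"No saved information found for {paper_id}."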

Resources

  • papers://folders: List all available topic folders
  • papers://{topic}: Get detailed information about papers in a specific topic
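
Continuing the sketch above, FastMCP can expose these as URI-based resources, with {topic} mapped to a function argument (the returned text format is an assumption):

@mcp.resource("papers://folders")
def list_folders() -> str:
    """List the topic folders that currently contain cached papers."""
    if not os.path.isdir("papers"):
        return "No topics saved yet."
    folders = [d for d in os.listdir("papers") if os.path.isdir(os.path.join("papers", d))]
    return "\n".join(f"- {name}" for name in folders) or "No topics saved yet."

@mcp.resource("papers://{topic}")
def get_topic_papers(topic: str) -> str:
    """Return the cached metadata for every paper stored under a topic."""
    path = os.path.join("papers", topic.lower().replace(" ", "_"), "papers_info.json")
    if not os.path.isfile(path):
        return f"No papers saved for topic '{topic}'."
    with open(path) as f:
        return json.dumps(json.load(f), indent=2)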

Prompts

  • generate_search_prompt: Generate a prompt for Claude to find and discuss academic papers
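
The prompt can likewise be registered with the @mcp.prompt() decorator; the template wording here is only illustrative:

@mcp.prompt()
def generate_search_prompt(topic: str, num_papers: int = 5) -> str:
    """Build an instruction asking Claude to find and discuss papers on a topic."""
    return (
        f"Use the search_papers tool to find {num_papers} papers about '{topic}', "
        f"then call extract_info on each result and summarize the key findings, "
        f"methods, and how the papers relate to one another."
    )

if __name__ == "__main__":
    mcp.run(transport="stdio")  # matches the stdio default described above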

Project Structure

DeepLearning MCP tutorial/
├── MCPChatbotWithMultipleServers.py  # Multi-server MCP client
├── ResearchServer.py                 # ArXiv paper search MCP server
├── server_config.json                # Server connection configuration
├── requirements.txt                  # Project dependencies
├── papers/                           # Directory for storing paper information
│   └── {topic}/                      # Topic-specific directories
│       └── papers_info.json          # Stored paper metadata
└── README.md                         # This file

Understanding MCP

The Model Context Protocol (MCP) is a standard for connecting large language models (LLMs) to external tools and resources. In this project:

  • The MCP server exposes tools (functions), resources (data sources), and prompts (templates)
  • The MCP client connects to these servers and enables the LLM to use their capabilities
  • Communication happens via either stdio (for local use) or SSE (for network use)
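
Concretely, once a client session is initialized, each capability maps to one SDK call. A brief sketch (argument values are illustrative):

async def demo(session):
    # `session` is an initialized mcp.ClientSession connected to ResearchServer.py.
    tool_result = await session.call_tool("search_papers", {"topic": "graph neural networks"})
    folder_list = await session.read_resource("papers://folders")
    prompt = await session.get_prompt("generate_search_prompt", arguments={"topic": "graph neural networks"})
    return tool_result, folder_list, prompt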

Extending the Project

  • Add more MCP servers with different capabilities
  • Implement additional paper analysis tools
  • Add support for other academic databases
  • Create visualization tools for research data

Troubleshooting

  • If you encounter connection errors, ensure the server is running before starting the client
  • For SSE transport issues, check port availability or switch to stdio transport
  • Verify your Anthropic API key is correctly set in the .env file

License