BenedatLLC/agent-with-mcp-example

Agent with MCP Example

This project provides a simple example of an Agent and a local MCP server.

The MCP Server provides a collection of tools for obtaining system CPU and memory statistics. It is built on the psutil library. The tools are implemented as FastAPI endpoints and then exposed via MCP using fastapi-mcp.
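To make the pattern concrete, here is a minimal sketch of how a psutil-backed tool can be exposed this way. This is not the repo's actual psutil_mcp.py; the endpoint name, docstring, and app setup are illustrative, and it assumes fastapi, fastapi-mcp, and psutil are installed:

```python
# Sketch only: assumes fastapi, fastapi-mcp, and psutil are available.
import psutil
from fastapi import FastAPI
from fastapi_mcp import FastApiMCP

app = FastAPI()

@app.get("/cpu_times", operation_id="cpu_times")
def cpu_times() -> dict:
    """Return system CPU time as a total across all CPUs."""
    # psutil.cpu_times() returns a namedtuple (user, nice, system, idle, ...)
    return psutil.cpu_times()._asdict()

# Wrap the FastAPI app so its endpoints are published as MCP tools,
# mounted at /mcp by default.
mcp = FastApiMCP(app)
mcp.mount()
```

Each FastAPI endpoint's operation_id becomes the MCP tool name, which is why the mcptools session below can call cpu_times directly.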

The Agent is part of a simple Gradio chat application and uses the Pydantic.ai agent framework. The agent is given the MCP Server's URL and a system prompt indicating that it should answer questions about system resource usage. The Gradio Chat component maintains a conversation history so that you can ask follow-up questions.
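The agent side can be sketched roughly as follows. This is not the repo's chat.py; the model name, prompt wording, and helper function are illustrative, and the API names match pydantic-ai 0.x releases (some have been renamed in newer versions):

```python
# Sketch only: assumes pydantic-ai is installed and OPENAI_API_KEY is set.
from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerHTTP

# Point the agent at the locally running MCP server.
server = MCPServerHTTP(url="http://localhost:8000/mcp")

agent = Agent(
    "openai:gpt-4o",  # illustrative model choice
    mcp_servers=[server],
    system_prompt="Answer questions about this system's CPU and memory usage.",
)

async def ask(question: str) -> str:
    # Open connections to the MCP servers for the duration of the run.
    async with agent.run_mcp_servers():
        result = await agent.run(question)
    return result.output  # called result.data in older pydantic-ai releases
```

The agent discovers the server's tools at runtime, so adding a new endpoint to the MCP server requires no changes on the agent side.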

Setup

Prerequisites

First make sure you have the following tools installed on your machine:

  1. uv, a package and environment manager for Python
  2. direnv, a tool for managing environment variables in your projects
  3. mcptools (optional), a command line utility for interacting with MCP servers. This program is only needed if you want to test/debug the MCP server without the chat application. It is really helpful for debugging your tools and making sure that the expected metadata is being published by the MCP server. Note that the name of the program is mcpt if you install via Homebrew on Mac and mcptools otherwise.
  4. These examples use OpenAI models for the Agent, so you will need an active account and API key from OpenAI. Alternatively, you can use one of the other models supported by Pydantic.ai; in that case, set the model and key appropriately.

Setup steps

Once you have the prerequisites installed, do the following steps:

  1. Copy envrc.template to .envrc and set OPENAI_API_KEY to your OpenAI API key.
  2. Run direnv allow to put the changed environment variables into your environment.
  3. Run uv sync to create/update your virtual environment.
  4. Start the MCP Server with uv run psutil_mcp.py. By default it will serve on port 8000.
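The steps above condense to a short shell session (run from the repo root; the key value in .envrc is a placeholder, not a real credential):

```shell
cp envrc.template .envrc
# edit .envrc so that OPENAI_API_KEY holds your key, then:
direnv allow            # load the environment variables
uv sync                 # create/update the virtual environment
uv run psutil_mcp.py    # start the MCP server on port 8000
```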

Testing

If you have installed mcptools, you can connect to your MCP server and test it as follows:

$ mcptools shell http://localhost:8000/mcp # use the command "mcpt" if you installed via Homebrew
mcp> tools
cpu_times
     Get Cpu Times Return system CPU time as a total across all cpus. Every attribute represents the
...

mcp> call cpu_times
{
  "user": 119528.44,
  "nice": 0.0,
  "system": 67114.2,
  "idle": 2692773.55
}
mcp> exit
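The cpu_times figures are cumulative seconds since boot, so the interesting quantity is usually each category's share of the total. A small stdlib-only sketch, using the sample values from the session above (the cpu_shares helper is illustrative, not part of the repo):

```python
def cpu_shares(times: dict[str, float]) -> dict[str, float]:
    """Convert cumulative CPU-time counters into fractions of the total."""
    total = sum(times.values())
    return {name: seconds / total for name, seconds in times.items()}

# Sample output from the `call cpu_times` session above
sample = {"user": 119528.44, "nice": 0.0, "system": 67114.2, "idle": 2692773.55}
shares = cpu_shares(sample)
print(f"idle: {shares['idle']:.1%}")  # → idle: 93.5%
```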

Running

To run the full application:

  1. If you have not already started your MCP Server, run it with uv run psutil_mcp.py
  2. In another terminal window, start the chat server with uv run chat.py
  3. Point your browser to http://127.0.0.1:7860

Extras

The psutil_mcp.py and chat.py programs have some command line options to enable debugging, change the model, change the ports, etc. Run them with the --help option to see the available options.

There is a configuration for VSCode to use the MCP server at .vscode/mcp.json.
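For reference, a VSCode mcp.json pointing at an HTTP server on port 8000 typically looks something like the following (the server name "psutil" is illustrative; check .vscode/mcp.json in the repo for the actual contents):

```json
{
  "servers": {
    "psutil": {
      "type": "http",
      "url": "http://localhost:8000/mcp"
    }
  }
}
```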