AI Agent

An agent built with Streamlit, LangChain, and MCP tools that can query GitHub.

GitHub API Setup

To use the GitHub search functionality, you need to set up a GitHub API token:

  1. Create a GitHub Personal Access Token:

    • Go to GitHub → Settings → Developer settings → Personal access tokens
    • Generate a new token (no special scopes are required to search public repositories)

  2. Configure the token:

    • Create a .env file in the project root
    • Add your token: GITHUB_TOKEN=your_actual_token_here (you can verify it with the quick check below)
    • The .env file is already in .gitignore for security
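
To confirm the token is set up correctly, you can call the GitHub REST API directly from the shell. This is an optional sanity check, not part of the project's code; it assumes a simple KEY=value .env file and uses GitHub's standard /user and /search/repositories endpoints:

# Load the token from .env into the current shell (assumes simple KEY=value lines)
export $(grep -v '^#' .env | xargs)

# A valid token returns your profile; an invalid one returns 401
curl -s -H "Authorization: Bearer $GITHUB_TOKEN" https://api.github.com/user

# A repository search similar to what the MCP search tool performs
curl -s -H "Authorization: Bearer $GITHUB_TOKEN" \
  "https://api.github.com/search/repositories?q=langchain&per_page=3"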

Docker Setup

To run all components using Docker Compose:

# Build and start all services
docker-compose up --build

# Run in background
docker-compose up -d --build

# Stop all services
docker-compose down
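
A couple of standard Docker Compose commands are handy while iterating (nothing project-specific here; <service> stands for whatever name is defined in docker-compose.yml):

# Check that all four services are up
docker-compose ps

# Follow logs for everything, or for a single service
docker-compose logs -f
docker-compose logs -f <service>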

Services

  • Ollama (port 11434): LLM service (on macOS, run a local install instead, as Ollama in Docker is slow and can't use the Mac GPU)
  • MCP Server (port 3001): Search service (an MCP tool for the AI agent)
  • API Server (port 3002): FastAPI backend (to host the AI agent)
  • Streamlit App (port 8501): Web interface (mostly vibe-coded as a throwaway prototype for testing)
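
Once docker-compose up has finished, a few curl calls confirm the services on the ports above are reachable. The Ollama /api/tags route is part of Ollama's built-in REST API; for the other services only the HTTP status is checked, since their routes are project-specific:

# Ollama: lists installed models if the service is up
curl -s http://localhost:11434/api/tags

# Streamlit app and API server: any HTTP status code means the server is reachable
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8501
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:3002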

Access Points

  • Streamlit app: http://localhost:8501
  • API server: http://localhost:3002
  • MCP server: http://localhost:3001
  • Ollama: http://localhost:11434

First Run

You need to install Ollama locally:

# For Mac
brew install ollama

# For Linux
curl -fsSL https://ollama.com/install.sh | sh

# For Windows
# Download and install from: https://ollama.com/download/windows
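
After installing, make sure the Ollama server is actually running before starting the rest of the stack. ollama serve runs it in the foreground; on macOS, Homebrew can also manage it as a background service (assuming Ollama was installed with the brew formula above):

# Run the server in the foreground
ollama serve

# Or, on macOS with Homebrew, run it as a background service
brew services start ollama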

You'll also need to pull a model in Ollama:

# qwen:4b is small enough and has reasoning capabilities
ollama pull qwen:4b
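
After the pull finishes, a quick CLI test confirms the model is available and Ollama can serve it (the prompt text here is just an example):

# List local models to confirm the pull succeeded
ollama list

# Ask the model a one-off question from the command line
ollama run qwen:4b "Explain in one sentence what an MCP server is."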