Task-Manager

Task-Manager is an AI-enabled task management copilot designed to streamline task organization with human oversight.

Tools
  1. propose_task

    Proposes task details based on user input.

  2. create_task

    Creates a task in the database after user confirmation.


Task-Manager: AI-Enabled Task Management Copilot

Overview

Task-Manager is an AI-enabled task management copilot designed to streamline your task organization and add an intelligent, conversational layer on top of it. The project consists of a Python backend that exposes task management functionality as modular AI tools using FastMCP, a PostgreSQL database for persistent storage, and a Python client that uses LangChain and LangGraph to provide a conversational interface to those tools. The PostgreSQL database is deployed with Docker Compose for easy setup.

The copilot lets users propose and create tasks through natural conversation, with the AI guiding the process and, critically, keeping a human in the loop to confirm actions before they are executed.

Flow

graph TD
    User[User] -->|Natural Language Input| Client["Conversational AI Client (LangChain/LangGraph)"]
    Client -->|"Tool Call Request (to FastMCP endpoint)"| AIToolServer["AI Tool Server (FastMCP/FastAPI)"]
    AIToolServer -->|SQLAlchemy/Psycopg2| Database[PostgreSQL Database]
    Database -->|Task Data| AIToolServer
    AIToolServer -->|"Tool Output (e.g., Proposed Task JSON)"| Client
    Client -->|"Confirmation Prompt (triggered by tool output)"| User
    User -->|'Yes' or 'No'| Client
    Client --o|"Conditional Tool Call (e.g., create_task)"| AIToolServer
    AIToolServer --o|"Database Write (if confirmed)"| Database
    AIToolServer -->|Success/Error Message| Client
    Client -->|AI Response| User

Features

  • Intelligent Task Proposing with Human Confirmation: The AI can understand natural language requests to propose task details (name, description, due date). Crucially, it then presents these proposed details to the user for explicit confirmation before the task is formally created in the database, ensuring that the human user maintains control over task creation.

  • Persistent Task Storage: All tasks are stored in a PostgreSQL database, ensuring they persist across sessions.

  • Modular AI Tools: Task management operations (like propose_task and create_task) are exposed as callable tools using FastMCP, making the system extensible (see the sketch after this list).

  • Conversational Interface: The client uses LangChain's create_react_agent and LangGraph to enable a natural, turn-based conversation with the AI for task management.

  • Database Migration & Initialization: The backend automatically waits for the database to be ready and creates necessary tables upon startup.

  • Dockerized Database: Easy deployment and management of the PostgreSQL database using docker-compose.

  • Environment Variable Configuration: Sensitive information and configurations are managed via .env files for both the server and client.
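
As an illustration of how such tools might be declared, the sketch below uses the fastmcp package's decorator style. The actual definitions live in server/toolkit.py and may differ; the parameter names, defaults, and return values shown here are assumptions rather than the project's real schema.

    from typing import Optional

    from fastmcp import FastMCP

    mcp = FastMCP("Task-Manager")

    @mcp.tool()
    def propose_task(name: str, description: str = "No description provided.",
                     due_date: Optional[str] = None) -> dict:
        """Format task details for the user to review; nothing is written to the database."""
        return {"name": name, "description": description, "due_date": due_date}

    @mcp.tool()
    def create_task(name: str, description: str, due_date: Optional[str] = None) -> str:
        """Persist a confirmed task. The real tool writes to PostgreSQL via SQLAlchemy; stubbed here."""
        return f"Task '{name}' created."

Keeping propose_task free of side effects is what makes the confirmation step meaningful: nothing touches the database until create_task is explicitly invoked after the user says yes.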

Technologies Used

Backend & Database

  • Python: Core programming language.

  • FastAPI: High-performance web framework, used under FastMCP to serve the tools.

  • FastMCP: A framework for exposing Python functions as modular AI tools.

  • SQLAlchemy: Python SQL toolkit and Object Relational Mapper (ORM) for database interactions.

  • Psycopg2-binary: PostgreSQL adapter for Python.

  • python-dotenv: For loading environment variables.

  • Uvicorn: ASGI server for running the FastAPI application.

  • PostgreSQL: Relational database for storing task data.

AI & Client

  • LangChain: Framework for developing applications powered by language models.

  • LangGraph: Library for building robust and stateful multi-actor applications with LLMs.

  • LangChain-mcp-adapters: Adapters for loading MCP tools (such as those exposed by FastMCP) into LangChain.

  • LangChain-OpenAI: Integration for OpenAI models.

Project Structure

The project is divided into two main components:

  • server/: Contains the backend application responsible for database interactions and exposing AI tools.

    • database.py: Defines the SQLAlchemy ORM model for tasks and handles database connection and table creation (a sketch follows this list).

    • toolkit.py: Defines the AI tools (e.g., propose_task, create_task) using FastMCP decorators.

    • server.py: The main entry point for the backend server, which initializes the database and serves the MCP tools.

  • client/: Contains the interactive client application.

    • main.py: The client script that connects to the MCP server, loads the tools, and facilitates a conversational interaction with the AI model using LangChain/LangGraph.
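
For orientation, a minimal sketch of what database.py might look like is shown below. It assumes SQLAlchemy's declarative ORM and the environment variables from the server .env file described under Setup; the table and column names are illustrative, not the project's actual schema.

    import os

    from dotenv import load_dotenv
    from sqlalchemy import Column, Date, Integer, String, create_engine
    from sqlalchemy.orm import declarative_base, sessionmaker

    load_dotenv()

    # Build the connection URL from the same variables used in the server .env file.
    DATABASE_URL = (
        f"postgresql+psycopg2://{os.getenv('POSTGRES_USER')}:{os.getenv('POSTGRES_PASSWORD')}"
        f"@{os.getenv('POSTGRES_HOST')}:{os.getenv('POSTGRES_PORT')}/{os.getenv('POSTGRES_DB')}"
    )

    engine = create_engine(DATABASE_URL)
    SessionLocal = sessionmaker(bind=engine)
    Base = declarative_base()

    class Task(Base):
        # Hypothetical table and column names, for illustration only.
        __tablename__ = "tasks"
        id = Column(Integer, primary_key=True, index=True)
        name = Column(String, nullable=False)
        description = Column(String, default="No description provided.")
        due_date = Column(Date, nullable=True)

    def init_db() -> None:
        """Create the tasks table if it does not already exist."""
        Base.metadata.create_all(bind=engine)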

Setup and Installation

Follow these steps to get the Task-Manager running on your local machine:

Prerequisites

  • Python 3.9+

  • pip (Python package installer)

  • Docker

  • Docker Compose

Step-by-Step Guide

  1. Clone the Repository:

    git clone https://github.com/WajeehAhmed/Task-Manager.git
    cd Task-Manager
    
  2. Database Setup (Docker Compose): Start the PostgreSQL database container. This will also create a persistent volume for your database data.

    docker-compose up -d db
    

    Verify the database container is running: docker ps

  3. Server Setup:

    a. Create Python Virtual Environment:

    python -m venv venv-server
    source venv-server/bin/activate # On macOS/Linux
    .\venv-server\Scripts\activate   # On Windows
    

    b. Install Server Dependencies:

    pip install -r requirements.txt
    

    c. Create .env file for Server: In the root directory of your project, create a file named .env and add the following database connection details. Important: when running the server directly on your host machine (rather than inside the docker-compose network alongside the db service), use localhost for POSTGRES_HOST.

    POSTGRES_USER=user
    POSTGRES_PASSWORD=password
    POSTGRES_HOST=localhost # Use 'db' if running server inside docker-compose network
    POSTGRES_PORT=5432
    POSTGRES_DB=taskdb
    

    d. Run the Server: The server will automatically wait for the database to be ready and create the necessary tables.

    python server/server.py
    

    The FastMCP server will start, typically listening on http://localhost:8000/mcp/.

  4. Client Setup:

    a. Create Python Virtual Environment (Optional, can use same as server): It's good practice to keep client dependencies separate, but for this project, requirements.txt serves both.

    python -m venv venv-client
    source venv-client/bin/activate # On macOS/Linux
    .\venv-client\Scripts\activate   # On Windows
    

    b. Install Client Dependencies: If you created a new environment, install dependencies again:

    pip install -r requirements.txt
    

    c. Create .env file for Client: In the root directory, update or create your .env file with your OpenAI API key and base URL.

    API_KEY="YOUR_OPENAI_API_KEY"
    BASE_URL="YOUR_OPENAI_API_BASE_URL" # e.g., https://api.openai.com/v1
    

    Note: Replace "YOUR_OPENAI_API_KEY" and "YOUR_OPENAI_API_BASE_URL" with your actual OpenAI credentials.

    d. Run the Client (a rough sketch of the client loop follows this guide):

    python client/main.py
    
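For reference, the client loop in client/main.py roughly follows the pattern sketched below. This is a hedged sketch rather than the project's exact code: it assumes langchain-mcp-adapters' MultiServerMCPClient with a streamable-HTTP connection, and the server key ("tasks") and model name are placeholders.

    import asyncio
    import os

    from dotenv import load_dotenv
    from langchain_mcp_adapters.client import MultiServerMCPClient
    from langchain_openai import ChatOpenAI
    from langgraph.prebuilt import create_react_agent

    load_dotenv()

    async def main() -> None:
        # Load the MCP tools exposed by the FastMCP server.
        client = MultiServerMCPClient(
            {"tasks": {"url": "http://localhost:8000/mcp/", "transport": "streamable_http"}}
        )
        tools = await client.get_tools()

        # The model name is a placeholder; API_KEY and BASE_URL come from the client .env file.
        model = ChatOpenAI(model="gpt-4o-mini",
                           api_key=os.getenv("API_KEY"),
                           base_url=os.getenv("BASE_URL"))
        agent = create_react_agent(model, tools)

        messages = []
        while True:
            user_input = input("USER: ")
            if user_input.strip().lower() in {"exit", "quit"}:
                break
            messages.append({"role": "user", "content": user_input})
            result = await agent.ainvoke({"messages": messages})
            messages = result["messages"]
            print("AI:", messages[-1].content)

    if __name__ == "__main__":
        asyncio.run(main())

The human-in-the-loop confirmation described under Usage happens in conversation: the agent calls propose_task, shows the proposed details, and only calls create_task after the user answers yes.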

Usage

Once both the server and client are running:

  1. Interact with the AI: The client/main.py script will prompt you to type your messages.

  2. Proposing Tasks: You can describe a task, and the AI will use the propose_task tool to format the details for you.

    • Example: USER: I need to buy groceries by tomorrow.

    • The AI might respond with a proposed task: AI: {"name": "buy groceries", "description": "No description provided.", "due_date": "2024-XX-XX"} (with the due date filled in as tomorrow's actual date).

  3. Confirming Task Creation (Human-in-the-Loop): After a task is proposed, the AI will explicitly ask for your confirmation. This is the human-in-the-loop step, giving you control before any changes are made to your tasks.

    • AI: Do you want to create this task? (yes/no):

    • Type yes or y to confirm, and the create_task tool will be invoked to save it to the database.

    • Example: USER: Yes, please create the task.

  4. Exiting: Type exit or quit to stop the client chat.

Contributing

Contributions are highly welcome! If you'd like to contribute to the Task-Manager, please follow these steps:

  1. Fork the repository.

  2. Create a new branch for your feature or bug fix (git checkout -b feature/your-feature-name).

  3. Commit your changes (e.g., git commit -m 'feat: Add new task listing tool' or git commit -m 'fix: Resolve task creation bug').

  4. Push to your branch (git push origin feature/your-feature-name).

  5. Open a Pull Request describing your changes.