Local-MCP-Client-and-Server

salunkecoep/Local-MCP-Client-and-Server



The Model Context Protocol (MCP) standardizes communication between AI applications (via MCP clients) and MCP servers, giving models a uniform way to reach external tools and data.

MCP Projects:

Local MCP Client and Server (MCP client, server, and AI application running on the same machine)

Step.1: Install Ollama on your machine. Use the link below to download it.

https://ollama.com/download

Step.2: Download an LLM (llama3.2) to your machine. Open PowerShell and execute the command below.

ollama run llama3.2
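Once the model is pulled, Ollama also serves a local REST API on port 11434, which is how a Python application (rather than PowerShell) can talk to the model. A minimal sketch using Ollama's documented `/api/generate` endpoint (this assumes the Ollama service is running locally; the helper names are illustrative, not from the repo):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_request(prompt: str, model: str = "llama3.2") -> dict:
    # Payload for a single, non-streaming completion.
    return {"model": model, "prompt": prompt, "stream": False}

def ask(prompt: str) -> str:
    # Requires the Ollama service (or desktop app) to be running locally.
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

For example, `ask("Say hello")` returns the model's generated text as a string.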

Step.3: Download the code from GitHub using the link below.

Step.4: Install all required packages using the command below.

pip install -r requirements.txt

Step.5: Run the MCP server using the command below.

py .\server.py
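The repo's `server.py` is not shown here, but a local MCP server of this kind can be sketched with the official `mcp` Python SDK's `FastMCP` helper. The tool name and employee data below are illustrative stand-ins, not the repo's actual code, and the `try`/`except` lets the tool logic run even where the SDK isn't installed:

```python
# Minimal MCP server sketch using the official `mcp` Python SDK (pip install mcp).
# The employee data and tool below are illustrative, not the repo's actual code.
try:
    from mcp.server.fastmcp import FastMCP
    app = FastMCP("employee-demo")
    tool = app.tool()              # decorator that registers a function as an MCP tool
except ImportError:                # fallback so the tool logic works without the SDK
    app = None
    def tool(func):
        return func

EMPLOYEES = ["Alice", "Bob", "Carol"]  # stand-in for a real data source

@tool
def employee_count() -> int:
    """Number of employees working in the company."""
    return len(EMPLOYEES)

# In a real server.py, start the server from an `if __name__ == "__main__":`
# guard with `app.run()`, which serves over stdio so a local client can connect.
```

An MCP client connecting to this server would see `employee_count` in the tool list and could invoke it on the model's behalf.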

Step.6: Run the AI application and MCP client in a Jupyter notebook.

Step.7: Give the AI application a prompt, e.g., "How many employees are working in the company?"
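Putting the steps together, the AI application's control flow is roughly: send the prompt (plus tool descriptions) to the model, and if the model asks for a tool, call it through the MCP client and feed the result back. A simplified, dependency-free sketch of that loop (the in-process tool registry and stand-in model below are illustrative; a real notebook would use the `mcp` client and Ollama instead):

```python
# Simplified tool-calling loop. A real app would route tool calls through the
# MCP client to server.py and use Ollama as the model; both are stubbed here.
TOOLS = {"employee_count": lambda: 3}   # illustrative stand-in for MCP server tools

def fake_model(prompt: str) -> dict:
    # Stand-in for the LLM: decides whether a tool call is needed.
    if "employees" in prompt.lower():
        return {"tool": "employee_count"}
    return {"answer": "No tool needed for that."}

def run_app(prompt: str) -> str:
    reply = fake_model(prompt)
    if "tool" in reply:                  # model requested a tool call
        result = TOOLS[reply["tool"]]()  # the MCP client would call the server here
        return str(result)               # a real app feeds this back to the model
    return reply["answer"]
```

With this sketch, `run_app("How many employees are working in the company?")` dispatches to the `employee_count` tool and returns its result.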