ShashidharNagaral/mcp-fs
MCP-FS (Model Context Protocol – File System)
A prototype toolhost that enables LLMs to perform basic file system operations through MCP over HTTP.
Features
- `createFile` – Create new files
- `readFile` – Read file content
- `updateFile` – Overwrite file content
- `appendToFile` – Append text to a file
- `deleteFile` – Delete files
- `listFiles` – List the contents of a directory
- `describeServer` – Return the server's tool list & usage guide
Architecture
- Server: Exposes tools via the MCP HTTP transport using Express and @modelcontextprotocol/sdk.
- Client: Connects to the MCP server, fetches the tool list, routes user input to an LLM (e.g., via Ollama), and invokes tools based on the LLM's output.
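The client's routing step can be sketched as below. Note that the tool-call JSON shape, function names, and handler map here are assumptions for illustration, not the repository's actual protocol; a real client would forward the parsed call to the MCP server over HTTP rather than handle it locally.

```typescript
// Hypothetical sketch: the LLM is prompted to emit a JSON tool call,
// which the client parses and routes to the matching tool.
type ToolCall = { tool: string; arguments: Record<string, unknown> };

// Stand-in handlers; the real client would invoke MCP tools via
// @modelcontextprotocol/sdk instead of resolving them locally.
const handlers: Record<string, (args: Record<string, unknown>) => string> = {
  createFile: (args) => `created ${String(args.path)}`,
  readFile: (args) => `read ${String(args.path)}`,
  listFiles: (args) => `listed ${String(args.dir)}`,
};

// Parse the LLM's output and invoke the named tool.
function dispatch(llmOutput: string): string {
  const call: ToolCall = JSON.parse(llmOutput);
  const handler = handlers[call.tool];
  if (!handler) throw new Error(`unknown tool: ${call.tool}`);
  return handler(call.arguments);
}

console.log(dispatch('{"tool":"createFile","arguments":{"path":"notes.txt"}}'));
// → created notes.txt
```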
Install & Run Ollama
MCP-FS uses Ollama to run local LLMs and respond to file system tool requests.
Ollama allows you to run models like mistral, llama2, codellama, and others locally via a simple API.
Official GitHub: ollama/ollama
Install Ollama
Start Ollama Server
ollama serve
By default, Ollama runs at http://localhost:11434.
Pull Required Model
ollama pull mistral-nemo
Check Installation
The following should list the available models, confirming Ollama is up and the model is installed:
curl http://localhost:11434/api/tags
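A successful response lists the installed models as JSON, roughly of this shape (each entry also carries size, digest, and modification-time fields, omitted here):

```json
{
  "models": [
    { "name": "mistral-nemo:latest" }
  ]
}
```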
Project Installation & Setup
Clone the repo
git clone https://github.com/your-username/mcp-fs.git
cd mcp-fs
Install dependencies
npm install
Start the MCP Server
npm run server
Start the MCP Client
npm run client
Resources
- Ollama Website: https://ollama.com
- Ollama GitHub: https://github.com/ollama/ollama
- Model Library: https://ollama.com/library