SushiMCP
SushiMCP is a Model Context Protocol server designed to help developers deliver context to their AI IDEs. It's simple to use and substantially improves the performance of base and premium LLM models when generating code. The easiest way to get started is to register SushiMCP with your client using the default configuration:
Registering SushiMCP with an MCP Client
```json
{
  "sushimcp": {
    "command": "npx",
    "args": [
      "-y",
      "@chriswhiterocks/sushimcp@latest",
      "--llms-txt-source",
      "cool_project:https://coolproject.dev/llms-full.txt",
      "--openapi-spec-source",
      "local_api:http://localhost:8787/api/v1/openapi.json"
    ]
  }
}
```
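Both source flags in the configuration above take a value of the form `name:url`. As a sketch of how such a value can be split (the helper name and the exact parsing rule are assumptions for illustration, not SushiMCP's actual implementation — note the URL itself contains `:` characters, so only the first separator delimits the name):

```typescript
// Hypothetical helper illustrating the "name:url" value format passed to
// --llms-txt-source and --openapi-spec-source. Assumption: the alias is
// everything before the FIRST ":", and the URL is everything after it.
function parseSource(arg: string): { name: string; url: string } {
  const sep = arg.indexOf(":");
  if (sep === -1) {
    throw new Error(`expected "name:url", got "${arg}"`);
  }
  return { name: arg.slice(0, sep), url: arg.slice(sep + 1) };
}

// Example: the llms.txt source from the configuration above.
const source = parseSource("cool_project:https://coolproject.dev/llms-full.txt");
console.log(source.name); // "cool_project"
console.log(source.url);  // "https://coolproject.dev/llms-full.txt"
```

Using a distinct alias per source lets you register several llms.txt files or OpenAPI specs in one configuration by repeating the flag with different `name:url` pairs.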
Advanced Configuration & Deeper Learning
Visit the SushiMCP Docs for more information on advanced configuration and deeper learning about SushiMCP.
Author
Chris White: GitHub | Discord | Personal Site | X | LinkedIn | Five9 Cyber
License
This project is licensed under the AGPL-3.0-or-later. See the license.txt file for details.