just-prompt

Just Prompt is a lightweight MCP server providing a unified interface to various LLM providers.

Just Prompt is a Model Context Protocol (MCP) server that offers a unified interface for interacting with multiple Large Language Model (LLM) providers, including OpenAI, Anthropic, Google Gemini, Groq, DeepSeek, and Ollama. It simplifies sending prompts to these models and retrieving responses, accepting prompts either as text strings or from files. Just Prompt also includes advanced features such as running multiple models in parallel, automatic model name correction, and saving responses to files. It is particularly useful for users who work with several LLM providers and do not want to maintain a separate integration for each. The server ships with tools to list available providers and models, making it easier to discover and use the capabilities of each backend.
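To give a feel for what "unified interface" means here, the sketch below routes a provider-prefixed model spec to the matching backend. The `provider:model` naming convention, the provider names, and the `send_prompt` helper are illustrative assumptions for this sketch, not just-prompt's actual internals.

```python
# Minimal sketch of dispatching a "provider:model" spec to a per-provider
# send function. The stub functions below stand in for real SDK calls.
from typing import Callable, Dict


def _send_openai(model: str, text: str) -> str:
    # Hypothetical stand-in for a real OpenAI SDK call.
    return f"[openai/{model}] response to: {text}"


def _send_anthropic(model: str, text: str) -> str:
    # Hypothetical stand-in for a real Anthropic SDK call.
    return f"[anthropic/{model}] response to: {text}"


PROVIDERS: Dict[str, Callable[[str, str], str]] = {
    "openai": _send_openai,
    "anthropic": _send_anthropic,
}


def send_prompt(model_spec: str, text: str) -> str:
    """Route a 'provider:model' spec to the matching provider function."""
    provider, _, model = model_spec.partition(":")
    if provider not in PROVIDERS:
        raise ValueError(f"unknown provider: {provider!r}")
    return PROVIDERS[provider](model, text)
```

With this shape, adding a new provider is one entry in the dispatch table rather than a new integration surface for callers.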

Features

  • Unified API for multiple LLM providers
  • Support for text prompts from strings or files
  • Run multiple models in parallel
  • Automatic model name correction using the first model in the --default-models list
  • Ability to save responses to files
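The model-name-correction feature above can be sketched as follows. The matching heuristic (exact match, then closest fuzzy match, then fall back to the first default model) is an assumption for illustration; just-prompt's real correction logic may differ.

```python
# Illustrative sketch of automatic model name correction: resolve a possibly
# inexact model name against a provider's known models, falling back to the
# first entry of a --default-models style list when nothing matches.
import difflib
from typing import List


def correct_model_name(requested: str, available: List[str],
                       default_models: List[str]) -> str:
    if requested in available:
        return requested  # already a valid model name
    # Fuzzy-match typos or shorthand against the known model list.
    close = difflib.get_close_matches(requested, available, n=1, cutoff=0.6)
    if close:
        return close[0]
    # No plausible match: fall back to the first default model.
    return default_models[0]
```

For example, a typo like `gpt4o` would resolve to `gpt-4o` here, while an unrecognizable name falls back to the default.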

Tools

  1. prompt

    Send a prompt to multiple LLM models

  2. prompt_from_file

    Send a prompt from a file to multiple LLM models

  3. prompt_from_file_to_file

    Send a prompt from a file to multiple LLM models and save responses as markdown files

  4. ceo_and_board

    Send a prompt to multiple 'board member' models and have a 'CEO' model make a decision based on their responses

  5. list_providers

    List all available LLM providers

  6. list_models

    List all available models for a specific LLM provider
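The ceo_and_board tool can be pictured as the pattern below: several "board member" models answer the same prompt in parallel, and a "CEO" model is then handed all of their answers and asked to decide. The `call_model` callback is a stand-in for real provider calls, and the prompt wording is an illustrative assumption, not just-prompt's actual template.

```python
# Sketch of the ceo_and_board pattern: fan a prompt out to board models,
# then ask a CEO model to decide based on the collected responses.
from concurrent.futures import ThreadPoolExecutor
from typing import Callable, List


def ceo_and_board(prompt: str, board_models: List[str], ceo_model: str,
                  call_model: Callable[[str, str], str]) -> str:
    # Fan the same prompt out to every board model in parallel.
    with ThreadPoolExecutor() as pool:
        answers = list(pool.map(lambda m: call_model(m, prompt), board_models))
    # Assemble the board's responses into a decision prompt for the CEO.
    briefing = "\n\n".join(
        f"## {model}\n{answer}" for model, answer in zip(board_models, answers)
    )
    decision_prompt = (
        f"Original question:\n{prompt}\n\n"
        f"Board responses:\n{briefing}\n\n"
        "As CEO, weigh these responses and give a final decision."
    )
    return call_model(ceo_model, decision_prompt)
```

The same fan-out step is essentially what the plain `prompt` tool does when given several models; ceo_and_board adds the final synthesis call on top.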