mcp-server-openai
OpenAI MCP Server allows querying OpenAI models directly using the MCP protocol, supporting o3-mini and gpt-4o-mini models.
The OpenAI MCP Server is a protocol server that enables direct interaction with OpenAI models over the Model Context Protocol (MCP). It is designed to integrate with Claude, letting users query OpenAI's o3-mini and gpt-4o-mini models, which are optimized for concise and detailed responses respectively. The server supports configurable message formatting, error handling, and logging for a robust, user-friendly experience.

Installation can be done manually or via Smithery, and requires Python 3.10 or higher along with an OpenAI API key. The server's primary tool, 'ask-openai', lets users pose questions to the models, with responses returned in a standardized JSON structure. Because the server slots into existing MCP settings, it is a versatile option for developers and users who want to reach OpenAI's models from MCP clients.
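As a rough sketch, registering a server like this in an MCP client's settings typically looks like the snippet below. The server key, module name, and environment variable layout here are assumptions for illustration, not taken from this project's documentation; check the project's install instructions for the exact command and arguments.

```json
{
  "mcpServers": {
    "openai": {
      "command": "python",
      "args": ["-m", "mcp_server_openai"],
      "env": {
        "OPENAI_API_KEY": "your-api-key-here"
      }
    }
  }
}
```

The `env` block is where the required OpenAI API key is supplied, so it does not need to be hard-coded into the server itself.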
Features
- Direct integration with OpenAI's API
- Support for multiple models: o3-mini and gpt-4o-mini
- Configurable message formatting
- Error handling and logging
- Simple interface through MCP protocol
Tools
ask-openai
Ask questions directly to the OpenAI models (o3-mini or gpt-4o-mini)
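To make the tool concrete, here is a minimal sketch of what an MCP `tools/call` request for this tool could look like at the JSON-RPC level. The argument names `question` and `model` are illustrative assumptions; the server's actual tool schema defines the real parameter names.

```python
import json


def build_ask_openai_request(question: str,
                             model: str = "gpt-4o-mini",
                             request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 'tools/call' request invoking the ask-openai tool.

    NOTE: the argument names 'question' and 'model' are assumptions for
    illustration; consult the server's tool schema for the real ones.
    """
    request = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {
            "name": "ask-openai",
            "arguments": {"question": question, "model": model},
        },
    }
    return json.dumps(request)


# Serialize a sample request as an MCP client would before sending it
# to the server over stdio.
payload = build_ask_openai_request("What is the Model Context Protocol?")
print(payload)
```

In practice an MCP client library builds and transports this message for you; the sketch only shows the shape of the call that ends up querying the OpenAI model.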