Pinecone MCP Server
Minimal remote MCP server with SSE transport for OpenAI Deep Research.
What You Need
- Python 3.10+
- Pinecone index (1536 dimensions)
- OpenAI API key
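The 1536-dimension requirement matches the output size of OpenAI's text-embedding-3-small and text-embedding-ada-002 models. A small sanity check before upserting vectors can catch a mismatched index early (the helper name is illustrative, not part of this project):

```python
def check_dimension(vector: list, expected: int = 1536) -> list:
    """Raise if an embedding's length doesn't match the index dimension."""
    if len(vector) != expected:
        raise ValueError(f"expected {expected} dims, got {len(vector)}")
    return vector
```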
Setup
# Install (requires Python 3.10+)
pip install -r requirements.txt
# Test tools work
python test_tools.py
# Run server
python pinecone_mcp.py
Configuration
Edit the env file with your keys:
PINECONE_API_KEY=your_key
PINECONE_INDEX=your_index
PINECONE_HOST=your_host
OPENAI_API_KEY=your_key
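If you want to fail fast on a misconfigured deployment, a few lines of Python can parse and validate the env file before the server starts. This is a minimal sketch; the helper names are illustrative and not part of the project:

```python
REQUIRED = ["PINECONE_API_KEY", "PINECONE_INDEX", "PINECONE_HOST", "OPENAI_API_KEY"]

def load_env(text: str) -> dict:
    """Parse KEY=VALUE lines, skipping blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

def missing_keys(env: dict) -> list:
    """Return required keys that are absent or empty."""
    return [k for k in REQUIRED if not env.get(k)]
```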
Deploy to Railway
- Push to GitHub
- Connect on railway.app
- Add environment variables from the env file
- Deploy
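Railway reads the start command from the Procfile. Assuming the filenames listed below, a minimal one would look like:

```
web: python pinecone_mcp.py
```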
Use with Deep Research
from openai import OpenAI
client = OpenAI()
response = client.responses.create(
    model="o4-mini-deep-research",
    input="Search my database",
    tools=[{
        "type": "mcp",
        "server_label": "pinecone",
        "server_url": "https://your-server.railway.app",
        "require_approval": "never",
    }],
)
print(response.output_text)
Files
- pinecone_mcp.py - Main server (search + fetch tools)
- test_tools.py - Test your setup works
- env - Configuration
- requirements.txt - Dependencies
- Procfile - Deploy config
That's it. See GUIDE.md for more details if needed.