# TikTok API Docs MCP Server
An MCP (Model Context Protocol) server that provides semantic search and retrieval for TikTok API documentation using OpenAI's vector store. Compatible with ChatGPT connectors, deep research, and API integrations.
๐ฏ Purpose
This server enables AI models to search and retrieve TikTok API documentation through the Model Context Protocol. It's designed to work with:
- ChatGPT Connectors for enhanced chat capabilities
- Deep Research models (o4-mini-deep-research)
- Any MCP-compatible client
## Features
- Semantic Search: Search TikTok API documentation using natural language queries
- Document Retrieval: Fetch full documentation content by ID
- Vector Store Integration: Powered by OpenAI's vector store for accurate semantic search
- Dual Transport Support: Run via stdio (local) or SSE/HTTP (remote)
- ChatGPT Compatible: Implements the required `search` and `fetch` tools for ChatGPT integration
## Prerequisites
- OpenAI API Key: Required for vector store operations
- Bun Runtime: This project uses Bun for optimal performance
- TikTok Documentation: Automatically downloaded during setup
## Quick Start
### 1. Install Dependencies

```bash
bun install
```
### 2. Set up OpenAI API Key

```bash
export OPENAI_API_KEY=sk-your-api-key-here
```
### 3. Initialize Vector Store with TikTok Docs

```bash
# This downloads the TikTok docs and uploads them to an OpenAI vector store
bun run src/scripts/tikTokDocsToVectorStore.ts
```
This will:
- Download all TikTok API documentation (~200+ files)
- Create an OpenAI vector store named "TikTok API Documentation"
- Upload and index all documentation
- Save configuration to `tiktok-docs/vector-store-config.json`
### 4. Start the MCP Server

For HTTP/SSE mode (ChatGPT and remote access):

```bash
bun run start:http
```

For stdio mode (local MCP clients):

```bash
bun run start
```
## Available Tools

### `search`

Search TikTok API documentation for relevant information.

Parameters:
- `query` (string): Search query
Returns:

```json
{
  "results": [
    {
      "id": "file_123",
      "title": "Campaign Management",
      "text": "Relevant snippet...",
      "url": "https://platform.tiktok.com/docs/campaign-management"
    }
  ]
}
```
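The `SearchResult` shape below is inferred from the example response above; this is an illustrative sketch of client-side handling, not a type exported by this project:

```typescript
// Shape inferred from the example `search` response (illustrative only).
interface SearchResult {
  id: string;
  title: string;
  text: string;
  url: string;
}

// Collect the titles of all returned documents, e.g. for display.
function resultTitles(response: { results: SearchResult[] }): string[] {
  return response.results.map((r) => r.title);
}
```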
### `fetch`

Retrieve the full content of a documentation file.

Parameters:
- `id` (string): File ID from search results
Returns:

```json
{
  "id": "file_123",
  "title": "Campaign Management",
  "text": "Full document content...",
  "url": "https://platform.tiktok.com/docs/campaign-management",
  "metadata": {...}
}
```
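A typical client flow is to `search` first and then `fetch` the top hit. A minimal sketch with the two tool calls abstracted as injected functions (the signatures are hypothetical; the response shapes mirror the Returns examples in this README):

```typescript
// Shapes mirroring the example `search` and `fetch` responses above.
interface SearchHit { id: string; title: string; text: string; url: string }
interface Doc { id: string; title: string; text: string; url: string; metadata?: Record<string, unknown> }

// `search` and `fetchDoc` stand in for the MCP tool invocations;
// returns the full document for the first search hit, or null if none.
async function topDocument(
  search: (query: string) => Promise<{ results: SearchHit[] }>,
  fetchDoc: (id: string) => Promise<Doc>,
  query: string,
): Promise<Doc | null> {
  const { results } = await search(query);
  return results.length > 0 ? fetchDoc(results[0].id) : null;
}
```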
### `vector_store_status`

Check vector store configuration status.

Returns:

```json
{
  "configured": true,
  "store_id": "vs_abc123",
  "message": "Vector store is configured and ready"
}
```
## Integration with ChatGPT

### Via ChatGPT Connectors

1. Go to ChatGPT Settings → Connectors
2. Add a new MCP server:
   - URL: `https://your-server-url/sse/`
   - Tools: `search`, `fetch`
   - Approval: Set to "never" for deep research
### Via OpenAI API

```bash
curl https://api.openai.com/v1/responses \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "o4-mini-deep-research",
    "input": [{
      "role": "user",
      "content": [{
        "type": "input_text",
        "text": "How do I create a TikTok ad campaign?"
      }]
    }],
    "tools": [{
      "type": "mcp",
      "server_label": "tiktok-docs",
      "server_url": "http://localhost:3001/sse/",
      "allowed_tools": ["search", "fetch"],
      "require_approval": "never"
    }]
  }'
```
## Project Structure

```
├── src/
│   ├── core/
│   │   ├── services/
│   │   │   ├── vector-store-service.ts   # OpenAI vector store operations
│   │   │   └── greeting-service.ts       # Legacy example service
│   │   ├── tools.ts                      # MCP tool definitions
│   │   ├── resources.ts                  # MCP resources
│   │   └── prompts.ts                    # MCP prompts
│   ├── scripts/
│   │   ├── tikTokDocsToVectorStore.ts    # Setup script for vector store
│   │   └── getTikTokDocsMd.ts            # TikTok docs downloader
│   ├── server/
│   │   ├── http-server.ts                # SSE/HTTP server
│   │   └── server.ts                     # Core server setup
│   └── index.ts                          # stdio server entry
├── tiktok-docs/                          # Downloaded documentation (gitignored)
│   └── vector-store-config.json          # Vector store configuration
├── MCP_SERVER_README.md                  # Detailed MCP server documentation
└── README.md                             # This file
```
## Testing

### Test the Server

```bash
# Run the test script
node test-mcp-server.js
```
### Manual Testing

Search for documentation:

```bash
curl -X POST http://localhost:3001/sse/ \
  -H "Content-Type: application/json" \
  -d '{
    "jsonrpc": "2.0",
    "method": "tools/call",
    "params": {
      "name": "search",
      "arguments": {"query": "campaign creation"}
    }
  }'
```
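The same `tools/call` request can be built programmatically. A minimal TypeScript sketch whose payload mirrors the curl example above (the helper name is illustrative, not part of this project):

```typescript
// Build a JSON-RPC 2.0 payload for the MCP `search` tool,
// matching the body of the curl example above.
function buildSearchRequest(query: string) {
  return {
    jsonrpc: "2.0" as const,
    method: "tools/call",
    params: {
      name: "search",
      arguments: { query },
    },
  };
}

// Serialize for use as an HTTP POST body.
const body = JSON.stringify(buildSearchRequest("campaign creation"));
```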
## Scripts

### Vector Store Management

```bash
# Download docs and sync to vector store
bun run src/scripts/tikTokDocsToVectorStore.ts

# Search the vector store
bun run src/scripts/tikTokDocsToVectorStore.ts --search "your query"
```

### Server Commands

```bash
# Production
bun run start        # stdio mode
bun run start:http   # HTTP/SSE mode

# Development (with auto-reload)
bun run dev          # stdio mode
bun run dev:http     # HTTP/SSE mode

# Build
bun run build        # Build stdio server
bun run build:http   # Build HTTP server
```
## Configuration

### Environment Variables

```bash
OPENAI_API_KEY=sk-...   # Required: OpenAI API key
PORT=3001               # HTTP server port (default: 3001)
```
### Vector Store Configuration

After running the setup script, configuration is saved to `tiktok-docs/vector-store-config.json`:

```json
{
  "vectorStoreId": "vs_abc123",
  "vectorStoreName": "TikTok API Documentation",
  "lastSync": "2024-01-01T00:00:00Z",
  "filesCount": 200
}
```
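A small validator for this file might look like the following sketch; the field names come from the example above, and the `vs_` prefix check reflects the store IDs shown in this README (the function itself is illustrative, not part of this project):

```typescript
// Shape of tiktok-docs/vector-store-config.json, per the example above.
interface VectorStoreConfig {
  vectorStoreId: string;
  vectorStoreName: string;
  lastSync: string;
  filesCount: number;
}

// Type guard: true when parsed JSON has the fields the server expects.
function isVectorStoreConfig(value: unknown): value is VectorStoreConfig {
  const v = value as Partial<VectorStoreConfig> | null;
  return (
    !!v &&
    typeof v.vectorStoreId === "string" &&
    v.vectorStoreId.startsWith("vs_") &&
    typeof v.vectorStoreName === "string" &&
    typeof v.lastSync === "string" &&
    typeof v.filesCount === "number"
  );
}
```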
## Deployment

### Using PM2

```bash
# Install PM2
npm install -g pm2

# Start the server
pm2 start bun --name "tiktok-mcp" -- run start:http

# Save PM2 configuration
pm2 save
pm2 startup
```
### Using Docker

```dockerfile
FROM oven/bun:latest
WORKDIR /app
COPY package.json bun.lockb ./
RUN bun install
COPY . .
ENV PORT=3001
EXPOSE 3001
CMD ["bun", "run", "start:http"]
```
## Security Considerations
- API Keys: Never commit API keys to version control
- HTTPS: Use HTTPS in production environments
- Authentication: Implement authentication for public deployments
- Rate Limiting: Consider implementing rate limiting for API endpoints
- CORS: Configure appropriate CORS headers for your use case
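For the rate-limiting point above, a fixed-window counter is often a sufficient starting point. A minimal in-memory sketch (illustrative only, not part of this project; production deployments usually want a shared store such as Redis instead):

```typescript
// Fixed-window rate limiter keyed by client identifier (e.g. IP address).
class RateLimiter {
  private counts = new Map<string, { windowStart: number; count: number }>();

  constructor(
    private readonly limit: number,     // max requests per window
    private readonly windowMs: number,  // window length in milliseconds
  ) {}

  // Returns true if the request is allowed, false if over the limit.
  allow(key: string, now: number = Date.now()): boolean {
    const entry = this.counts.get(key);
    if (!entry || now - entry.windowStart >= this.windowMs) {
      // First request in a new window: reset the counter.
      this.counts.set(key, { windowStart: now, count: 1 });
      return true;
    }
    if (entry.count >= this.limit) return false;
    entry.count += 1;
    return true;
  }
}
```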
## Documentation

- MCP_SERVER_README.md - Detailed MCP implementation guide
- Model Context Protocol - Official MCP documentation
- FastMCP Framework - Framework documentation
- OpenAI Vector Stores - Vector store API guide
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
## License

This project is licensed under the MIT License; see the LICENSE file for details.
## Acknowledgments
- Built with FastMCP
- Powered by OpenAI Vector Stores
- TikTok API documentation from TikTok for Business