# apo1397/mcp-server-influencer-search
This project demonstrates an MCP Server designed to find influencers using the Modash and Gemini APIs.

## Tools

- `get_influencers_from_modash`: fetches influencers based on specified criteria using the Modash API.
- `get_location_id`: retrieves location IDs for influencer search queries.
## MCP Server Demo

This project demonstrates an MCP (Model Context Protocol) server designed to find influencers based on various criteria, using the Modash API for discovery and the Gemini API for natural language query processing.
## What is an MCP Server?
An MCP Server is a specialized server that exposes a set of tools (functions) that can be called by an AI agent or other clients. It acts as an extension of the agent's capabilities, allowing it to perform complex tasks by leveraging external APIs and custom logic. In this project, our MCP Server provides a tool to search for influencers.
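Conceptually, an MCP server is a named registry of callable tools that a client invokes on the agent's behalf. The following is a minimal, library-free sketch of that idea; the real project uses the `mcp` package, and the tool bodies here are hypothetical stubs, not the project's actual logic:

```python
from typing import Callable, Dict

# Registry mapping tool names to callables, mimicking how an MCP
# server exposes a set of functions to an AI agent.
TOOLS: Dict[str, Callable] = {}

def tool(fn: Callable) -> Callable:
    """Decorator that registers a function as a callable tool."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def get_location_id(query: str) -> int:
    # Hypothetical stub: the real tool would resolve the query
    # against the Modash locations API.
    return len(query)

@tool
def get_influencers_from_modash(location_id: int, min_followers: int) -> list:
    # Hypothetical stub: the real tool would query Modash with filters.
    return [{"location": location_id, "followers": min_followers}]

# A client invokes a tool by name, much as an MCP agent would.
result = TOOLS["get_influencers_from_modash"](location_id=42, min_followers=10_000)
```

In the real server, the `mcp` package handles the registration and the wire protocol; the registry above only illustrates the shape of the contract between agent and server.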
## Project Setup
To set up and run this MCP Server, follow these steps:
### 1. Prerequisites

- Python 3.11 or higher
- `uv` (a fast Python package installer and runner)
- API keys for Modash and Gemini (Google Generative AI)
### 2. Installation

1. Clone the repository:

   ```shell
   git clone <repository_url>
   cd mcp-server-influencer-search
   ```

2. Install the dependencies using `uv`:

   ```shell
   uv pip install "mcp[cli]" google-generativeai google-api-core requests
   ```

   This installs `mcp[cli]`, `google-generativeai`, `google-api-core`, and `requests`, the dependencies declared in `main.py`.
### 3. Environment Variables

Create a `.env` file in the root directory of the project and add your API keys:
```dotenv
MODASH_TOKEN="your_modash_api_key"
GEMINI_API_KEY="your_gemini_api_key"
```
Replace `your_modash_api_key` and `your_gemini_api_key` with your actual API keys.
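When `main.py` runs outside an IDE, these keys must end up in the process environment. Real projects typically use the `python-dotenv` package for this; the hand-rolled parser below is only a sketch of what loading a `.env` file of this shape involves:

```python
import os

def load_dotenv(path: str = ".env") -> None:
    """Parse KEY="value" lines and export them into os.environ."""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue  # skip blank lines and comments
            key, _, value = line.partition("=")
            os.environ[key.strip()] = value.strip().strip('"')

# Example: write a sample .env and load it.
with open(".env.sample", "w") as fh:
    fh.write('MODASH_TOKEN="your_modash_api_key"\n')
    fh.write('GEMINI_API_KEY="your_gemini_api_key"\n')
load_dotenv(".env.sample")
```

After loading, the server code can read the keys with `os.environ["MODASH_TOKEN"]` and `os.environ["GEMINI_API_KEY"]`.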
### 4. Running the MCP Server

#### Manual Run (for testing)

You can run the MCP server manually from your terminal:

```shell
cd <project_directory>
/opt/homebrew/bin/uv run main.py
```

This will start the MCP server, making its tools available.
## MCP Server Configuration for AI IDEs (Trae, Cursor, Claude Code)

To integrate this MCP Server with AI IDEs such as Trae, Cursor, or Claude Code, you need to configure them to recognize and connect to your server. This typically involves creating a `.claude.json` (or similar) file in your project's root directory.

Here's an example configuration for Claude Code (the structure is similar for Trae and Cursor):

`.claude.json` example:
```json
{
  "mcpServers": {
    "Influencer-Search": {
      "command": "/opt/homebrew/bin/uv",
      "args": [
        "run",
        "<path_to_your_project>/main.py"
      ],
      "cwd": "<path_to_your_project>",
      "env": {
        "MODASH_TOKEN": "${MODASH_TOKEN}",
        "GEMINI_API_KEY": "${GEMINI_API_KEY}"
      }
    }
  }
}
```
Explanation of the configuration:

- `name`: a unique name for your MCP server (e.g., "Influencer-Search"). This is how the AI agent will refer to your server.
- `command`: the executable command to run your server. In this case, it's the `uv` executable.
- `args`: a list of arguments passed to the `command`. This specifies how `uv` should run your `main.py` as an MCP server.
- `cwd`: the current working directory where the command will be executed. This is crucial for the `.env` file to be picked up correctly.
- `env`: an object containing environment variables that will be set when the server starts. It's recommended to reference variables from your `.env` file using `${VAR_NAME}` syntax for security and maintainability.
Important Notes for IDE Integration:

- **Restart IDE:** After creating or modifying the `.claude.json` file (or equivalent configuration), you will likely need to restart your AI IDE (Trae, Cursor, Claude Code) for the changes to take effect.
- **Environment Variable Loading:** Ensure your IDE is configured to load environment variables from your `.env` file. The `cwd` setting in the MCP server configuration helps with this.
- **Tool Discovery:** Once the server is running and configured, the AI IDE should automatically discover the tools exposed by your `main.py` (e.g., `get_influencers_from_modash`, `get_location_id`) and make them available for use.
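Once the tools are discovered, a call to `get_influencers_from_modash` would translate the agent's criteria into a Modash filter payload. The endpoint URL and payload field names below are assumptions for illustration, not the documented Modash schema, and no network request is made:

```python
import json

# Assumed endpoint, for illustration only.
MODASH_URL = "https://api.modash.io/v1/instagram/search"

def build_modash_payload(location_id: int, min_followers: int, limit: int = 10) -> dict:
    """Assemble a JSON body of the kind the tool would POST to Modash.

    The nested "filter" structure here is a hypothetical schema;
    consult the Modash API docs for the real field names.
    """
    return {
        "limit": limit,
        "filter": {
            "influencer": {
                "location": [location_id],
                "followers": {"min": min_followers},
            }
        },
    }

payload = build_modash_payload(location_id=148838, min_followers=50_000)
# This string would be sent with the MODASH_TOKEN in an auth header.
body = json.dumps(payload)
```

In the actual flow, Gemini would first parse a natural language query into these structured criteria, and `get_location_id` would supply the `location_id` before the search payload is built.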