wei/mymlh-mcp-server
A Model Context Protocol (MCP) server that provides secure, OAuth-authenticated access to MyMLH user data. It enables AI assistants and MCP clients to interact with the MyMLH API on behalf of users.
Features
- Secure Authentication: Implements the MyMLH v4 API with OAuth for robust, secure user authentication.
- User Data Access: Provides tools to fetch a user's MyMLH profile, education, employment history, and more.
- Automatic Token Management: Handles token refresh and secure storage automatically.
- Cloudflare Workers: Built to run on the edge for low-latency, scalable performance.
- Easy Deployment: Can be deployed to your own Cloudflare account in minutes.
Quick Start
You can connect to our publicly hosted instance using any MCP client that supports the Remote HTTP transport with OAuth.
Endpoint: https://mymlh-mcp.git.ci/mcp
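Under the Streamable HTTP transport, a session begins with a JSON-RPC `initialize` request POSTed to the endpoint; the client then completes the OAuth flow when challenged. A minimal sketch of that first payload (the client name and protocol version string are illustrative, not values this server requires):

```python
import json

# JSON-RPC 2.0 initialize request, the first message an MCP client
# sends over the Streamable HTTP transport.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",  # illustrative protocol revision
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# This body would be POSTed to https://mymlh-mcp.git.ci/mcp with
# Content-Type: application/json (plus an OAuth bearer token once issued).
body = json.dumps(initialize_request)
print(body)
```

MCP clients listed below handle this handshake for you; the sketch only shows what travels over the wire.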
Install MCP Server
Here are examples for common MCP clients:
VS Code:
{
"servers": {
"mymlh": {
"type": "http",
"url": "https://mymlh-mcp.git.ci/mcp"
}
}
}
Cursor and many clients:
{
"mcpServers": {
"mymlh": {
"url": "https://mymlh-mcp.git.ci/mcp"
}
}
}
Windsurf and many clients:
{
"mcpServers": {
"mymlh": {
"serverUrl": "https://mymlh-mcp.git.ci/mcp"
}
}
}
Augment Code:
{
"mcpServers": {
"mymlh": {
"url": "https://mymlh-mcp.git.ci/mcp",
"type": "http"
}
}
}
Claude Code:
claude mcp add --transport http mymlh https://mymlh-mcp.git.ci/mcp
Gemini CLI:
Gemini currently supports only the deprecated SSE transport, so this configuration uses the /sse endpoint:
{
"mcpServers": {
"mymlh": {
"url": "https://mymlh-mcp.git.ci/sse"
}
}
}
Roo Code, Cline: Even though these clients support streamable HTTP, they do not yet support the OAuth authentication flow, so you will need to use the fallback option below. See the open feature requests for Roo Code and Cline.
For other clients, please consult their documentation for connecting to an MCP server. If you see 401 errors, the client likely does not support the Remote HTTP transport with OAuth, and you will need to use the fallback option below.
Fallback Option
For environments where Remote HTTP with OAuth is not supported, you can fall back to the stdio transport with mcp-remote. This wraps the hosted MCP server in a local stdio interface, forwarding requests over HTTP behind the scenes to ensure compatibility.
Example mcp-remote configuration snippet:
{
"mcpServers": {
"mymlh": {
"command": "mcp-remote",
"args": [
"https://mymlh-mcp.git.ci/mcp"
]
}
}
}
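If mcp-remote is not installed globally, most clients can invoke it through npx instead. A sketch of that variant (assumes Node.js is available on your PATH):

```json
{
  "mcpServers": {
    "mymlh": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://mymlh-mcp.git.ci/mcp"
      ]
    }
  }
}
```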
Available Tools
Once connected and authenticated, you can use the following tools:
- Get User Info: Fetches your complete MyMLH profile.
- Get Token Details: Inspects the details of your current authentication token.
- Refresh Token: Manually refreshes your authentication token.
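After the session is initialized, tools are invoked with a JSON-RPC `tools/call` request. A sketch of the message for the user-info tool (the identifier `get_user_info` is an assumption for illustration; the real names are returned by the server's `tools/list` response):

```python
import json

# JSON-RPC 2.0 tools/call request for the user-info tool.
# "get_user_info" is a hypothetical identifier; query tools/list
# on the live server to discover the actual tool names.
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "get_user_info",
        "arguments": {},  # this tool takes no arguments in this sketch
    },
}

body = json.dumps(call_request)
print(body)
```

In practice your MCP client issues this call when you ask it, in natural language, for your MyMLH profile.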
Testing with MCP Inspector
You can test the remote MCP server using the Model Context Protocol Inspector.
- Run the Inspector from your terminal:
npx @modelcontextprotocol/inspector@latest
- Enter the server URL https://mymlh-mcp.git.ci/mcp and click "Connect".
- Follow the authentication flow to connect and test the tools.
Testing with Cloudflare AI Playground
You can also test the server directly using the Cloudflare Workers AI LLM Playground.
- Go to the playground link.
- Enter the server URL:
https://mymlh-mcp.git.ci/mcp
- Follow the authentication flow to connect and test the tools.
Example Usage
You can interact with the MyMLH MCP server using natural language in your AI assistant:
- "Get my MyMLH user info."
- "Show me my MyMLH profile."
- "Generate a resume using my MyMLH profile."
- "Create a GitHub profile README using my MyMLH data."
Deploying Your Own Instance
For full control, you can deploy your own instance to Cloudflare. See the deployment guide for detailed instructions.
Contributing
We welcome contributions! Whether you're fixing a bug, adding a feature, or improving documentation, your help is appreciated.
Please read our contributing guidelines to get started.