LinkedIn Web Scraper MCP Server
A Model Context Protocol (MCP) server that provides LinkedIn web scraping capabilities as tools for AI assistants. This server uses Playwright to automate LinkedIn people search and extract profile information, exposing these capabilities through the MCP protocol.
Features
- MCP Tool Integration: Exposes LinkedIn scraping as MCP tools for AI assistants
- People Search: Search LinkedIn profiles using keywords, location, and network filters
- Profile Extraction: Extract profile names, URLs, and headlines from search results
- Session Management: Automatic LinkedIn login with cookie persistence
- Adaptive Selectors: Handles LinkedIn UI changes with multiple CSS selector strategies
- Network Filtering: Filter by connection degree (1st, 2nd, 3rd+ connections)
- Location Support: Filter by location using LinkedIn's geoUrn codes or location strings
Installation
- Clone the repository:
git clone https://github.com/Phicks-debug/linkedin-web-scrapper.git
cd linkedin-web-scrapper-mcp-server
- Install dependencies:
npm install
- Install Playwright browsers:
npx playwright install
- Configure your LinkedIn credentials:
cp config.example.json config.json
Then edit config.json with your LinkedIn credentials (how the browser options are used is sketched after these installation steps):
```json
{
  "linkedin": {
    "email": "your-linkedin-email@email.com",
    "password": "your-linkedin-password"
  },
  "browser": {
    "headless": false,
    "slowMo": 1000,
    "cookiesPath": "./cookies.json"
  }
}
```
- Build the server:
npm run build
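For reference, the browser block in config.json maps onto Playwright's launch options, and cookiesPath is where the session cookies are persisted. The following is a minimal sketch of how such a config could drive Playwright; it is illustrative only, not the project's actual code, and the launchBrowser helper is an assumption:

```typescript
import { chromium } from "playwright";
import { existsSync, readFileSync } from "fs";

// Shape assumed from the config.json example above.
const config = JSON.parse(readFileSync("./config.json", "utf-8"));

// Hypothetical helper: launch Chromium with the configured options and
// reuse a previously saved LinkedIn session if the cookies file exists.
async function launchBrowser() {
  const browser = await chromium.launch({
    headless: config.browser.headless, // false keeps the window visible
    slowMo: config.browser.slowMo,     // delay (ms) between Playwright actions
  });
  const context = await browser.newContext();

  if (existsSync(config.browser.cookiesPath)) {
    const cookies = JSON.parse(readFileSync(config.browser.cookiesPath, "utf-8"));
    await context.addCookies(cookies); // skip the login flow when possible
  }
  return { browser, context };
}
```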
Usage
As an MCP Server
This server is designed to be used with MCP-compatible AI assistants. The server exposes LinkedIn scraping functionality through the MCP protocol.
Starting the MCP Server
# Start the server (connects via stdio)
node dist/index.js
# For development with auto-rebuild
npm run watch
Using MCP Inspector (Development)
Test the server using the MCP Inspector:
npm run inspector
Available MCP Tools
search-linkedin-people
Search for LinkedIn profiles using web scraping.
Input Schema:
```jsonc
{
  "keywords": "software engineer", // Required: Keywords to search for
  "location": "105646813",         // Optional: Location filter (geoUrn or location string)
  "network": "F"                   // Optional: Network degree filter
}
```
Network Filter Options:
"F"
- 1st degree connections only"S"
- 2nd degree connections"O"
- 3rd+ degree connections (out of network)
Location Examples:
"105646813"
- Spain (using LinkedIn geoUrn)"San Francisco"
- Location string- Default:
"104195383"
if not specified
Response Format:
```json
{
  "success": true,
  "count": 10,
  "profiles": [
    {
      "name": "John Doe",
      "profileUrl": "https://www.linkedin.com/in/johndoe",
      "headline": "Senior Software Engineer at Tech Company"
    }
  ],
  "filters": {
    "keywords": "software engineer",
    "location": "105646813",
    "network": "F"
  }
}
```
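For context, the snippet below shows roughly how a tool like this is declared with the MCP TypeScript SDK. It is a sketch, not the code in index.ts; the searchLinkedInPeople helper and the version string are assumptions:

```typescript
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import {
  CallToolRequestSchema,
  ListToolsRequestSchema,
} from "@modelcontextprotocol/sdk/types.js";

// Hypothetical helper that drives Playwright; stands in for the real scraper.
declare function searchLinkedInPeople(
  keywords: string,
  location?: string,
  network?: string
): Promise<unknown>;

const server = new Server(
  { name: "linkedin-web-scrapper", version: "0.1.0" },
  { capabilities: { tools: {} } }
);

// Advertise the tool with the input schema documented above.
server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [
    {
      name: "search-linkedin-people",
      description: "Search for LinkedIn profiles using web scraping",
      inputSchema: {
        type: "object",
        properties: {
          keywords: { type: "string", description: "Keywords to search for" },
          location: { type: "string", description: "geoUrn or location string" },
          network: { type: "string", enum: ["F", "S", "O"] },
        },
        required: ["keywords"],
      },
    },
  ],
}));

// Run the scraper and return the JSON response shown above as text content.
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  const { keywords, location, network } = request.params.arguments as {
    keywords: string;
    location?: string;
    network?: string;
  };
  const result = await searchLinkedInPeople(keywords, location, network);
  return { content: [{ type: "text", text: JSON.stringify(result, null, 2) }] };
});

await server.connect(new StdioServerTransport());
```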
MCP Integration
Adding to Claude Desktop
Add this server to your Claude Desktop MCP configuration:
```json
{
  "mcpServers": {
    "linkedin": {
      "command": "node",
      "args": ["/path/to/linkedin-web-scrapper-mcp-server/dist/index.js"],
      "cwd": "/path/to/linkedin-web-scrapper-mcp-server"
    }
  }
}
```
Using with Other MCP Clients
The server follows the standard MCP protocol and can be used with any MCP-compatible client by connecting to the stdio transport.
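For example, a client built with the MCP TypeScript SDK could spawn the server over stdio and call the tool like this (a sketch; the client name and path are placeholders):

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the compiled server and communicate with it over stdio.
const transport = new StdioClientTransport({
  command: "node",
  args: ["/path/to/linkedin-web-scrapper-mcp-server/dist/index.js"],
});

const client = new Client(
  { name: "example-client", version: "1.0.0" },
  { capabilities: {} }
);
await client.connect(transport);

// Invoke the search tool with the arguments documented above.
const result = await client.callTool({
  name: "search-linkedin-people",
  arguments: { keywords: "software engineer", network: "F" },
});
console.log(result.content);
```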
How It Works
- MCP Protocol: Exposes LinkedIn scraping as standardized MCP tools
- Browser Automation: Uses Playwright to control Chrome/Chromium browser
- Session Persistence: Saves LinkedIn session cookies to avoid repeated logins
- People Search: Navigates to LinkedIn people search with specified filters
- Profile Extraction: Extracts profile data using adaptive CSS selectors (see the sketch after this list)
- Structured Output: Returns JSON-formatted results via MCP protocol
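To illustrate the adaptive-selector idea, the sketch below tries a list of candidate selectors in order and extracts the name, URL, and headline from whichever strategy matches. The selectors and class names are made up for the example; they are not the ones shipped in index.ts:

```typescript
import type { Page } from "playwright";

// Candidate selectors tried in order, so a single LinkedIn UI change
// does not break extraction. These are illustrative, not the real ones.
const RESULT_CARD_SELECTORS = [
  "li.reusable-search__result-container",
  "div.entity-result",
];

async function extractProfiles(page: Page) {
  for (const selector of RESULT_CARD_SELECTORS) {
    const cards = await page.$$(selector);
    if (cards.length === 0) continue; // fall through to the next strategy

    const profiles: { name: string; profileUrl: string; headline: string }[] = [];
    for (const card of cards) {
      const link = await card.$("a[href*='/in/']");
      const headline = await card.$(".entity-result__primary-subtitle");
      profiles.push({
        name: (await link?.textContent())?.trim() ?? "",
        profileUrl: (await link?.getAttribute("href")) ?? "",
        headline: (await headline?.textContent())?.trim() ?? "",
      });
    }
    return profiles;
  }
  return []; // no known selector matched the current LinkedIn markup
}
```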
Development
Scripts
| Script | Description |
|---|---|
| npm run build | Compile TypeScript and make executable |
| npm run watch | Watch mode for development |
| npm run inspector | Launch MCP Inspector for testing |
| npm run dev | Build and run the server |
Project Structure
```
├── index.ts       # Main MCP server implementation
├── config.json    # LinkedIn credentials and browser settings
├── cookies.json   # Saved session cookies (auto-generated)
├── package.json   # MCP server configuration
└── dist/          # Compiled JavaScript output
```
Security & Privacy
- Local Credentials: Your LinkedIn credentials are stored locally in config.json
- Session Cookies: Saved locally in cookies.json for session persistence
- No Data Transmission: No data is sent anywhere except to LinkedIn for scraping
- Browser Automation: Uses a visible browser window to avoid detection
Technical Details
- Protocol: Model Context Protocol (MCP) 0.6.0
- Runtime: Node.js with TypeScript
- Browser Engine: Playwright with Chromium
- Transport: Standard I/O (stdio) for MCP communication
- Target: LinkedIn people search (web interface, via browser automation)
Error Handling
The server handles common scenarios (a sketch of the general error-handling pattern follows this list):
- Automatic LinkedIn login when session expires
- LinkedIn security challenges (requires manual intervention)
- UI changes through adaptive selectors
- Network timeouts and connection issues
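As an illustration of that pattern (not the exact code in index.ts), a scraping call can be wrapped so that failures, including Playwright timeouts, are returned as structured MCP tool errors instead of crashing the stdio server; the safeSearch wrapper below is an assumption:

```typescript
// Wrap a scraping call so failures surface as MCP tool errors.
async function safeSearch(runSearch: () => Promise<unknown>) {
  try {
    const result = await runSearch();
    return {
      content: [{ type: "text", text: JSON.stringify(result, null, 2) }],
    };
  } catch (err) {
    // Covers Playwright timeouts, expired sessions, and selector misses.
    const message = err instanceof Error ? err.message : String(err);
    return {
      content: [{ type: "text", text: JSON.stringify({ success: false, error: message }) }],
      isError: true,
    };
  }
}
```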
Limitations
- LinkedIn Terms: Use responsibly and respect LinkedIn's terms of service
- Rate Limiting: Avoid excessive requests to prevent detection
- Manual Challenges: Security challenges require manual completion
- UI Dependencies: May need updates if LinkedIn significantly changes their UI
License
MIT License - see LICENSE file for details.
Contributing
- Fork the repository
- Create a feature branch
- Make your changes
- Test with MCP Inspector
- Submit a pull request
For issues and feature requests, please use the GitHub issues page.