

Okta MCP Server (v0.1.0-BETA)

**πŸš€ NEW in v0.1.0:** **Major Architecture Upgrade with FastMCP 2.0!**

The Okta MCP Server is a groundbreaking tool that enables AI models to interact directly with your Okta environment using the Model Context Protocol (MCP). Built specifically for IAM engineers, security teams, and Okta administrators, it implements the MCP specification to transform how AI assistants can help manage and analyze Okta resources.

Quick Demo

πŸŽ‰ What's New in v0.1.0-BETA - Major Architecture Transformation!

This release represents a complete architectural overhaul with these key improvements:

πŸ—οΈ Core Architecture

  • πŸš€ FastMCP 2.0: Migrated from legacy MCP package to FastMCP 2.0 for cutting-edge protocol features
  • 🧹 Cleaner Code: Removed tool_registry.py dependency for simpler, more maintainable codebase
  • ⚑ Better Performance: Modern async patterns and optimized request handling

πŸ› οΈ Enhanced Tools

  • πŸ“ Complete Rewrite: All tools rewritten with better annotations and descriptions for AI understanding
  • πŸ›‘οΈ Improved Validation: Enhanced error handling and input validation across all operations

🌐 Unified Client

  • 🎯 Single CLI: New unified command-line client supporting multiple transports (STDIO, HTTP)
  • βš™οΈ Simplified Setup: Streamlined configuration with intelligent defaults

πŸ” Advanced Security

  • 🎫 Bearer Tokens: Full JWT bearer token support with jwks_uri validation
  • 🏒 Enterprise Auth: Support for enterprise authentication flows and scope-based access

πŸš€ Future-Ready

  • 🎯 Sampling Support: Foundation for advanced request sampling capabilities
  • πŸ”Œ Middleware Ready: Extensible middleware system for custom processing
  • πŸ“‘ Protocol Evolution: Access to latest MCP features as they're developed and standardized

πŸ“‹ Table of Contents

Β 

πŸ” What is the Model Context Protocol?

The Model Context Protocol (MCP) is an open standard that enables AI models to interact with external tools and services in a structured, secure way. It provides a consistent interface for AI systems to discover and use capabilities exposed by servers, allowing AI assistants to extend their functionality beyond their training data.

Think of MCP as the "USB-C of AI integration" - just as USB-C provides a universal standard that allows various devices to connect and communicate regardless of manufacturer, MCP creates a standardized way for AI models to discover and interact with different services without custom integration for each one. This "plug-and-play" approach means developers can build tools once and have them work across multiple AI assistants, while users benefit from seamless integration without worrying about compatibility issues.

Example: "Find all locked users in our Okta tenant, and create a spreadsheet in our IT Operations folder on Google Drive with their names, email addresses, and last login dates." The AI uses Okta MCP Server to query locked users, then passes this data to Google Drive MCP Server to create the spreadsheet - all without custom coding.

⚠️ IMPORTANT: Security & Limitations

Please read this section carefully before using Okta MCP Server.

πŸ”„ Data Flow & Privacy

When you make a request, the interaction happens directly between the LLM and the Okta MCP tools; the client application no longer sits in the middle. All data returned by these tools (including complete user profiles, group memberships, etc.) is sent to the LLM and remains in its context for the duration of that conversation.

Key Privacy Considerations:

  • The LLM (Claude, GPT, etc.) receives and processes all Okta data retrieved by the tools
  • This data remains in the LLM's context for the duration of the conversation
  • You must be comfortable with your Okta user data being processed by the LLM provider's systems
  • Before using these tools, ensure you're comfortable with Okta data being sent to the AI model's servers

πŸ“Š Context Window Limitations

MCP is designed for lightweight workflows similar to Zapier, not bulk data operations.

Recommendation: Limit requests to fewer than 100 entities per transaction. Avoid operations that require fetching large datasets or multiple API calls.

Examples:

❌ Avoid these types of requests:

  • "Fetch all 10,000 users from our Okta tenant and analyze their login patterns"
  • "Find users who do not have Okta Verify enrolled as a factor"

βœ… Better approaches:

  • "Get the most recently created 20 users"
  • "Find users who haven't logged in for 90+ days, limit to first 50 results"

πŸ’‘ For larger data sets and complex queries: Consider using the Okta AI Agent for larger queries and data sets, The agent is being enhanced with similar "actionable" features to handle larger datasets and more complex scenarios in the very near future.

🚨 HTTP Transport Security Warning

The HTTP transport modes (both Streamable HTTP and SSE) have significant security risks:

  • They open unauthenticated HTTP servers with full access to your Okta tenant
  • No authentication or authorization is provided
  • Anyone who can reach the network port can issue commands to your Okta environment
  • EXTREMELY DANGEROUS when using remote HTTP access via mcp-remote

Best Practice: Only use the STDIO transport method (default mode) unless you have specific security controls in place and understand the risks.

πŸ› οΈ Available Tools

The Okta MCP Server currently provides the following tools:

User Management

  • list_okta_users - Retrieve users with filtering, search, and pagination options
  • get_okta_user - Get detailed information about a specific user by ID or login
  • list_okta_user_groups - List all groups that a specific user belongs to
  • list_okta_user_applications - List all application links (assigned applications) for a specific user
  • list_okta_user_factors - List all authentication factors enrolled for a specific user

Group Operations

  • list_okta_groups - Retrieve groups with filtering, search, and pagination options
  • get_okta_group - Get detailed information about a specific group
  • list_okta_group_members - List all members of a specific group
  • list_okta_assigned_applications_for_group - List all applications assigned to a specific group

Application Management

  • list_okta_applications - Retrieve applications with filtering, search, and pagination options
  • list_okta_application_users - List all users assigned to a specific application
  • list_okta_application_group_assignments - List all groups assigned to a specific application

Policy & Network Management

  • list_okta_policy_rules - List all rules for a specific policy with detailed conditions and actions
  • get_okta_policy_rule - Get detailed information about a specific policy rule
  • list_okta_network_zones - List all network zones with IP ranges and configuration details

System Log Events

  • get_okta_event_logs - Retrieve Okta system log events with time-based filtering and search options

Date & Time Utilities

  • get_current_time - Get current UTC time in ISO 8601 format
  • parse_relative_time - Convert natural language time expressions to ISO 8601 format

Additional tools for applications, factors, policies, and more advanced operations are on the roadmap and will be added in future releases.
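
For a sense of how tools like these are exposed, here is a minimal sketch of registering a read-only user-listing tool with FastMCP 2.0 and the official Okta Python SDK. This is illustrative only and is not the server's actual implementation; the function name, parameters, and return shape are assumptions.

# A minimal sketch, not the server's actual code: a read-only Okta tool
# registered with FastMCP 2.0 using the official Okta Python SDK.
import os
from fastmcp import FastMCP
from okta.client import Client as OktaClient

mcp = FastMCP("okta-mcp-sketch")

@mcp.tool()
async def list_okta_users_sketch(limit: int = 20) -> list[dict]:
    """Return basic profile data for up to `limit` users (illustrative)."""
    client = OktaClient({
        "orgUrl": os.environ["OKTA_CLIENT_ORGURL"],
        "token": os.environ["OKTA_API_TOKEN"],
    })
    users, _, err = await client.list_users({"limit": limit})
    if err:
        raise RuntimeError(f"Okta API error: {err}")
    return [{"id": u.id, "login": u.profile.login, "status": str(u.status)} for u in users]

if __name__ == "__main__":
    mcp.run()  # defaults to the STDIO transport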

πŸš€ Quick Start

Prerequisites

βœ… Python 3.8+ installed on your machine
βœ… Okta tenant with appropriate API access
βœ… An MCP-compatible AI client (Claude Desktop, Microsoft Copilot Studio, etc.)

⚠️ Important Model Compatibility Note:
Not all AI models work with this MCP server. Testing has only been performed with:

  • GPT-4o
  • Claude 3.7 Sonnet
  • Google Gemini 2.5 Pro

You must use the latest model versions that explicitly support tool calling (function calling). Older models, or models without tool-calling support, will not be able to interact with the Okta MCP Server.

🧠 Supported AI Providers

The Okta MCP Server supports multiple AI providers through its flexible configuration system. This allows you to connect to various large language models based on your specific needs and existing access.

Currently Supported Providers:

| Provider | Environment Variable | Description |
|----------|----------------------|-------------|
| OpenAI | AI_PROVIDER=openai | Connect to the OpenAI API with models like GPT-4o. Requires an OpenAI API key. |
| Azure OpenAI | AI_PROVIDER=azure_openai | Use Azure-hosted OpenAI models with enhanced security and compliance features. |
| Anthropic | AI_PROVIDER=anthropic | Connect to Anthropic's Claude models (primarily tested with Claude 3.7 Sonnet). |
| Google Vertex AI | AI_PROVIDER=vertex_ai | Use Google's Gemini models via Vertex AI. Requires a Google Cloud service account. |
| OpenAI Compatible | AI_PROVIDER=openai_compatible | Connect to any OpenAI API-compatible endpoint, such as Fireworks.ai, Ollama, or other providers that implement the OpenAI API specification. |

Installation

# Clone the repository
git clone https://github.com/fctr-id/okta-mcp-server.git
cd okta-mcp-server

# Create and activate a virtual environment
python -m venv venv
source venv/bin/activate  # On Windows use: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt

⚠️ NOTICE: Whenever you clone this repository fresh or pull updates, re-run pip install -r requirements.txt to ensure all dependencies are up to date.

Configuration & Usage

Create a config file with your Okta settings:

To use the command-line client (which keeps no conversation memory), follow the instructions below:

# Copy the sample config
cp .env.sample .env

# Edit the env with your settings
# Required: Okta domain and API token and LLM settings

cd clients
python mcp-cli-stdio-client.py
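
Before launching the client, a quick way to confirm that the .env file carries the required settings is a small snippet like the following. This is a hypothetical helper, not part of the project: it assumes python-dotenv is installed and checks only the variable names referenced in this README; your setup may require additional provider-specific settings.

# Hypothetical helper, not part of the project: verify required .env settings.
import os
from dotenv import load_dotenv  # assumes python-dotenv is installed

load_dotenv()
required = ["OKTA_CLIENT_ORGURL", "OKTA_API_TOKEN", "AI_PROVIDER"]
missing = [name for name in required if not os.getenv(name)]
if missing:
    raise SystemExit(f"Missing required .env settings: {', '.join(missing)}")
print("Okta org:", os.getenv("OKTA_CLIENT_ORGURL"))
print("AI provider:", os.getenv("AI_PROVIDER"))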

To use MCP hosts such as Claude Desktop, Claude Code, or VS Code, use the JSON configuration below.

Supported Transports and Launching

The Okta MCP Server supports multiple transport protocols:

1. Standard I/O (STDIO) - Recommended
  • Security: βœ… Direct communication through standard input/output streams
  • Use case: Ideal for desktop AI assistants like Claude Desktop
  • Performance: βœ… Lightweight and efficient
  • Configuration: For Claude Desktop, add to claude_desktop_config.json:
    {
      "mcpServers": {
        "okta-mcp-server": {
          "command": "DIR/okta-mcp-server/venv/Scripts/python",
          "args": [
            "DIR/okta-mcp-server/main.py"
          ],
          "env": {
            "OKTA_CLIENT_ORGURL": "https://dev-1606.okta.com",
            "OKTA_API_TOKEN": "OKTA_API_TOKEN"
          }
        }
      }
    }
    
    Replace DIR with your absolute directory path and OKTA_API_TOKEN with your actual token. (The example shows a Windows virtual-environment path; on macOS/Linux, point to venv/bin/python instead of venv/Scripts/python.)

2. Streamable HTTP Transport - Modern & Current Standard

A modern HTTP-based transport with advanced features:

  • Features: βœ… Real-time event streaming, session management, resumability support
  • Performance: βœ… Better scalability and connection handling
  • Use case: Modern web applications and clients supporting HTTP streaming
  • Security: ⚠️ Local HTTP server - secure in controlled environments

Starting the Streamable HTTP Server:

# Start server with explicit risk acknowledgment
python main.py --http --iunderstandtherisks

# Server will start on http://localhost:3000/mcp
# Connect using streamable HTTP compatible clients

Features:

  • βœ… Real-time streaming - Live progress updates during operations
  • βœ… Session management - Maintains connection state
  • βœ… Event streaming - Server-Sent Events for real-time notifications
  • βœ… Better error handling - Detailed error responses
  • βœ… Modern protocol - Based on latest MCP specifications

For Streamable HTTP Client Testing:

cd clients
python mcp-cli-streamable-client.py

3. Remote HTTP Access - High Risk Advanced Use Only

⚠️ EXTREMELY DANGEROUS - READ CAREFULLY

For MCP clients that don't natively support remote connections, you can use mcp-remote via NPX:

Prerequisites:

  • Node.js and NPM installed
  • Okta MCP Server running in HTTP mode

Setup:

# 1. Install mcp-remote globally
npm install -g mcp-remote

# 2. Start your Okta MCP Server in HTTP mode
python main.py --http --iunderstandtherisks

# 3. Configure your MCP client (e.g., Claude Desktop)

Claude Desktop Configuration:

{
  "mcpServers": {
    "okta-mcp-server": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "http://localhost:3000/mcp"
      ],
      "env": {
        "OKTA_CLIENT_ORGURL": "https://dev-1606.okta.com",
        "OKTA_API_TOKEN": "your_actual_api_token"
      }
    }
  }
}

🚨 CRITICAL SECURITY WARNINGS:

  • NEVER use in production environments
  • NEVER expose the HTTP port (3000) to public networks
  • ANYONE with network access can control your Okta tenant
  • No authentication or authorization protection
  • All Okta operations are exposed without restrictions
  • Use only in isolated, secure development environments
  • Consider this approach only if STDIO transport is absolutely not feasible

When might you need this approach:

  • Testing MCP integrations that require HTTP transport
  • Specific client applications that can't use STDIO
  • Development scenarios requiring HTTP debugging
  • NEVER for production or shared environments

4. Server-Sent Events (SSE) - Deprecated

⚠️ DEPRECATED: SSE transport is deprecated and not recommended for new implementations.

# Run in SSE mode (requires explicit risk acknowledgment)
python main.py --sse --iunderstandtherisks

  • Use case: Legacy MCP clients that specifically require SSE (not recommended)
  • Security: ⚠️ Same HTTP security risks as Streamable HTTP
  • Recommendation: Use Streamable HTTP transport instead for all new implementations

5. Docker Deployment

The Okta MCP Server provides Docker images for all transport types, offering containerized deployment options.

Running Docker Containers

STDIO Transport (Recommended): For Claude Desktop or other MCP clients, configure to use the Docker container:

{
  "mcpServers": {
    "okta-mcp-server": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "OKTA_CLIENT_ORGURL",
        "-e", "OKTA_API_TOKEN",
        "fctrid/okta-mcp-server:stdio"
      ],
      "env": {
        "OKTA_CLIENT_ORGURL": "https://your-org.okta.com",
        "OKTA_API_TOKEN": "your_api_token"
      }
    }
  }
}

Streamable HTTP Transport (Current Standard):

# Start the HTTP container
docker run -d --name okta-mcp-http \
  -p 3000:3000 \
  -e OKTA_API_TOKEN=your_api_token \
  -e OKTA_CLIENT_ORGURL=https://your-org.okta.com \
  fctrid/okta-mcp-server:http

# Configure your MCP client to connect to http://localhost:3000/mcp

SSE Transport (Deprecated - Not Recommended):

# Start the SSE container (deprecated)
docker run -d --name okta-mcp-sse \
  -p 3000:3000 \
  -e OKTA_API_TOKEN=your_api_token \
  -e OKTA_CLIENT_ORGURL=https://your-org.okta.com \
  fctrid/okta-mcp-server:sse

# Configure your MCP client to connect to http://localhost:3000/sse

Building Images Locally:

# Build all variants
docker build --target stdio -t okta-mcp-server:stdio .
docker build --target http -t okta-mcp-server:http .
docker build --target sse -t okta-mcp-server:sse .
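
To verify either of the HTTP deployments above (the local --http mode or the Docker http image), a minimal FastMCP 2.0 client sketch such as the following can connect to the Streamable HTTP endpoint, list the available tools, and invoke one. It is a sketch under the assumption that the server is reachable at http://localhost:3000/mcp and that the fastmcp package is installed:

# A minimal sketch, assuming a server is already running at http://localhost:3000/mcp.
import asyncio
from fastmcp import Client

async def main():
    async with Client("http://localhost:3000/mcp") as client:
        tools = await client.list_tools()
        print("Available tools:", [tool.name for tool in tools])
        result = await client.call_tool("get_current_time", {})
        print("get_current_time result:", result)

asyncio.run(main())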

⚠️ Good to Know

Beta Release πŸ§ͺ

  • Completely rewritten architecture with FastMCP 2.0
  • Enhanced stability and performance compared to previous alpha releases
  • Comprehensive tool system with improved AI integration
  • More suitable for development and testing environments
  • Production readiness being evaluated with enhanced security features

Security First πŸ›‘οΈ

  • Designed for least-privilege operation
  • Default read-only access to Okta resources
  • Future write operations will require explicit approval flows

Current Limitations πŸ”

  • Starting with a limited set of read-only tools for users and groups
  • Planning to expand API coverage rapidly in upcoming releases
  • Some complex Okta relationships not yet exposed
  • Performance with very large Okta instances not yet optimized
  • Requires direct network access to Okta API endpoints

πŸ—ΊοΈ Roadmap

v0.1.0-BETA - Current (MAJOR ARCHITECTURAL OVERHAUL!)

  • Complete migration to FastMCP 2.0 architecture
  • Comprehensive rewrite of all tools with enhanced annotations
  • New unified CLI client supporting multiple transports
  • Eliminated tool_registry.py dependency for cleaner codebase
  • Advanced bearer token support with jwks_uri validation
  • Significantly enhanced error handling and validation
  • Performance optimizations and modern async patterns

v0.3.0 - Previous

  • Streamable HTTP transport support
  • Real-time event streaming
  • Session management and resumability
  • Enhanced client applications

Future plans include:

  • Complete user lifecycle operations
  • Application assignment management
  • Group membership operations
  • Factor enrollment and verification
  • Policy and rule management
  • Approval workflows for sensitive operations
  • Multi-channel approval options (web, email, Slack)
  • Audit logging and compliance reporting
  • System log integration
  • Security insights generation
  • Multi-tenant support
  • Role-based access control

πŸ†˜ Need Help?

Before raising an issue, check:

  1. πŸ“ Server configuration
  2. πŸ”‘ Okta API permissions
  3. πŸ”Œ MCP client compatibility
  4. πŸ“Š Server logs

Still having problems? Open an issue on GitHub or reach out by email (response times may vary).

πŸ’‘ Feature Requests & Ideas

Have an idea or suggestion? Open a feature request on GitHub!

πŸ‘₯ Contributors

Interested in contributing? We'd love to have you! Get in touch to discuss collaboration opportunities.

βš–οΈ Legal Stuff

Check out the license file for the fine print.


🌟 © 2025 Fctr Identity. All rights reserved. Made with ❀️ for the Okta and AI communities.