orla

Repository: dorcha-inc/orla
Version: 3.6


Orla is a fast and extensible Model Context Protocol (MCP) server and router with auto-discovery capabilities.
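For background, MCP is a JSON-RPC 2.0 protocol. The sketch below shows the generic MCP wire format for an initialize request, as defined by the MCP specification; it is not Orla-specific code, and the client name and transport details are illustrative assumptions.

```python
import json

# A minimal MCP "initialize" request, following the JSON-RPC 2.0 framing
# that the MCP specification uses. An MCP client sends a message of this
# shape to a server such as Orla when opening a session; the transport
# (stdio vs. HTTP) varies by deployment.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        # Client name/version are placeholders for illustration.
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

payload = json.dumps(initialize_request)
print(payload)
```

A router like Orla receives messages of this shape and dispatches them to the appropriate downstream MCP server.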

Orla is a high-performance agent execution engine that can run as a service or as a standalone tool. It sits above LLM backends including SGLang, Ollama, and vLLM, and provides a simple, unified API for developing and running agents. With Orla, you can orchestrate agentic workflows across multiple models, LLM backends, GPUs (or CPUs), and cloud instances.
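The backends listed above (SGLang, Ollama, vLLM) each expose an OpenAI-compatible chat-completions endpoint, which is what makes routing across them practical. As a sketch of the kind of request such a router forwards — the base URLs reflect each backend's common default port, the model name is a placeholder, and none of this is Orla's own API — a client-side request might be built like this:

```python
import json

# Default OpenAI-compatible base URLs commonly served by each backend
# (assumed defaults for illustration; real deployments may differ).
BACKENDS = {
    "ollama": "http://localhost:11434/v1",
    "vllm": "http://localhost:8000/v1",
    "sglang": "http://localhost:30000/v1",
}

def chat_request(backend: str, model: str, prompt: str) -> tuple[str, dict]:
    """Build the URL and JSON body for an OpenAI-style chat completion
    against the chosen backend. Swapping backends only changes the URL;
    the request body stays the same, which is the point of a unified API."""
    url = f"{BACKENDS[backend]}/chat/completions"
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, body

url, body = chat_request("ollama", "llama3", "Hello!")
print(url)
print(json.dumps(body))
```

Because all three backends accept the same request shape, an orchestration layer can move a workload between them by changing only the destination URL.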

Install with Homebrew:

brew install --cask dorcha-inc/orla/orla

Documentation

For the complete documentation, visit https://orlaserver.github.io.