Agent Orcha Studio Dashboard

Orchestrate AI Agents with Declarative YAML

Build multi-agent systems in minutes. Define everything in version-controlled YAML. No spaghetti code. Use AI to build your workflows.

$ npx agent-orcha init my-project
✓ Project initialized with example agents and workflows
$ cd my-project
$ npx agent-orcha start
Server running on http://localhost:3000

Get Started

1. Initialize

One command to scaffold your project with example agents, workflows, and configurations. Start with working examples.

npx agent-orcha init my-project
2. Configure with AI

Use your favorite AI coding client (Claude Code, Cursor, VS Code) to edit and generate YAML configs. Point your AI assistant at llm.md as the reference.

agents/
  researcher.agent.yaml
workflows/
  research-paper.workflow.yaml
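
A workflow file follows the same declarative pattern as the agent configs. The sketch below is illustrative only; llm.md is the authoritative reference, and the field names here (`steps`, `agent`, `input`) are assumptions, not the documented schema:

```yaml
# research-paper.workflow.yaml — illustrative sketch only;
# the step/field names below are assumptions, not the documented schema
name: research-paper
description: Research a topic, then draft a summary
version: "1.0.0"

steps:
  - name: research          # hypothetical step definition
    agent: researcher       # references agents/researcher.agent.yaml
    input:
      topic: "{{ input.topic }}"
  - name: summarize
    agent: writer           # assumes a second agent is defined
    input:
      context: "{{ steps.research.output }}"
```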
3. Start

Launch the API server with streaming support, logging, and CORS. Access via REST API or web UI.

npx agent-orcha start
# Server ready at :3000
4. Test in Studio

Exercise everything from the built-in web dashboard: chat with agents, run workflows, and browse knowledge stores.

http://localhost:3000
5. Deploy with Docker

Deploy to any host using the ddalcu/agent-orcha Docker image. Mount your project directory and expose port 3000.

docker run -p 3000:3000 \
  -v "$(pwd)/my-project:/data" \
  ddalcu/agent-orcha start
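
For repeatable deployments, the same run command can be expressed as a Compose file. This is a sketch derived from the command above; only the image name, port, and volume mount come from the docs, and the `restart` policy is an optional addition:

```yaml
# docker-compose.yml — equivalent to the docker run command above
services:
  agent-orcha:
    image: ddalcu/agent-orcha
    command: start
    ports:
      - "3000:3000"
    volumes:
      - ./my-project:/data      # your agents, workflows, and configs
    restart: unless-stopped     # optional: restart on failure
```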

Why Agent Orcha?

Declarative AI

Define agents, workflows, and infrastructure in clear, version-controlled YAML files. Track changes in Git, collaborate with your team.

Model Agnostic

Seamlessly swap between OpenAI, Gemini, Anthropic, or local LLMs (Ollama, LM Studio) without rewriting logic. Zero vendor lock-in.
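
The agent example further down references an LLM config by name (`llm: name: default`), which suggests providers are defined once and swapped by editing a single entry. A hypothetical sketch; the field names are assumptions, not the documented schema:

```yaml
# LLM provider config — illustrative only; field names are assumed
- name: default
  provider: openai            # swap to anthropic, gemini, or ollama
  model: gpt-4o
- name: local
  provider: ollama            # local model, no API key required
  model: llama3
  baseUrl: http://localhost:11434
```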

Universal Tooling

Leverage the Model Context Protocol (MCP) to connect agents to any external service, API, or database instantly. Extensible by design.
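
MCP servers are typically declared as a command to launch plus arguments. A hypothetical config sketch assuming the public `mcp-server-fetch` reference server; the exact schema and section name live in llm.md:

```yaml
# MCP server config — illustrative sketch; field names are assumed
- name: fetch                    # agents reference this as mcp:fetch
  command: uvx
  args: ["mcp-server-fetch"]     # public MCP reference server
```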

Knowledge Stores

Built-in vector stores (Chroma, Pinecone, Qdrant) and GraphRAG knowledge graphs make semantic search and entity analysis first-class citizens.
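
A knowledge store config might look like the following. This is a sketch only; the `type` values come from the supported backends listed above, while the remaining field names are assumptions:

```yaml
# knowledge store config — illustrative; field names are assumed
name: docs                    # agents reference this as knowledge:docs
type: chroma                  # or pinecone, qdrant
embedding:
  llm: default                # embedding model reference (assumed)
source:
  directory: ./documents      # files to ingest (assumed)
```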

Forever Open Source

No pricing, no commercial licenses, no vendor lock-in. MIT licensed and always will be. Built by developers, for developers, with complete transparency.

AI-First Design

Built so AI can build AI agents. YAML is perfect for LLMs to read and write. Use Claude, ChatGPT, or any AI to generate, modify, and maintain your agent configurations effortlessly.

Agent Orcha Studio

Agent Chat

Built-in IDE

Knowledge Stores

MCP Servers

Shell / Terminal

Workflows

Declarative Configuration Meets Powerful APIs

Define Your Agent

Create agents with simple YAML configuration. Reference LLM configs, define tools, set prompts.

YAML
name: researcher
description: Researches topics using web and vectors
version: "1.0.0"

llm:
  name: default
  temperature: 0.5

prompt:
  system: |
    You are a thorough researcher.
    Use available tools to gather information
    before responding.
  inputVariables:
    - topic
    - context

tools:
  - mcp:fetch
  - knowledge:docs
  - function:custom-tool

output:
  format: text

Invoke with REST API

Simple HTTP endpoints for agent invocation, workflow execution, and vector search.

Bash
curl -X POST \
  http://localhost:3000/api/agents/researcher/invoke \
  -H "Content-Type: application/json" \
  -d '{
    "input": {
      "topic": "machine learning trends",
      "context": "2024 overview"
    }
  }'
JSON Response
{
  "output": "Comprehensive research results...",
  "metadata": {
    "tokensUsed": 1523,
    "toolCalls": ["fetch", "vector_search"],
    "duration": 2341
  }
}

Complete AI Orchestration Stack

YAML Configs

Agents
Workflows
Knowledge Stores
Functions

Orchestrator

Agent Executor
Workflow Engine
LangGraph Runtime
LLM Factory

LLM Providers

OpenAI
Anthropic
Gemini
Local (Ollama)

Integrations

MCP Servers
Knowledge Stores
Custom Functions
Studio IDE

Full OpenAPI Specification

Agent Orcha ships with a complete OpenAPI 3.0 specification. Import it into Swagger UI, Postman, or any API client for interactive exploration.

JSON
{
  "openapi": "3.0.0",
  "info": {
    "title": "Agent Orcha API",
    "version": "0.0.3",
    "description": "API for orchestrating agents, workflows, knowledge stores, functions, MCP servers, and LLMs."
  },
  "servers": [
    {
      "url": "http://localhost:3000",
      "description": "Local Development Server"
    }
  ]
}

Start Building AI Agents Today

Join developers building the next generation of AI systems with declarative orchestration.