Agent Orcha Studio Dashboard

Orchestrate AI Agents with Declarative YAML

Build multi-agent systems in minutes. Define everything in version-controlled YAML instead of spaghetti code, and use AI to generate your workflows.

$ docker run -p 3000:3000 -e AUTH_PASSWORD=mypass -v ./my-project:/data ddalcu/agent-orcha
✓ Project initialized with example agents and workflows
Server running on http://localhost:3000

Get Started

1. Run

One command to get started. If the mounted directory is empty, it is automatically scaffolded with example agents, workflows, and configurations.

docker run -p 3000:3000 \
  -e AUTH_PASSWORD=mypass \
  -v ./my-project:/data \
  ddalcu/agent-orcha

2. Configure with AI

Use your favorite AI coding client (Claude Code, Cursor, VS Code) to edit and generate YAML configs. Point your AI assistant to llm.md as the reference.

agents/
  researcher.agent.yaml
workflows/
  research-paper.workflow.yaml
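
A workflow file might look something like the sketch below. The field names here (steps, id, agent, input) are purely illustrative assumptions, not the real schema; consult the generated examples and llm.md for the actual format.

```yaml
# Hypothetical sketch only - these field names are assumptions, not the real schema.
name: research-paper
description: Research a topic, then draft a summary

steps:
  - id: research
    agent: researcher          # the agent defined in agents/researcher.agent.yaml
    input:
      topic: "{{ input.topic }}"
  - id: draft
    agent: writer              # hypothetical second agent
    input:
      notes: "{{ steps.research.output }}"
```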

3. Test in Studio

Exercise your configuration in the built-in web dashboard: chat with agents, run workflows, and browse knowledge stores.

http://localhost:3000

4. Use Docker Compose

For persistent setups, use a docker-compose.yaml to manage your container, environment variables, and volume mounts.

AUTH_PASSWORD=mypass docker compose up
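
A minimal docker-compose.yaml matching the quick-start command might look like this. The service name is arbitrary; the image, port, volume, and AUTH_PASSWORD variable come from the command above.

```yaml
services:
  agent-orcha:                        # service name is arbitrary
    image: ddalcu/agent-orcha
    ports:
      - "3000:3000"
    environment:
      - AUTH_PASSWORD=${AUTH_PASSWORD}  # supplied via: AUTH_PASSWORD=mypass docker compose up
    volumes:
      - ./my-project:/data              # project files persist on the host
    restart: unless-stopped
```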

Why Agent Orcha?

Declarative AI

Define agents, workflows, and infrastructure in clear, version-controlled YAML files. Track changes in Git and collaborate with your team.

Model Agnostic

Seamlessly swap between OpenAI, Gemini, Anthropic, or local LLMs (Ollama, LM Studio) without rewriting logic. Zero vendor lock-in.

Universal Tooling

Leverage the Model Context Protocol (MCP) to connect agents to any external service, API, or database instantly. Extensible by design.

Knowledge Stores

Built-in SQLite vector stores with optional direct graph mapping make semantic search and entity analysis first-class citizens. No external databases required.

Forever Open Source

No pricing, no commercial licenses, no vendor lock-in. MIT licensed and always will be. Built by developers, for developers, with complete transparency.

AI-First Design

Built so AI can build AI agents. YAML is perfect for LLMs to read and write. Use Claude, ChatGPT, or any AI to generate, modify, and maintain your agent configurations effortlessly.

Agent Orcha Studio

Agent Chat

Visual Agent Composer

Knowledge Vector Search

Knowledge Graph

Declarative Configuration Meets Powerful APIs

Define Your Agent

Create agents with simple YAML configuration. Reference LLM configs, define tools, set prompts.

YAML
name: researcher
description: Researches topics using web and vectors
version: "1.0.0"

llm:
  name: default
  temperature: 0.5

prompt:
  system: |
    You are a thorough researcher.
    Use available tools to gather information
    before responding.
  inputVariables:
    - topic
    - context

tools:
  - mcp:fetch
  - knowledge:docs
  - function:custom-tool

output:
  format: text

Invoke with REST API

Simple HTTP endpoints for agent invocation, workflow execution, and vector search.

Bash
curl -X POST \
  http://localhost:3000/api/agents/researcher/invoke \
  -H "Content-Type: application/json" \
  -d '{
    "input": {
      "topic": "machine learning trends",
      "context": "2024 overview"
    }
  }'
JSON Response
{
  "output": "Comprehensive research results...",
  "metadata": {
    "tokensUsed": 1523,
    "toolCalls": ["fetch", "vector_search"],
    "duration": 2341
  }
}
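
The same invocation can be scripted. This sketch only assembles the request with the Python standard library; the endpoint path and payload shape are taken from the curl example above, and the helper name is an illustrative assumption.

```python
import json

BASE_URL = "http://localhost:3000"  # local Studio server from the quick start


def build_invoke_request(agent: str, input_vars: dict) -> tuple[str, bytes, dict]:
    """Build the URL, JSON body, and headers for an agent invoke call.

    Hypothetical helper for illustration; not part of Agent Orcha itself.
    """
    url = f"{BASE_URL}/api/agents/{agent}/invoke"
    body = json.dumps({"input": input_vars}).encode("utf-8")
    headers = {"Content-Type": "application/json"}
    return url, body, headers


url, body, headers = build_invoke_request(
    "researcher",
    {"topic": "machine learning trends", "context": "2024 overview"},
)

# Send with any HTTP client, e.g. the standard library:
# import urllib.request
# req = urllib.request.Request(url, data=body, headers=headers, method="POST")
# with urllib.request.urlopen(req) as resp:
#     result = json.loads(resp.read())
```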

Complete AI Orchestration Stack

YAML Configs

Agents
Workflows
Knowledge Stores
Functions

Orchestrator

Agent Executor
Workflow Engine
ReAct Runtime
LLM Factory

LLM Providers

OpenAI
Anthropic
Gemini
Local (Ollama)

Integrations

MCP Servers
Knowledge Stores
Custom Functions
Studio IDE

Full OpenAPI Specification

Agent Orcha ships with a complete OpenAPI 3.0 specification. Import it into Swagger UI, Postman, or any API client for interactive exploration.

JSON
{
  "openapi": "3.0.0",
  "info": {
    "title": "Agent Orcha API",
    "version": "0.0.5",
    "description": "API for orchestrating agents, workflows, knowledge stores, functions, MCP servers, and LLMs."
  },
  "servers": [
    {
      "url": "http://localhost:3000",
      "description": "Local Development Server"
    }
  ]
}
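
Because the spec is ordinary JSON, clients can also consume it programmatically. This sketch just parses the excerpt above and reads out the base server URL; nothing beyond the shown fragment is assumed.

```python
import json

# Excerpt of the shipped OpenAPI document (the full spec also contains paths, etc.).
spec_text = """
{
  "openapi": "3.0.0",
  "info": {"title": "Agent Orcha API", "version": "0.0.5"},
  "servers": [
    {"url": "http://localhost:3000", "description": "Local Development Server"}
  ]
}
"""

spec = json.loads(spec_text)
base_url = spec["servers"][0]["url"]  # where to direct API calls
print(spec["info"]["title"], spec["info"]["version"], base_url)
```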

Start Building AI Agents Today

Join developers building the next generation of AI systems with declarative orchestration.