Build production-ready multi-agent systems in minutes. Define everything in version-controlled YAML. No spaghetti code. Use AI to build your workflows.
One command to scaffold your project with example agents, workflows, and configurations. Start with working examples.
npx agent-orcha init my-project
Edit YAML files to define agents, workflows, and vector stores. Version-controlled and human-readable. No code required.
agents/
  researcher.agent.yaml
workflows/
  research-paper.workflow.yaml
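A workflow file chains agents into a multi-step pipeline. The fragment below is a hypothetical sketch of what such a file might contain — the field names (steps, agent, input) are illustrative assumptions, not the confirmed schema; the scaffolded examples from `init` show the real one:

```yaml
# research-paper.workflow.yaml — hypothetical sketch; field names
# are illustrative assumptions, not the confirmed schema.
name: research-paper
description: Research a topic, then draft a summary
steps:
  - agent: researcher
    input:
      topic: "{{ topic }}"
```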
Launch the production-ready API server with streaming support, logging, and CORS. Access via REST API or web UI.
npx agent-orcha start
# Server ready at :3000
Define agents, workflows, and infrastructure in clear, version-controlled YAML files. Track changes in Git, collaborate with your team.
Seamlessly swap between OpenAI, Gemini, Anthropic, or local LLMs (Ollama, LM Studio) without rewriting logic. Zero vendor lock-in.
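Because agents reference an LLM config by name (the agent example below uses `llm: name: default`), switching providers can be a one-file change. A hypothetical sketch of such a named config — the file name and fields (`provider`, `model`) are assumptions, not the confirmed schema:

```yaml
# llms/default.llm.yaml — hypothetical sketch; field names
# are illustrative assumptions, not the confirmed schema.
name: default
provider: openai   # or: anthropic, gemini, ollama
model: gpt-4o-mini
```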
Leverage the Model Context Protocol (MCP) to connect agents to any external service, API, or database instantly. Extensible by design.
Built-in vector store integration (Chroma, Memory) makes semantic search and knowledge retrieval a first-class citizen.
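Agents attach a store via the `vector:` tool prefix shown in the agent example below. A hypothetical sketch of a store definition — field names are illustrative assumptions, not the confirmed schema:

```yaml
# vectors/knowledge.vector.yaml — hypothetical sketch; field names
# are illustrative assumptions, not the confirmed schema.
name: knowledge
provider: chroma   # or: memory
```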
No pricing tiers, no commercial licenses, no vendor lock-in. MIT licensed and always will be. Built by developers, for developers, with complete transparency.
Built so AI can build AI agents. YAML is perfect for LLMs to read and write. Use Claude, ChatGPT, or any AI to generate, modify, and maintain your agent configurations effortlessly.
Create agents with simple YAML configuration. Reference LLM configs, define tools, set prompts.
name: researcher
description: Researches topics using web and vectors
version: "1.0.0"
llm:
  name: default
  temperature: 0.5
prompt:
  system: |
    You are a thorough researcher.
    Use available tools to gather information
    before responding.
  inputVariables:
    - topic
    - context
tools:
  - mcp:fetch
  - vector:knowledge
  - function:custom-tool
output:
  format: text
Simple HTTP endpoints for agent invocation, workflow execution, and vector search.
curl -X POST \
  http://localhost:3000/api/agents/researcher/invoke \
  -H "Content-Type: application/json" \
  -d '{
    "input": {
      "topic": "machine learning trends",
      "context": "2024 overview"
    }
  }'
{
  "output": "Comprehensive research results...",
  "metadata": {
    "tokensUsed": 1523,
    "toolCalls": ["fetch", "vector_search"],
    "duration": 2341
  }
}
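A response like the one above is plain JSON, so any client can consume it. A minimal TypeScript sketch — the `InvokeResponse` typing and `summarize` helper are illustrative, not part of agent-orcha itself, and the duration unit is assumed to be milliseconds:

```typescript
// Shape of the invoke response shown above; only the fields
// demonstrated in the example are assumed to exist.
interface InvokeResponse {
  output: string;
  metadata: {
    tokensUsed: number;
    toolCalls: string[];
    duration: number; // assumed to be milliseconds
  };
}

// Parse a raw response body and summarize its metadata.
function summarize(body: string): string {
  const res = JSON.parse(body) as InvokeResponse;
  return `${res.metadata.tokensUsed} tokens, ` +
    `${res.metadata.toolCalls.length} tool call(s), ` +
    `${res.metadata.duration} ms`;
}

// The sample body from the response example above.
const sample = `{
  "output": "Comprehensive research results...",
  "metadata": {
    "tokensUsed": 1523,
    "toolCalls": ["fetch", "vector_search"],
    "duration": 2341
  }
}`;

console.log(summarize(sample)); // "1523 tokens, 2 tool call(s), 2341 ms"
```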
Declarative configs: Agents · Workflows · Vector Stores · Functions
Core engine: Agent Executor · Workflow Engine · Tool Registry · LLM Factory
LLM providers: OpenAI · Anthropic · Gemini · Local (Ollama)
Integrations: MCP Servers · Vector Stores · Custom Functions · REST API
Join developers building the next generation of AI systems with declarative orchestration.