A local multi-agent coordination hub. Power specialized droids with local LLMs via Ollama, monitor their activity with Chainlit and Arize Phoenix, and rely on human-in-the-loop safeguards to ensure no agent modifies your files without clearance.
- Supervisor: Routes queries to the Researcher (general info) or the Coder (Python/filesystem)
- Researcher: Answers factual questions; uses no tools
- Coder: Handles code and files with `read_file`, `write_file`, and `list_directory`; file writes require approval
- Persistence: SQLite (`state.db`) via LangGraph `SqliteSaver`
- Observability: Phoenix at http://localhost:6006 for tracing
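The routing contract above can be sketched in plain Python. Note this is only an illustration: the real `supervisor.py` asks the Ollama-backed LLM to choose a route, whereas this keyword fallback (and the `CODER_HINTS` name) is hypothetical.

```python
# Hypothetical sketch of the Supervisor's two-way routing decision.
# The actual supervisor prompts an LLM; this keyword heuristic only
# illustrates the contract: every query lands on exactly one agent.
CODER_HINTS = ("code", "python", "file", "write", "directory")

def route(query: str) -> str:
    """Return the name of the agent a query should be dispatched to."""
    lowered = query.lower()
    if any(hint in lowered for hint in CODER_HINTS):
        return "coder"
    return "researcher"

print(route("What is the capital of Naboo?"))            # researcher
print(route("Write a Python script to list a directory"))  # coder
```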
- Python 3.11+
- Ollama running at http://localhost:11434
- Model: `llama3.1:8b` (or configure in `config.yaml`)
```bash
# 1. Create virtual environment
python -m venv .venv
.venv\Scripts\activate        # Windows
# source .venv/bin/activate   # macOS/Linux

# 2. Install dependencies
pip install -r requirements.txt

# 3. Ensure Ollama is running and has a model
ollama pull llama3.1:8b

# 4. (Optional) Start Phoenix for observability
phoenix serve
# Then open http://localhost:6006

# 5. Run the hub
chainlit run app.py --port 8080
```

Open http://localhost:8080 in your browser.
- Ask a question – The Supervisor routes to Researcher or Coder.
- File operations – If the Coder proposes a file write, you'll see Approve, Modify, or Reject buttons before it runs.
- Sessions – Conversations persist in `state.db` across restarts.
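The approval step can be pictured as a gate around the write tool. This is a minimal sketch with illustrative names (`guarded_write`, `ask_user`); in the real app, Chainlit renders the Approve/Modify/Reject buttons and the Coder's tool call waits on your click.

```python
# Minimal sketch of the human-in-the-loop gate around file writes.
# `ask_user` stands in for the Chainlit button UI: it returns
# "approve", "reject", or a modified content string (the Modify path).
from pathlib import Path
from typing import Callable

def guarded_write(path: str, content: str,
                  ask_user: Callable[[str], str]) -> bool:
    """Write `content` to `path` only if the human reviewer allows it."""
    decision = ask_user(f"Coder wants to write {len(content)} chars to {path}")
    if decision == "reject":
        return False
    if decision != "approve":      # anything else is treated as modified content
        content = decision
    Path(path).write_text(content)
    return True
```

The key property is that the agent never touches the filesystem directly; every write flows through the gate, so a rejected proposal leaves your files untouched.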
Edit `config.yaml` to:
- Change Ollama base URL and default model
- Customize agent system prompts
- Set SQLite path and Phoenix endpoint
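A configuration covering those three areas might look like the fragment below. The key names are assumptions for illustration, not the project's actual schema; check the shipped `config.yaml` for the real layout.

```yaml
# Illustrative config.yaml shape — key names are assumptions.
ollama:
  base_url: http://localhost:11434
  model: llama3.1:8b
agents:
  researcher:
    system_prompt: "You answer factual questions concisely."
  coder:
    system_prompt: "You write Python and manage files, pending approval."
persistence:
  sqlite_path: state.db
observability:
  phoenix_endpoint: http://localhost:6006
```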
```
kyber-command/
├── agents/
│   ├── state.py          # Pydantic/TypedDict state
│   ├── supervisor.py     # Router agent
│   ├── researcher.py     # General info agent
│   └── coder.py          # Code/files agent (HITL tools)
├── config.yaml           # Dynamic configuration
├── graph_engine.py       # LangGraph definition + SqliteSaver
├── app.py                # Chainlit entry (chat + HITL UI)
├── observability.py      # Phoenix instrumentation
├── requirements.txt
└── README.md
```
With Phoenix running (`phoenix serve`), all agent "thought chains" are traced at http://localhost:6006. The app uses OpenInference instrumentation for LangChain/LangGraph.