`examples/README.md` (+93 / −9)

See: `task_context_example.py`, `worker_example.py`

---

### AI/LLM Workflows

See [agentic_workflows/](agentic_workflows/) for the full set of AI agent examples.

| File | Description | Run |
|------|-------------|-----|
| **agentic_workflows/llm_chat.py** | Automated multi-turn LLM chat | `python examples/agentic_workflows/llm_chat.py` |
| **agentic_workflows/llm_chat_human_in_loop.py** | Interactive chat with WAIT task pauses | `python examples/agentic_workflows/llm_chat_human_in_loop.py` |
| **agentic_workflows/multiagent_chat.py** | Multi-agent debate with moderator routing | `python examples/agentic_workflows/multiagent_chat.py` |
| **agentic_workflows/function_calling_example.py** | LLM picks Python functions to call | `python examples/agentic_workflows/function_calling_example.py` |
| **agentic_workflows/mcp_weather_agent.py** | AI agent with MCP tool calling | `python examples/agentic_workflows/mcp_weather_agent.py "What's the weather?"` |
| **rag_workflow.py** | RAG pipeline: markitdown, pgvector, search, answer | `python examples/rag_workflow.py file.pdf "question"` |
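The function-calling example in the table reduces to a dispatch-table pattern: the LLM returns a structured decision (function name plus arguments), and host code looks up and invokes the matching Python callable. A minimal illustrative sketch — the function names and the hard-coded "LLM decision" below are hypothetical, not taken from `function_calling_example.py`:

```python
# Minimal sketch of the LLM function-calling loop: the model returns a
# function name plus arguments, and the host code dispatches the call.
# Tool names and the fake "LLM decision" below are hypothetical.

def get_weather(city: str) -> str:
    return f"Sunny in {city}"

def get_time(city: str) -> str:
    return f"12:00 in {city}"

# Registry of functions the LLM is allowed to choose from
TOOLS = {"get_weather": get_weather, "get_time": get_time}

def dispatch(llm_decision: dict) -> str:
    """Execute the function the LLM selected, with its arguments."""
    fn = TOOLS[llm_decision["name"]]
    return fn(**llm_decision["arguments"])

# Pretend the LLM answered with this structured decision:
decision = {"name": "get_weather", "arguments": {"city": "Tokyo"}}
print(dispatch(decision))  # Sunny in Tokyo
```

The same shape underlies MCP tool calling: only the transport and the tool registry change.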

---

### Monitoring

| File | Description | Run |
`python examples/prompt_journey.py`

---

### RAG Pipeline Setup

Complete RAG (Retrieval-Augmented Generation) pipeline example:

```bash
# 1. Install dependencies
pip install conductor-python "markitdown[pdf]"

# 2. Configure (requires Orkes Conductor with AI/LLM support)
# - Vector DB integration named "postgres-prod" (pgvector)
# - LLM provider named "openai" with a valid API key
export CONDUCTOR_SERVER_URL="http://localhost:7001/api"

# 3. Run RAG workflow
python examples/rag_workflow.py examples/goog-20251231.pdf "What were Google's total revenues?"
```

**Pipeline:** `convert_to_markdown` → `LLM_INDEX_TEXT` → `WAIT` → `LLM_SEARCH_INDEX` → `LLM_CHAT_COMPLETE`

**Features:**
- Document conversion (PDF, Word, Excel → Markdown via [markitdown](https://github.com/microsoft/markitdown))
- Vector database ingestion into pgvector with OpenAI `text-embedding-3-small` embeddings
- Semantic search with configurable result count
- Context-aware answer generation with `gpt-4o-mini`
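Conceptually, the `LLM_SEARCH_INDEX` step ranks stored document chunks by embedding similarity to the question. A toy illustration of that ranking, using bag-of-words vectors as a stand-in for the real `text-embedding-3-small` embeddings stored in pgvector (the chunks below are made up):

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": bag-of-words counts (stand-in for a real
    # text-embedding-3-small vector)
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

chunks = [
    "total revenues grew to 300 billion dollars",
    "the board approved a new buyback program",
]
query = "what were total revenues"

# Rank chunks by similarity, highest first -- the role of LLM_SEARCH_INDEX;
# the top chunks become context for the LLM_CHAT_COMPLETE answer step.
best = max(chunks, key=lambda c: cosine(embed(query), embed(c)))
print(best)
```

In the real pipeline pgvector performs this ranking inside Postgres; the sketch only shows the idea.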

---

### MCP Tool Integration Setup

MCP (Model Context Protocol) agent example:

```bash
# 1. Install MCP weather server
pip install mcp-weather-server

# 2. Start MCP server
python3 -m mcp_weather_server \
--mode streamable-http \
--host localhost \
--port 3001 \
--stateless

# 3. Run AI agent
export OPENAI_API_KEY="your-key"
export ANTHROPIC_API_KEY="your-key"
python examples/agentic_workflows/mcp_weather_agent.py "What's the weather in Tokyo?"

# Or simple mode (direct tool call):
python examples/agentic_workflows/mcp_weather_agent.py "Temperature in New York" --simple
```

**Features:**
- MCP tool discovery
- LLM-based planning (agent decides which tool to use)
- Tool execution via HTTP/Streamable transport
- Natural language response generation
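Under the hood, an MCP tool call is a JSON-RPC 2.0 `tools/call` request sent over the streamable-HTTP transport. A sketch of the message the agent would send once planning has selected a tool — the tool name and argument key are assumptions, not taken from `mcp-weather-server`:

```python
import json

def build_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Serialize an MCP tools/call request (JSON-RPC 2.0 envelope)."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical tool name/arguments for the weather server:
msg = build_tool_call(1, "get_weather", {"location": "Tokyo"})
print(msg)
```

The agent in `mcp_weather_agent.py` discovers the actual tool names via `tools/list` at startup rather than hard-coding them.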

---

## 🎓 Learning Path (60-Second Guide)

```bash
python examples/worker_configuration_example.py
# 4. Workflows (10 min)
python examples/dynamic_workflow.py

# 5. Monitoring (5 min)
# 5. AI/LLM Workflows (15 min)
python examples/agentic_workflows/llm_chat.py
python examples/rag_workflow.py examples/goog-20251231.pdf "What were Google's total revenues?"

# 6. Monitoring (5 min)
python examples/metrics_example.py
curl http://localhost:8000/metrics
```
```
examples/
│   ├── workflow_status_listner.py       # Workflow events
│   └── test_workflows.py                # Unit tests
├── AI/LLM Workflows
│   ├── rag_workflow.py                  # RAG pipeline (markitdown + pgvector)
│   └── agentic_workflows/               # Agentic AI examples
│       ├── llm_chat.py                  # Multi-turn LLM chat
│       ├── llm_chat_human_in_loop.py    # Interactive chat with WAIT
│       ├── multiagent_chat.py           # Multi-agent debate
│       ├── function_calling_example.py  # LLM function calling
│       └── mcp_weather_agent.py         # MCP tool calling agent
├── Monitoring
│   ├── metrics_example.py               # Prometheus metrics
│   ├── event_listener_examples.py       # Custom listeners
│   └── other_workers/
└── orkes/                               # Orkes-specific features
    ├── ai_orchestration/                # AI/LLM integration
    │   ├── vector_db_helloworld.py      # Vector DB operations
    │   └── agentic_workflow.py          # AI agent (AIOrchestrator)
    └── workers/                         # Advanced patterns
        ├── http_poll.py
        ├── sync_updates.py
        └── wait_for_webhook.py
```

---