SDKs & Tooling
Chidori ships as a single Rust binary. The Python SDK is a pure-stdlib HTTP client — no pip install, no native bindings. Tools live in .star files next to your agents. This page covers what you install and how the pieces fit together.
Project layout
A typical Chidori project:
```
my-project/
├── agents/            # .star agent files
│   ├── summarizer.star
│   └── webhook.star
├── prompts/           # Jinja2 templates
│   └── research.jinja
├── tools/             # .star tool files
│   └── search.star
└── chidori.star       # project config (optional)
```

The runtime looks for agents in agents/, tools in tools/, and prompts in prompts/ relative to the project root.
The CLI
```
chidori run FILE [--input key=value]... [--trace] [--verbose]
chidori check
chidori tools [--dir DIR]...
chidori serve [--port 8080] [--verbose]
```

Input formats for run:
- `--input key=value` — string value
- `--input key=@file.txt` — read the value from a file
- `--input '{"key": "value"}'` — JSON object merged into inputs
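The three forms compose into a single inputs object. A rough Python sketch of how such merging could work (a hypothetical helper for illustration, not the actual CLI implementation):

```python
import json

def parse_inputs(raw_inputs):
    """Merge a list of --input arguments into one inputs dict (illustrative sketch)."""
    inputs = {}
    for raw in raw_inputs:
        if raw.startswith("{"):
            # JSON object form: merged into inputs
            inputs.update(json.loads(raw))
        else:
            key, _, value = raw.partition("=")
            if value.startswith("@"):
                # @file form: read the value from a file
                with open(value[1:]) as f:
                    value = f.read()
            inputs[key] = value
    return inputs
```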
Provider environment variables:
- `ANTHROPIC_API_KEY` — use Anthropic directly (`claude-*` models)
- `OPENAI_API_KEY` — use OpenAI directly (`gpt-*`, `o1-*`, `o3-*`)
- `LITELLM_API_URL` + `LITELLM_API_KEY` — use a LiteLLM proxy (or any OpenAI-compatible endpoint) as a catch-all
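One plausible way to read these variables, sketched in Python (a hypothetical helper; the runtime's actual selection logic is in Rust and its exact precedence is not documented here):

```python
import os

def pick_provider(model, env=os.environ):
    """Map a model name plus configured env vars to a provider (illustrative sketch)."""
    if model.startswith("claude-") and "ANTHROPIC_API_KEY" in env:
        return "anthropic"
    if model.startswith(("gpt-", "o1-", "o3-")) and "OPENAI_API_KEY" in env:
        return "openai"
    if "LITELLM_API_URL" in env and "LITELLM_API_KEY" in env:
        # catch-all: LiteLLM proxy or any OpenAI-compatible endpoint
        return "litellm"
    raise RuntimeError(f"no provider configured for model {model!r}")
```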
Defining tools
Tools are plain .star files that export one function. The function name becomes the tool name, the docstring becomes its LLM-facing description, and parameter defaults auto-generate a JSON schema for function-calling.
tools/search.star

```
def web_search(query, max_results = 5):
    """Search the web and return results."""
    response = http("GET", "https://api.search.example/search", params = {
        "q": query,
        "limit": max_results,
    })
    return response["results"]
```

tools/analyze.star

```
def analyze(data):
    """Run heavy analysis via external service."""
    return http("POST", "http://localhost:9000/analyze", json = {"data": data})
```

For tools that need native libraries, put them behind an external service and call it over HTTP; this keeps the tool surface pure Starlark and side effects visible in the call log.
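To make the schema auto-generation concrete: parameter defaults drive the types in the generated function-calling schema. A rough Python sketch of how such a derivation might look (hypothetical; the runtime's actual generator is part of the Rust binary):

```python
import inspect

TYPE_MAP = {str: "string", int: "integer", float: "number", bool: "boolean"}

def tool_schema(fn):
    """Derive a function-calling schema from a signature (illustrative sketch)."""
    props, required = {}, []
    for name, param in inspect.signature(fn).parameters.items():
        if param.default is inspect.Parameter.empty:
            props[name] = {"type": "string"}  # assume string when there is no default
            required.append(name)
        else:
            props[name] = {
                "type": TYPE_MAP.get(type(param.default), "string"),
                "default": param.default,
            }
    return {
        "name": fn.__name__,                  # function name becomes the tool name
        "description": inspect.getdoc(fn),    # docstring becomes the description
        "parameters": {"type": "object", "properties": props, "required": required},
    }

def web_search(query, max_results=5):
    """Search the web and return results."""
    return []
```

Applied to `web_search`, this yields `max_results` as an integer with default 5 and `query` as a required string.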
List registered tools:
```
chidori tools --dir tools/
```

The Python SDK
The SDK is a pure-stdlib HTTP client. Copy the sdk/python/chidori/ directory into your project — no install required, no native bindings.
```python
import sys
sys.path.insert(0, "sdk/python")

from chidori import AgentClient, Checkpoint

client = AgentClient("http://localhost:8080")

# Run an agent — live LLM calls
session = client.run({"document": "Rust is great."})
print(session.output)  # {"summary": "...", "action_items": "..."}
print(session.status)  # "completed"

# Save and replay
checkpoint = session.checkpoint()
checkpoint.save("session.json")

cp = Checkpoint.load("session.json")
replayed = client.replay(cp)
assert replayed.output == session.output

# Iterate over many documents
for doc in documents:
    s = client.run({"document": doc})
    s.checkpoint().save(f"checkpoints/{s.id}.json")

# List sessions on the server
for s in client.list_sessions():
    print(s["id"], s["status"])
```

The Rust runtime
If you need to embed Chidori in a larger Rust service, the runtime is a library crate. The chidori CLI binary is a thin wrapper around it. See the Rust API docs for the RuntimeContext and HostFunction traits if you want to add new host functions or swap the LLM provider layer.
Output format
Agent output is always JSON on stdout:
- Dicts → JSON objects
- Lists → JSON arrays
- Strings → JSON strings
- Numbers → JSON numbers
- Booleans → JSON booleans
- None → JSON null
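This is the same value-to-JSON mapping Python's stdlib json module applies, which makes the output easy to consume from the SDK side. A quick illustration (plain Python, not Chidori code):

```python
import json

# Dict, list, string, number, boolean, and None map to their JSON counterparts
output = {
    "summary": "Rust is great.",
    "tags": ["rust", "tooling"],
    "score": 0.9,
    "done": True,
    "error": None,
}
print(json.dumps(output))
# {"summary": "Rust is great.", "tags": ["rust", "tooling"], "score": 0.9, "done": true, "error": null}
```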
Traces and diagnostics go to stderr, so you can pipe stdout into jq cleanly:
```
chidori run agents/summarizer.star --input document=@notes.txt | jq '.summary'
```