Integrations
The cleanest way to integrate Chidori with external systems is as an event-driven agent: one file, one agent(event) function, one HTTP port. Every external system that can POST JSON becomes a source of events.
The pattern
```shell
chidori serve agents/integrations.star --port 8080
```

```python
def agent(event):
    path = event["path"]
    if path == "/github":
        return handle_github(event)
    if path == "/slack":
        return handle_slack(event)
    if path == "/alert":
        return handle_alert(event)
    return {"status": 404, "body": {"error": "Unknown path: " + path}}
```

Then point each external system at the matching URL — one agent, many integrations.
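Each request arrives in agent() as a single event dict. A sketch of its shape for a GitHub delivery, inferred from the fields this page reads (path, headers, body); the payload values themselves are illustrative:

```python
# Hypothetical event for a GitHub "pull_request opened" delivery.
# Only the path/headers/body structure is taken from this page; values are made up.
event = {
    "path": "/github",
    "headers": {"x-github-event": "pull_request"},
    "body": {
        "action": "opened",
        "pull_request": {"number": 42, "title": "Fix race in worker pool"},
    },
}
```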
GitHub webhooks
GitHub sends a JSON body and an x-github-event header identifying the event type.
```python
def dict_get(d, key, default = None):
    if key in d:
        return d[key]
    return default
```
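For instance, the helper reads a header with a safe fallback instead of erroring on a missing key (the helper is restated so this snippet stands alone):

```python
def dict_get(d, key, default = None):
    # Same helper as above, repeated so this snippet is self-contained.
    if key in d:
        return d[key]
    return default

headers = {"x-github-event": "issues"}
event_type = dict_get(headers, "x-github-event", "")   # "issues"
missing = dict_get(headers, "x-hub-signature-256", "") # "" instead of an error
```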
```python
def handle_github(event):
    gh_event = dict_get(event["headers"], "x-github-event", "")
    body = event["body"]
    if gh_event == "pull_request" and dict_get(body, "action") == "opened":
        return review_pr(body["pull_request"])
    if gh_event == "issues" and dict_get(body, "action") == "opened":
        return triage_issue(body["issue"])
    return {"status": 200, "body": {"ignored": gh_event}}
```
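Production webhooks should also be authenticated. GitHub signs each delivery with an X-Hub-Signature-256 header: "sha256=" plus an HMAC-SHA256 of the raw request body under your webhook secret. Whether that check can run inside the agent depends on what hashing builtins Chidori exposes, which this page doesn't cover; the sketch below is plain Python you could run at a proxy or in the host process:

```python
import hashlib
import hmac

def verify_github_signature(secret: bytes, raw_body: bytes, signature_header: str) -> bool:
    # GitHub sends X-Hub-Signature-256: "sha256=" + HMAC-SHA256(secret, raw_body).
    expected = "sha256=" + hmac.new(secret, raw_body, hashlib.sha256).hexdigest()
    # compare_digest avoids leaking information through comparison timing.
    return hmac.compare_digest(expected, signature_header)
```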
```python
def review_pr(pr):
    # Fetch the raw diff for the pull request.
    diff = http("GET", pr["diff_url"], headers = {"Accept": "application/vnd.github.v3.diff"})
    review = prompt(
        "Review this pull request. Flag correctness, security, and style issues.\n\n" +
        "Title: " + pr["title"] + "\n\nDiff:\n" + diff["body"],
        max_tokens = 1000,
    )
    # Post the review as a comment on the PR.
    http("POST", pr["comments_url"],
        headers = {"Authorization": "Bearer " + env("GITHUB_TOKEN")},
        json = {"body": review},
    )
    return {"status": 200, "body": {"reviewed": pr["number"]}}
```
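Large PRs can blow past the model's context window. A hedged sketch of a guard you could apply to the diff before building the prompt; truncate_diff and the 48,000-character budget are illustrative choices, not anything Chidori provides:

```python
def truncate_diff(diff: str, max_chars: int = 48_000) -> str:
    # Keep the head of the diff and mark the elision so the model knows it is partial.
    if len(diff) <= max_chars:
        return diff
    return diff[:max_chars] + "\n\n[diff truncated at " + str(max_chars) + " characters]"
```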
```python
def triage_issue(issue):
    classification = prompt(
        "Classify this issue as 'bug', 'feature', 'question', or 'duplicate'. " +
        "Respond with just the word.\n\nTitle: " + issue["title"] +
        "\n\nBody:\n" + (issue["body"] or ""),
        max_tokens = 20,
    )
    # Normalize once so the applied label and the returned label match.
    label = classification.strip().lower()
    http("POST", issue["url"] + "/labels",
        headers = {"Authorization": "Bearer " + env("GITHUB_TOKEN")},
        json = {"labels": [label]},
    )
    return {"status": 200, "body": {"triaged": issue["number"], "label": label}}
```

Register the webhook in GitHub under Settings → Webhooks, pointing at https://your-host/github.
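The triage step trusts the model to answer with exactly one of four words, but models sometimes wrap the answer in punctuation or a sentence. A small allow-list keeps bad output from creating stray labels in the repo; a sketch in plain Python, where the needs-triage fallback label is a hypothetical choice:

```python
ALLOWED_LABELS = ["bug", "feature", "question", "duplicate"]

def normalize_label(raw, fallback = "needs-triage"):
    # Trim and lowercase the model output; fall back when it is not an allowed label.
    label = raw.strip().lower()
    if label in ALLOWED_LABELS:
        return label
    return fallback
```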
Slack slash commands
Slack POSTs a form-encoded body; Chidori parses it into event["body"] as a dict:
```python
def handle_slack(event):
    body = event["body"]
    command = dict_get(body, "command", "")
    text = dict_get(body, "text", "")
    if command == "/summarize":
        return summarize_thread(dict_get(body, "channel_id"), dict_get(body, "thread_ts"))
    if command == "/explain":
        return {"status": 200, "body": {"text": prompt("Explain like I'm 5:\n" + text)}}
    return {"status": 200, "body": {"text": "Unknown command: " + command}}
```

Alert responders
Chidori is a natural fit for on-call copilots — an alert comes in, the agent fetches related metrics, proposes a diagnosis, and pages a human only if it can't self-resolve.
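The handler below renders its prompt from prompts/alert.jinja via template(). That file's contents aren't shown on this page; a hedged sketch of what it might contain, using the four variables the handler passes in (alert, metrics, deploys, related), with illustrative wording and an assumed alert.message field:

```jinja
You are an on-call assistant. An alert just fired.

Severity: {{ alert.severity }}
Service:  {{ alert.service }}
Message:  {{ alert.message }}

Error rate (last 5m): {{ metrics }}
Deploys in the last 2 hours: {{ deploys }}
Related alerts in the last 6 hours: {{ related }}

Propose a likely root cause and one concrete first remediation step.
```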
```python
def handle_alert(event):
    alert = event["body"]
    severity = dict_get(alert, "severity", "unknown")
    service = dict_get(alert, "service", "unknown")
    # Pull related metrics and recent deploys in parallel
    context = parallel([
        lambda: tool("prometheus_query", query = "rate(errors{service=\"" + service + "\"}[5m])"),
        lambda: tool("recent_deploys", service = service, hours = 2),
        lambda: tool("recent_alerts", service = service, hours = 6),
    ])
    diagnosis = prompt(
        template("prompts/alert.jinja",
            alert = alert,
            metrics = context[0],
            deploys = context[1],
            related = context[2],
        ),
        max_tokens = 500,
    )
    if severity == "critical":
        tool("page_oncall", service = service, diagnosis = diagnosis)
    return {"status": 200, "body": {"diagnosis": diagnosis, "severity": severity}}
```

Secrets
Never inline secrets. Read them with env() and pass them through to http():

```python
http("POST", "https://api.example.com/...",
    headers = {"Authorization": "Bearer " + env("API_TOKEN")},
    json = payload,
)
```

Debugging failed integrations
Because every integration run is a session under chidori serve, any production failure is a replayable checkpoint. Grab the checkpoint, run it locally, edit the offending call's cached result in the debugger to branch the history, and iterate — zero additional LLM spend.