# Go Library Guide
Use oneagent as a Go library to run AI agents directly from your application, without the oa CLI.
## Install

```shell
go get github.com/1broseidon/oneagent@latest
```

## Loading Backends
Load the embedded defaults plus any user overrides from ~/.config/oneagent/backends.json:
```go
backends, err := oneagent.LoadBackends("")
if err != nil {
	log.Fatal(err)
}
client := oneagent.Client{Backends: backends}
```

Or load from an explicit config file instead:
```go
backends, err := oneagent.LoadBackends("/path/to/backends.json")
if err != nil {
	log.Fatal(err)
}
```

Or merge the embedded defaults with an app-owned override path:
```go
backends, err := oneagent.LoadBackendsWithOptions(oneagent.LoadOptions{
	IncludeEmbedded: true,
	OverridePath:    "/path/to/app/backends.json",
})
if err != nil {
	log.Fatal(err)
}
```

## Running a Prompt
For a single final response:
```go
resp := client.Run(oneagent.RunOpts{
	Backend: "claude",
	Prompt:  "explain this codebase",
	CWD:     "/path/to/project",
})
if resp.Error != "" {
	log.Fatal(resp.Error)
}
fmt.Println(resp.Result)
fmt.Println(resp.Session)
```

Every backend returns the same `Response` shape:
```go
type Response struct {
	Result   string // the agent's final answer
	Session  string // native session ID, for resuming later
	ThreadID string // portable thread ID, if using threads
	Backend  string // which backend produced this response
	Error    string // non-empty on failure
}
```

## Streaming
Use `RunStream` to receive incremental updates as the agent works:
```go
resp := client.RunStream(oneagent.RunOpts{
	Backend: "claude",
	Prompt:  "review the repo",
	CWD:     "/path/to/project",
}, func(ev oneagent.StreamEvent) {
	switch ev.Type {
	case "start":
		fmt.Println("run:", ev.RunID, "started at", ev.TS)
	case "session":
		fmt.Println("session:", ev.Session)
	case "activity":
		fmt.Println("activity:", ev.Activity)
	case "heartbeat":
		fmt.Println("heartbeat:", ev.RunID, ev.TS)
	case "delta":
		fmt.Print(ev.Delta)
	case "done":
		fmt.Println("\nfinal:", ev.Result)
	case "error":
		fmt.Println("\nerror:", ev.Error)
	}
})
if resp.Error != "" {
	log.Fatal(resp.Error)
}
```

Every streamed event includes a per-attempt `RunID` and timestamp `TS`. `start` and `heartbeat` are emitted by the library itself, so callers can supervise long-running backends without backend-specific parsing.
Beyond the library-emitted `start` and `heartbeat`, the set of backend-driven event types is intentionally small: `session`, `activity`, `delta`, `done`, and `error`.
## Portable Threads
Use `RunWithThread` or `RunWithThreadStream` to maintain conversation history across runs — even across different backends:
```go
resp := client.RunWithThread(oneagent.RunOpts{
	Backend:  "codex",
	ThreadID: "auth-fix",
	Prompt:   "continue debugging",
	CWD:      "/path/to/project",
	Source:   "my-app", // tags turns with who produced them
})
```

This gives you:
- Native session reuse when continuing on the same backend
- Automatic context replay when switching to a different backend
- Local thread storage with compaction to keep long conversations manageable
- File locking — safe for multiple processes to share a thread concurrently
- Turn attribution — each turn records its `Source` so you can distinguish bot, cron, and user turns
Thread management:
```go
ids, err := client.ListThreads()
thread, err := client.LoadThread("auth-fix")
err = client.CompactThread("auth-fix", "claude")
```

## Custom Thread Storage
By default, threads are stored on disk at `~/.local/state/oneagent/threads/`. To store threads elsewhere (for example, in a database or an isolated directory), inject a custom store:
```go
store := oneagent.FilesystemStore{Dir: "/tmp/my-app-threads"}
client := oneagent.Client{
	Backends: backends,
	Store:    store,
}
```

You can also implement the `Store` interface for fully custom storage:
```go
type Store interface {
	LoadThread(id string) (*Thread, error)
	SaveThread(thread *Thread) error
	ListThreads() ([]string, error)
}
```

## Hooks
Use `PreRun` and `PostRun` callbacks to run logic before and after agent execution:
```go
resp := client.Run(oneagent.RunOpts{
	Backend:  "claude",
	ThreadID: "daily-review",
	Prompt:   "summarize today",
	Source:   "cron-nightly",
	PreRun: func(opts *oneagent.RunOpts) error {
		// modify opts, set up worktree, validate environment
		opts.CWD = "/tmp/workspace"
		return nil // return error to abort
	},
	PostRun: func(ctx *oneagent.HookContext) {
		// notify, clean up, log — receives the original prompt, not replay-expanded
		fmt.Println("Done:", ctx.Response.Result)
	},
})
```

For shell-based hooks (useful from the CLI or config), use `PreRunCmd` and `PostRunCmd`:
```go
resp := client.Run(oneagent.RunOpts{
	Backend:    "claude",
	Prompt:     "summarize today",
	PostRunCmd: "curl -s -X POST https://hooks.example.com/notify -d @-",
})
```

Post-run shell hooks receive the result on stdin and these environment variables: `OA_BACKEND`, `OA_THREAD_ID`, `OA_SOURCE`, `OA_MODEL`, `OA_CWD`, `OA_SESSION`, `OA_ERROR`, `OA_EXIT`. Pre-run hooks receive the same set minus session/error/exit.
Hooks from the backend config (`pre_run`/`post_run` fields) and hooks from `RunOpts` both execute — config hooks first, then per-invocation hooks. Pre-run hooks abort the run on non-zero exit; post-run hooks are best-effort.
See the Hooks guide for the full reference.
## Preflight Checks
Validate that a backend is runnable before queuing work. This catches missing CLI binaries and auth issues instantly:
```go
// Check via Client (looks up the backend by name)
if err := client.PreflightCheck("gemini"); err != nil {
	log.Fatalf("backend not ready: %v", err)
}

// Or check a Backend directly (no Client needed)
b := backends["claude"]
if err := oneagent.PreflightCheckBackend("claude", b); err != nil {
	log.Fatalf("backend not ready: %v", err)
}
```

Preflight performs two checks:
- Binary resolution — verifies the CLI binary exists in `$PATH` or the backend's configured `paths`.
- Probe (optional) — if the backend has a `probe` command configured (e.g. `claude --version`), it runs it with a 10-second timeout. This catches missing API keys, expired auth tokens, or broken installations.
All built-in backends ship with default probe commands. Custom backends can add one via the `probe` field in `backends.json`.
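For illustration, a custom backend entry with a probe might look like the fragment below. Only the `probe` field name comes from this guide; the surrounding shape (a map of backend names to their settings) and the `mycli` backend are assumptions for the sketch:

```json
{
  "mycli": {
    "probe": "mycli --version"
  }
}
```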
## Typical Integration Pattern
Most applications follow this pattern:
- Keep your own app-level state (selected backend, model, current thread ID).
- Call `client.RunWithThreadStream` with the current state.
- Render `activity` and `delta` events incrementally in your UI.
- Finalize the UI with `resp.Result` once the stream ends.
See `examples/consumer.md` for a complete, runnable example.