AnthropicAgent and OpenAIAgent are thin wrappers around the AI SDK ToolLoopAgent with class-style ergonomics matching the CLI agents.

Import

import {
  AnthropicAgent,
  OpenAIAgent,
  tools,
} from "smithers-orchestrator";
import { stepCountIs } from "ai";

Quick Start

const claude = new AnthropicAgent({
  model: "claude-opus-4-6",
  tools,
  instructions: "You are a careful planner.",
  stopWhen: stepCountIs(40),
});

const codex = new OpenAIAgent({
  model: "gpt-5.3-codex",
  tools,
  instructions: "You are a precise implementation agent.",
  stopWhen: stepCountIs(40),
});
{/* outputs comes from createSmithers() */}
<Task id="plan" output={outputs.plan} agent={claude}>
  Analyze the repository and propose a migration plan.
</Task>

Model Input

Both classes accept a model ID string ("claude-opus-4-6", "gpt-5.3-codex") or a prebuilt AI SDK language model instance.
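As a sketch of the two forms (assuming the default Anthropic provider export from @ai-sdk/anthropic is installed):

```typescript
import { anthropic } from "@ai-sdk/anthropic";
import { AnthropicAgent, tools } from "smithers-orchestrator";

// Form 1: model-ID string; the wrapper resolves it to a provider model.
const byId = new AnthropicAgent({
  model: "claude-opus-4-6",
  tools,
});

// Form 2: prebuilt AI SDK language model instance, passed through as-is.
const byInstance = new AnthropicAgent({
  model: anthropic("claude-opus-4-6"),
  tools,
});
```

Both agents behave identically; the string form is shorter, the instance form gives you full control over provider construction.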

Options

Constructors forward standard AI SDK ToolLoopAgent settings: instructions, tools, stopWhen, maxOutputTokens, temperature, providerOptions, prepareCall. The wrapper adds model, which resolves model-ID strings automatically. For OpenAIAgent, pass baseURL and apiKey directly when targeting an OpenAI-compatible endpoint instead of the default OpenAI API. This is the simplest path for local servers such as llama.cpp:
const local = new OpenAIAgent({
  model: "llama-3.1-8b-instruct",
  baseURL: "http://127.0.0.1:8080/v1",
  apiKey: "none",
  tools,
  instructions: "You are a local coding assistant.",
  stopWhen: stepCountIs(40),
});
Set apiKey: "none" in the OpenAIAgent config when your local server accepts OpenAI-compatible requests but does not require a real key.

Some OpenAI-compatible local servers accept chat requests but do not reliably implement JSON schema structured output. For those servers, keep the output schema on the Smithers task and disable native structured output on the agent so Smithers uses prompt-based JSON extraction instead:
const local = new OpenAIAgent({
  model: "llama-3.1-8b-instruct",
  baseURL: "http://127.0.0.1:8080/v1",
  apiKey: "none",
  nativeStructuredOutput: false,
  tools,
  instructions: "You are a local coding assistant.",
  stopWhen: stepCountIs(40),
});
For advanced provider setup, create the AI SDK OpenAI provider yourself and pass the prebuilt model into OpenAIAgent:
import { createOpenAI } from "@ai-sdk/openai";

const localOpenAI = createOpenAI({
  baseURL: "http://127.0.0.1:8080/v1",
  apiKey: "none",
});

const local = new OpenAIAgent({
  model: localOpenAI("llama-3.1-8b-instruct"),
  tools,
  instructions: "You are a local coding assistant.",
  stopWhen: stepCountIs(40),
});
Use the createOpenAI path when you need provider-level configuration beyond baseURL and apiKey; in that form, apiKey: "none" belongs in the createOpenAI config.
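For instance, createOpenAI also accepts provider-level options such as custom headers; a sketch (the header name here is illustrative, not required by any server):

```typescript
import { createOpenAI } from "@ai-sdk/openai";
import { OpenAIAgent, tools } from "smithers-orchestrator";

// Provider-level configuration beyond baseURL/apiKey, e.g. headers
// attached to every request (useful behind proxies or gateways).
const gateway = createOpenAI({
  baseURL: "http://127.0.0.1:8080/v1",
  apiKey: "none",
  headers: { "X-Proxy-Tag": "smithers" }, // illustrative header
});

const local = new OpenAIAgent({
  model: gateway("llama-3.1-8b-instruct"),
  tools,
  instructions: "You are a local coding assistant.",
});
```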

Hijack Support

SDK agents do not reopen a provider-native CLI. Smithers persists the agent conversation and reopens it through a Smithers-managed REPL via smithers hijack <runId>. Live-run behavior:
  • Smithers captures response history after each step via onStepFinish.
  • smithers hijack waits until history is durable, cancels the live run, and opens the REPL.
  • On clean REPL exit, Smithers writes updated message history back and resumes the workflow automatically.
Limits:
  • Conversation hijack stays on the same agent implementation. Cross-engine hijack is not supported.
  • Smithers reconstructs the original task agent from the workflow source.

CLI vs SDK

             CLI Agents                         SDK Agents
Billing      Provider subscription / local CLI  API billing
Tools        Provider CLI tool ecosystem        Smithers tools sandbox
Flexibility  Native CLI flags                   AI SDK providerOptions
Pass a raw ToolLoopAgent directly if you prefer; the wrappers are a convenience, not a separate runtime.
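A sketch of the raw form (the "ai" import path for ToolLoopAgent is an assumption; a raw agent takes a prebuilt model, since there is no model-ID resolution outside the wrappers):

```typescript
import { ToolLoopAgent, stepCountIs } from "ai"; // import path assumed
import { anthropic } from "@ai-sdk/anthropic";
import { tools } from "smithers-orchestrator";

// Equivalent to the AnthropicAgent quick-start, minus model-ID resolution.
const raw = new ToolLoopAgent({
  model: anthropic("claude-opus-4-6"),
  tools,
  instructions: "You are a careful planner.",
  stopWhen: stepCountIs(40),
});
```

A raw agent can then be passed to a Task the same way as a wrapped one.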

Example: Dual Setup

const useCli = process.env.USE_CLI_AGENTS === "1";

export const claude = useCli
  ? new ClaudeCodeAgent({
      model: "claude-opus-4-6",
      dangerouslySkipPermissions: true,
    })
  : new AnthropicAgent({
      model: "claude-opus-4-6",
      tools,
      instructions: "You are a careful planner.",
      stopWhen: stepCountIs(40),
    });

Next Steps