```sh
mkdir my-agent && cd my-agent
npm init -y
npm install @openharness/core @ai-sdk/openai
```
Set your API key:
```sh
export OPENAI_API_KEY=sk-...
```
2. Define the agent
Create an agent.ts file:
agent.ts
```ts
import {
  Agent,
  createFsTools,
  createBashTool,
  NodeFsProvider,
  NodeShellProvider,
} from "@openharness/core";
import { openai } from "@ai-sdk/openai";

const fsTools = createFsTools(new NodeFsProvider());
const { bash } = createBashTool(new NodeShellProvider());

const agent = new Agent({
  name: "dev",
  model: openai("gpt-5.4"),
  systemPrompt: "You are a helpful coding assistant.",
  tools: { ...fsTools, bash },
  maxSteps: 20,
});
```
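Notice that the filesystem and bash tools are constructed from provider objects rather than touching Node APIs directly. The sketch below illustrates that dependency-injection pattern in a self-contained way; the interface and factory names (`FsProvider`, `MemoryFsProvider`, `makeReadTool`) are hypothetical stand-ins, not the actual `@openharness/core` types:

```typescript
// Hypothetical provider interface: tools are built against a small
// abstraction, so the Node-backed provider can be swapped for a mock.
interface FsProvider {
  readFile(path: string): Promise<string>;
  listDir(path: string): Promise<string[]>;
}

// An in-memory provider, handy for testing an agent's tools offline.
class MemoryFsProvider implements FsProvider {
  constructor(private files: Record<string, string>) {}
  async readFile(path: string): Promise<string> {
    if (!(path in this.files)) throw new Error(`no such file: ${path}`);
    return this.files[path];
  }
  async listDir(_path: string): Promise<string[]> {
    return Object.keys(this.files);
  }
}

// A tool factory closes over whichever provider it is given.
function makeReadTool(fs: FsProvider) {
  return async (path: string) => fs.readFile(path);
}

async function demo() {
  const fs = new MemoryFsProvider({ "a.ts": "export {};" });
  const read = makeReadTool(fs);
  console.log(await read("a.ts")); // prints: export {};
}
demo();
```

The payoff of this shape is testability: the same tool code runs against a real `NodeFsProvider` in production and an in-memory fake in tests.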
3. Run the agent
Add a simple loop that streams the agent’s output:
agent.ts
```ts
import type { ModelMessage } from "ai";

let messages: ModelMessage[] = [];

for await (const event of agent.run(messages, "What files are in this directory?")) {
  switch (event.type) {
    case "text.delta":
      process.stdout.write(event.text);
      break;
    case "tool.start":
      console.log(`\nCalling ${event.toolName}...`);
      break;
    case "done":
      messages = event.messages;
      break;
  }
}
```
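The `switch` on `event.type` works because agent events form a TypeScript discriminated union: inside each `case`, the compiler narrows `event` to the matching variant. A self-contained sketch of the pattern, using a hypothetical three-variant union (the library's actual union is presumably richer):

```typescript
// Hypothetical event union discriminated on the `type` field.
type AgentEvent =
  | { type: "text.delta"; text: string }
  | { type: "tool.start"; toolName: string }
  | { type: "done"; messages: string[] };

function describe(event: AgentEvent): string {
  switch (event.type) {
    case "text.delta":
      // Narrowed here: `event.text` exists, `event.toolName` does not.
      return `text: ${event.text}`;
    case "tool.start":
      return `tool: ${event.toolName}`;
    case "done":
      return `done after ${event.messages.length} messages`;
  }
}

console.log(describe({ type: "tool.start", toolName: "bash" })); // prints: tool: bash
```

Because the union is exhaustive, adding a new event variant makes the compiler flag every `switch` that does not yet handle it.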
For interactive conversations with automatic compaction and retry, wrap the agent in a Session:
```ts
import { Session } from "@openharness/core";

const session = new Session({
  agent,
  contextWindow: 128_000,
});

// First turn
for await (const event of session.send("List all TypeScript files")) {
  if (event.type === "text.delta") process.stdout.write(event.text);
}

// Second turn — session remembers the conversation
for await (const event of session.send("Now refactor the largest one")) {
  if (event.type === "text.delta") process.stdout.write(event.text);
}
```
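The `contextWindow` option gives the session a token budget to compact against. As a rough mental model (this is an illustrative sketch, not `Session`'s actual algorithm), compaction keeps the newest messages that fit the budget and drops the oldest first:

```typescript
// Illustrative only: the idea behind context-window compaction.
type Msg = { role: "user" | "assistant"; content: string };

// Crude token estimate: roughly 4 characters per token (an assumption,
// not the library's tokenizer).
const estimateTokens = (m: Msg) => Math.ceil(m.content.length / 4);

function compact(history: Msg[], budget: number): Msg[] {
  const kept: Msg[] = [];
  let used = 0;
  // Walk from newest to oldest, keeping messages while they fit.
  for (let i = history.length - 1; i >= 0; i--) {
    const cost = estimateTokens(history[i]);
    if (used + cost > budget) break;
    kept.unshift(history[i]);
    used += cost;
  }
  return kept;
}

const history: Msg[] = [
  { role: "user", content: "x".repeat(400) },      // ~100 tokens
  { role: "assistant", content: "y".repeat(400) }, // ~100 tokens
  { role: "user", content: "z".repeat(40) },       // ~10 tokens
];
console.log(compact(history, 120).length); // prints: 2
```

A real implementation would typically summarize the dropped prefix rather than discard it outright, so earlier context survives in condensed form.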