Composable Architecture
Mix and match runtimes, state stores, and LLM adapters: swap them without changing your agent code, and use the pre-built components or build your own execution loop.
```ts
import { defineAgent, defineTool } from '@helix-agents/core';
import { JSAgentExecutor, InMemoryStateStore, InMemoryStreamManager } from '@helix-agents/sdk';
import { VercelAIAdapter } from '@helix-agents/llm-vercel';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';

// Define a tool
const searchTool = defineTool({
  name: 'search',
  description: 'Search the web for information',
  inputSchema: z.object({
    query: z.string().describe('Search query'),
  }),
  outputSchema: z.object({
    results: z.array(z.string()),
  }),
  execute: async ({ query }) => ({
    results: [`Results for: ${query}`],
  }),
});

// Define an agent
const ResearchAgent = defineAgent({
  name: 'researcher',
  systemPrompt: 'You are a helpful research assistant.',
  tools: [searchTool],
  outputSchema: z.object({
    summary: z.string(),
    sources: z.array(z.string()),
  }),
  llmConfig: {
    model: openai('gpt-4o'),
  },
});

// Create infrastructure and execute
const executor = new JSAgentExecutor(
  new InMemoryStateStore(),
  new InMemoryStreamManager(),
  new VercelAIAdapter()
);

const handle = await executor.execute(ResearchAgent, 'Research AI agents');

// Stream results
for await (const chunk of await handle.stream()) {
  if (chunk.type === 'text_delta') {
    process.stdout.write(chunk.delta);
  }
}

const result = await handle.result();
console.log(result.output);
```

| Package | Description |
|---|---|
| @helix-agents/core | Types, interfaces, pure orchestration functions |
| @helix-agents/sdk | Quick-start bundle (core + memory + JS runtime) |
| @helix-agents/runtime-js | In-process JavaScript execution |
| @helix-agents/runtime-temporal | Durable workflows via Temporal |
| @helix-agents/runtime-cloudflare | Edge deployment on Cloudflare |
| @helix-agents/store-memory | In-memory state (development) |
| @helix-agents/store-redis | Redis state (production) |
| @helix-agents/llm-vercel | Vercel AI SDK adapter |
| @helix-agents/ai-sdk | Frontend integration for AI SDK UI |
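Because the executor only depends on these interfaces, moving from the in-memory development setup to a production stack means constructing it with different pieces; the agent definition itself does not change. The sketch below reuses the `ResearchAgent` defined above and assumes the Redis store package exports a `RedisStateStore` class that takes a connection URL; treat that class name and its options as illustrative assumptions rather than the package's documented API.

```ts
// A minimal sketch of swapping infrastructure while reusing the ResearchAgent
// defined above. RedisStateStore and its constructor options are assumptions;
// check @helix-agents/store-redis for the actual export and configuration.
import { JSAgentExecutor, InMemoryStreamManager } from '@helix-agents/sdk';
import { VercelAIAdapter } from '@helix-agents/llm-vercel';
import { RedisStateStore } from '@helix-agents/store-redis'; // assumed export name

const productionExecutor = new JSAgentExecutor(
  new RedisStateStore({ url: process.env.REDIS_URL! }), // assumed options shape
  new InMemoryStreamManager(),
  new VercelAIAdapter()
);

// Same agent, same call site, different infrastructure.
const handle = await productionExecutor.execute(ResearchAgent, 'Research AI agents');
```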
Helix Agents is built around composability. Every major component is an interface: the executor runtime, the state store, the stream manager, and the LLM adapter can each be swapped independently.
You can use the pre-built implementations, or implement the interfaces yourself. The core package provides pure functions like `planStepProcessing()` and `buildMessagesForLLM()` that you can compose into your own execution loop.
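As a rough illustration of that composition, the sketch below wires those functions into a hand-rolled loop. Only the function names come from `@helix-agents/core`; the loop state and the argument and return shapes shown here are assumptions made for the example, and the model call goes straight through the Vercel AI SDK's `generateText()` rather than an adapter.

```ts
// A hand-rolled execution loop composed from the core package's pure functions.
// The state shape and the argument/return shapes used here are assumptions for
// illustration only; see @helix-agents/core for the real signatures.
import { planStepProcessing, buildMessagesForLLM } from '@helix-agents/core';
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

async function runCustomLoop(agentDef: { systemPrompt: string }, prompt: string) {
  // Hypothetical loop state; the library defines its own state types.
  let state: any = { agentDef, prompt, history: [], done: false };

  while (!state.done) {
    // Assumed to turn the current state into the message list for this step.
    const messages = buildMessagesForLLM(state);

    // Call the model directly with the Vercel AI SDK.
    const { text } = await generateText({ model: openai('gpt-4o'), messages });

    // Assumed to fold the response back into the state and decide whether to
    // run tools, continue, or finish.
    state = planStepProcessing(state, { text });
  }

  return state;
}
```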
Use the pre-built pieces or build your own; the choice is yours.