# Helix Agents

**Composable AI Agents for TypeScript.** Sessions, runs, durable human-in-the-loop (HITL), and stateless suspension across every runtime. Swap runtimes, state stores, and LLM providers without changing your agent code.

## Migrating from v6?

The branch `omnara/stateless-suspension-redesign` is the v7 release train. v7 reshapes the framework around stateless suspension: the `runLoop` exits at every HITL boundary instead of holding in-memory promises, and resume creates a fresh execution that reads suspension context from the state store. See the v6 → v7 upgrade guide for the full migration walkthrough, including per-package breaking changes, storage migrations (V8/V9 D1, V5 DO, V5 Postgres), and code migration examples.
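In code, the shift looks roughly like the sketch below. The method names (`handle.status()`, `executor.resume()`, the `'suspended'` value) are illustrative assumptions, not the verbatim v7 surface; the upgrade guide documents the real signatures. Assume `executor` and an HITL-capable `ApprovalAgent` are wired up as in the Quick Example below.

```typescript
// Illustrative sketch of the v7 stateless-suspension flow.
// NOTE: `status()`, `resume()`, `runId`, and 'suspended' are assumed
// names for this sketch, not quoted from the framework's API.
const handle = await executor.execute(ApprovalAgent, 'Deploy to production');

// v6 held an in-memory promise across the HITL wait. In v7 the runLoop
// exits at the HITL boundary, so this process may terminate here without
// losing the run -- all suspension context lives in the state store.
if ((await handle.status()) === 'suspended') {
  // Later, possibly in a different process: resume creates a *fresh*
  // execution that reads the suspension context back from the state store.
  const resumed = await executor.resume(handle.runId, { approved: true });
  console.log(await resumed.result());
}
```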

## Quick Example

```typescript
import { defineAgent, defineTool } from '@helix-agents/core';
import { JSAgentExecutor, InMemoryStateStore, InMemoryStreamManager } from '@helix-agents/sdk';
import { VercelAIAdapter } from '@helix-agents/llm-vercel';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';

// Define a tool
const searchTool = defineTool({
  name: 'search',
  description: 'Search the web for information',
  inputSchema: z.object({
    query: z.string().describe('Search query'),
  }),
  outputSchema: z.object({
    results: z.array(z.string()),
  }),
  execute: async ({ query }) => ({
    results: [`Results for: ${query}`],
  }),
});

// Define an agent
const ResearchAgent = defineAgent({
  name: 'researcher',
  systemPrompt: 'You are a helpful research assistant.',
  tools: [searchTool],
  outputSchema: z.object({
    summary: z.string(),
    sources: z.array(z.string()),
  }),
  llmConfig: {
    model: openai('gpt-4o'),
  },
});

// Create infrastructure and execute
const executor = new JSAgentExecutor(
  new InMemoryStateStore(),
  new InMemoryStreamManager(),
  new VercelAIAdapter()
);

const handle = await executor.execute(ResearchAgent, 'Research AI agents');

// Stream results
for await (const chunk of await handle.stream()) {
  if (chunk.type === 'text_delta') {
    process.stdout.write(chunk.delta);
  }
}

const result = await handle.result();
console.log(result.output);
```
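Because the executor takes its state store, stream manager, and LLM adapter as constructor arguments, promoting this dev setup to production means swapping the wiring, not the agent. A minimal sketch, assuming `RedisStateStore` and `RedisStreamManager` are the export names of `@helix-agents/store-redis` (check the package for the actual names and constructor options):

```typescript
import { JSAgentExecutor } from '@helix-agents/sdk';
// Assumed export names -- see @helix-agents/store-redis for the real ones.
import { RedisStateStore, RedisStreamManager } from '@helix-agents/store-redis';
import { VercelAIAdapter } from '@helix-agents/llm-vercel';

// Same ResearchAgent, same stream/result calls -- only the wiring changes.
const executor = new JSAgentExecutor(
  new RedisStateStore({ url: process.env.REDIS_URL }), // hypothetical options
  new RedisStreamManager({ url: process.env.REDIS_URL }),
  new VercelAIAdapter()
);
```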

## Packages

### Core & SDK

| Package | Description |
| --- | --- |
| `@helix-agents/core` | Types, interfaces, pure orchestration functions |
| `@helix-agents/sdk` | Quick-start umbrella (core + memory + JS runtime) |

### Runtimes

| Package | Description |
| --- | --- |
| `@helix-agents/runtime-js` | In-process JavaScript execution |
| `@helix-agents/runtime-temporal` | Durable workflows via Temporal |
| `@helix-agents/runtime-cloudflare` | Cloudflare DO + Workflows (HITL on both paths in v7) |
| `@helix-agents/runtime-dbos` | Postgres-backed durable workflows via DBOS |

### State stores

| Package | Description |
| --- | --- |
| `@helix-agents/store-memory` | In-memory state (development) |
| `@helix-agents/store-redis` | Redis state + streams (production) |
| `@helix-agents/store-postgres` | PostgreSQL state (production, runtime-agnostic) |
| `@helix-agents/store-cloudflare` | D1 + Durable Objects (Cloudflare deployments) |

### LLM, memory, embeddings

| Package | Description |
| --- | --- |
| `@helix-agents/llm-vercel` | Vercel AI SDK adapter |
| `@helix-agents/memory` | In-process semantic memory (dev) |
| `@helix-agents/memory-redis` | Redis-backed semantic memory (production) |
| `@helix-agents/memory-cloudflare` | D1 + Vectorize + Queues memory store |
| `@helix-agents/embedding-vercel` | Vercel AI SDK embedding adapter |
| `@helix-agents/embedding-cloudflare` | Cloudflare Workers AI embeddings |

### Frontend, server, observability

| Package | Description |
| --- | --- |
| `@helix-agents/ai-sdk` | Frontend integration for the Vercel AI SDK |
| `@helix-agents/agent-server` | HTTP server for hosting agents remotely (11 v7 routes) |
| `@helix-agents/tracing-langfuse` | Langfuse tracing integration |

### Workspaces

| Package | Description |
| --- | --- |
| `@helix-agents/workspace-memory` | In-memory workspace provider (dev) |
| `@helix-agents/workspace-local-bash` | Local bash workspace provider |

## Philosophy

Helix Agents is built around composability. Every major component is an interface:

- **Runtime**: How agent loops execute (JS, Temporal, CF DO, CFW Workflows, DBOS)
- **State Store**: Where state persists (Memory, Redis, Postgres, D1, DO SQLite)
- **Stream Manager**: How events flow (Memory, Redis pub/sub, Durable Objects)
- **LLM Adapter**: Which model to use (Vercel AI SDK, custom)
- **Workspace Provider**: How agents touch a filesystem/shell (memory, local-bash, Cloudflare Sandbox/Filestore)

You can use the pre-built implementations or implement the interfaces yourself. The core package provides pure functions like `runStepIteration()`, `buildMessagesForLLM()`, and `executeCompanionToolDispatch()` that you can compose into your own execution loop.
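For instance, a hand-rolled loop might look like the sketch below. The signatures are paraphrased from the function names above, not copied from `@helix-agents/core`; treat the parameter and return shapes as assumptions.

```typescript
import { runStepIteration, buildMessagesForLLM } from '@helix-agents/core';

// Sketch of a custom execution loop composed from the core's pure functions.
// Parameter and return shapes here are assumptions, not the package's real
// types -- hence the loose `any` annotations.
async function runToCompletion(agent: any, initialState: any, llm: any) {
  let state = initialState;
  while (!state.done) {
    const messages = buildMessagesForLLM(agent, state); // assumed shape
    const step = await runStepIteration({ agent, state, messages, llm });
    state = step.nextState; // persist to your own store between steps if desired
  }
  return state.output;
}
```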

Use the pre-built pieces or build your own: the choice is yours.

Released under the MIT License.