# Providers

All adapters are exported from the package root:

```ts
import {
  AnthropicAdapter, OpenAIAdapter, BedrockAdapter,
  BrowserAnthropicAdapter, BrowserOpenAIAdapter,
  mock,
} from 'agentfootprint';

// Shorthand factories
import { anthropic, openai, ollama, bedrock } from 'agentfootprint';
```

## Anthropic

```ts
import { anthropic } from 'agentfootprint';

const provider = anthropic('claude-sonnet-4-20250514');
```

Uses the `@anthropic-ai/sdk` package. Supports streaming, tool use, and extended thinking.

## OpenAI

```ts
import { openai } from 'agentfootprint';

const provider = openai('gpt-4o');
```

Uses the `openai` SDK. Supports streaming and tool use.

## AWS Bedrock

```ts
import { bedrock } from 'agentfootprint';

const provider = bedrock('anthropic.claude-sonnet-4-20250514-v1:0');
```

Uses the AWS SDK. Supports Claude on AWS Bedrock. Requires AWS credentials configured via the environment or SDK config.

```ts
// With explicit credentials
import { BedrockAdapter } from 'agentfootprint';

const provider = new BedrockAdapter({
  model: 'anthropic.claude-sonnet-4-20250514-v1:0',
  region: 'us-east-1',
});
```
## Ollama

```ts
import { ollama } from 'agentfootprint';

const provider = ollama('llama3');
```

Connects to a local Ollama instance via its OpenAI-compatible API.

## Mock

```ts
const provider = mock([
  { content: 'First response' },
  { content: '', toolCalls: [{ id: 'tc1', name: 'search', arguments: { q: 'test' } }] },
  { content: 'Final answer based on tool results' },
]);
```

Returns deterministic responses for testing, at $0 cost. Each run() consumes the next response in the list.
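The consume-in-order behavior is easy to model: each call pops the next canned response, and an exhausted queue is an error. A minimal self-contained sketch of that pattern (the `MockResponse` type and `MockQueue` class here are illustrative stand-ins, not the library's actual internals):

```ts
// Illustrative model of a mock response queue: each call consumes the next entry.
interface MockResponse {
  content: string;
  toolCalls?: { id: string; name: string; arguments: Record<string, unknown> }[];
}

class MockQueue {
  private i = 0;
  constructor(private responses: MockResponse[]) {}

  next(): MockResponse {
    if (this.i >= this.responses.length) {
      throw new Error('Mock exhausted: no responses left');
    }
    return this.responses[this.i++];
  }
}

const q = new MockQueue([
  { content: 'First response' },
  { content: 'Final answer based on tool results' },
]);

console.log(q.next().content); // 'First response'
console.log(q.next().content); // 'Final answer based on tool results'
```

Exhaustion throwing (rather than returning `undefined`) makes a test fail loudly when the agent makes more calls than the test scripted.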

## Browser adapters

For browser environments (no Node.js SDK dependency):

```ts
const provider = new BrowserAnthropicAdapter({
  apiKey: 'sk-ant-...',
  model: 'claude-sonnet-4-20250514',
});
```

```ts
const provider = new BrowserOpenAIAdapter({
  apiKey: 'sk-...',
  model: 'gpt-4o',
});
```

## Extended thinking

Claude models support extended thinking: the model reasons through complex problems before responding. agentfootprint captures thinking blocks as `thinking` stream events:

```ts
const provider = anthropic('claude-sonnet-4-20250514');

const agent = Agent.create({ provider })
  .system('You are a reasoning assistant.')
  .streaming(true)
  .build();

await agent.run('Solve this step by step: if x² + 3x - 10 = 0, find x', {
  onEvent: (event) => {
    if (event.type === 'thinking') {
      console.log('[thinking]', event.content);
    }
    if (event.type === 'token') {
      process.stdout.write(event.content);
    }
  },
});
```
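Routing on `event.type` also works for accumulating the two streams into separate buffers, e.g. to show reasoning in a collapsible panel while the answer renders. A self-contained sketch (the `StreamEvent` type is an assumption inferred from the fields used above, not the library's exported type):

```ts
// Route stream events into separate buffers by type.
type StreamEvent = { type: 'thinking' | 'token'; content: string };

function collect(events: StreamEvent[]): { thinking: string; answer: string } {
  let thinking = '';
  let answer = '';
  for (const e of events) {
    if (e.type === 'thinking') thinking += e.content;
    if (e.type === 'token') answer += e.content;
  }
  return { thinking, answer };
}

const out = collect([
  { type: 'thinking', content: 'Factor: (x + 5)(x - 2) = 0. ' },
  { type: 'token', content: 'x = -5 or x = 2' },
]);
console.log(out.answer); // 'x = -5 or x = 2'
```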

## Fallback

Automatic cross-family failover:

```ts
import { anthropic, openai } from 'agentfootprint';
import { fallbackProvider } from 'agentfootprint/resilience';

const provider = fallbackProvider([
  anthropic('claude-sonnet-4-20250514'), // try Claude first
  openai('gpt-4o'),                      // fall back to GPT-4o
]);
```
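The core of this pattern is simply trying each provider in order and moving on when one throws. A self-contained sketch of that logic (the minimal `ChatFn` shape is an assumption for illustration, not the library's `LLMProvider` interface):

```ts
// Try each provider in order; return the first success, rethrow the last failure.
type ChatFn = (prompt: string) => Promise<string>;

function withFallback(providers: ChatFn[]): ChatFn {
  return async (prompt) => {
    let lastError: unknown;
    for (const chat of providers) {
      try {
        return await chat(prompt);
      } catch (err) {
        lastError = err; // remember the failure and try the next provider
      }
    }
    throw lastError;
  };
}

// Usage: a flaky primary falls through to a working secondary.
const flaky: ChatFn = async () => { throw new Error('rate limited'); };
const steady: ChatFn = async (p) => `echo: ${p}`;

withFallback([flaky, steady])('hi').then(console.log); // 'echo: hi'
```

Only when every provider in the list fails does the caller see an error, and it is the last one thrown.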
## Custom providers

```ts
interface LLMProvider {
  chat(messages: Message[], options?: LLMCallOptions): Promise<LLMResponse>;
  chatStream?(messages: Message[], options?: LLMCallOptions): AsyncIterable<LLMStreamChunk>;
}
```

Implement this interface to connect any LLM.
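As a starting point, here is a self-contained echo provider implementing that shape. The `Message`, `LLMCallOptions`, and `LLMResponse` types are declared locally as minimal assumed shapes; the library's real definitions may carry more fields:

```ts
// Minimal local stand-ins for the library's types (assumed shapes).
interface Message { role: 'system' | 'user' | 'assistant'; content: string }
interface LLMCallOptions { temperature?: number }
interface LLMResponse { content: string }

interface LLMProvider {
  chat(messages: Message[], options?: LLMCallOptions): Promise<LLMResponse>;
}

// A trivial provider that echoes the last user message, useful as a wiring test.
class EchoProvider implements LLMProvider {
  async chat(messages: Message[]): Promise<LLMResponse> {
    const last = [...messages].reverse().find((m) => m.role === 'user');
    return { content: last ? `echo: ${last.content}` : '' };
  }
}

new EchoProvider()
  .chat([{ role: 'user', content: 'hello' }])
  .then((r) => console.log(r.content)); // 'echo: hello'
```

A real adapter would translate `messages` into the target LLM's wire format inside `chat`, and optionally implement `chatStream` to yield incremental chunks.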