The Letta provider is available in the `@letta-ai/vercel-ai-sdk-provider` module. You can install it with:

```bash
npm i @letta-ai/vercel-ai-sdk-provider
```
You can import the default provider instance `letta` from `@letta-ai/vercel-ai-sdk-provider`:

```ts
import { letta } from '@letta-ai/vercel-ai-sdk-provider';
```
### Using Letta Cloud (https://api.letta.com)

Create a file called `.env.local` and add your API key:

```
LETTA_API_KEY=<your api key>
```

```ts
import { lettaCloud } from '@letta-ai/vercel-ai-sdk-provider';
import { generateText } from 'ai';

const { text } = await generateText({
  model: lettaCloud('your-agent-id'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});
```
### Local instances (http://localhost:8283)

```ts
import { lettaLocal } from '@letta-ai/vercel-ai-sdk-provider';
import { generateText } from 'ai';

const { text } = await generateText({
  model: lettaLocal('your-agent-id'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});
```
### Custom setups

If your Letta server runs somewhere else, create your own provider instance with `createLetta`:

```ts
import { createLetta } from '@letta-ai/vercel-ai-sdk-provider';
import { generateText } from 'ai';

const letta = createLetta({
  baseUrl: '<your-base-url>',
  token: '<your-access-token>',
});

const { text } = await generateText({
  model: letta('your-agent-id'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});
```
The `vercel-ai-sdk-provider` extends the Letta Node.js client; you can access its operations directly through `lettaCloud.client`, `lettaLocal.client`, or the `client` property of your custom `createLetta` instance:

```ts
import { lettaCloud } from '@letta-ai/vercel-ai-sdk-provider';

// List all agents on your Letta Cloud account.
lettaCloud.client.agents.list();
```
Check out our simple example using Next.js to stream Letta messages to your frontend in `examples/letta-ai-sdk-example`.