Chat with your Letta agents over a low-latency voice connection. Advanced voice mode, but with advanced memory.


Stateful Voice Agents

This repo shows how to use Letta and LiveKit to create low-latency voice agents with memory, tool execution, and persistence.

Installation & Setup

First, install the basic requirements in a virtual environment (Python >= 3.10):

git clone [email protected]:letta-ai/letta-voice.git
cd letta-voice 
pip install -r requirements.txt

For this example, you will need accounts with the following providers:

  • Letta (Letta Cloud, or a self-hosted Letta server)
  • LiveKit (voice transport)
  • Deepgram (speech-to-text)
  • Cartesia (text-to-speech)

You will also need to set the following environment variables (or create a .env file):

LETTA_API_KEY=... # Letta Cloud API key (if using cloud)

LIVEKIT_URL=wss://<YOUR-ROOM>.livekit.cloud # LiveKit URL
LIVEKIT_API_KEY=... # LiveKit API key
LIVEKIT_API_SECRET=... # LiveKit API secret

DEEPGRAM_API_KEY=... # Deepgram API key
CARTESIA_API_KEY=... # Cartesia API key
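If you keep these values in a .env file, they need to be loaded into the process environment before the agent starts. The sketch below is a minimal stdlib-only loader for simple KEY=VALUE files; if your setup already includes python-dotenv, its load_dotenv() does the same job with full .env syntax support:

```python
import os

def load_env_file(path: str = ".env") -> dict:
    """Load simple KEY=VALUE lines from a .env file into os.environ.

    Skips blank lines and comments. Does not handle quoting or
    'export' prefixes; use python-dotenv for full .env syntax.
    """
    loaded = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            loaded[key.strip()] = value.strip()
    os.environ.update(loaded)
    return loaded
```

Call load_env_file() at the top of your entrypoint so keys like LETTA_API_KEY and LIVEKIT_URL are visible to the SDKs that read them from the environment.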

Connecting Letta to Voice

  1. Run python main.py dev
  2. Go to the Livekit Agents Playground: https://agents-playground.livekit.io/
  3. Chat with your agent

Running with a self-hosted Letta Server

Running Letta

To run Letta, you can either install and run Letta Desktop or run a Letta service with Docker:

docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -p 8283:8283 \
  -e OPENAI_API_KEY=${OPENAI_API_KEY} \
  letta/letta:latest
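Once the container is up, it helps to confirm the server is reachable before connecting the voice agent to it. A small stdlib-only polling helper, sketched below; the /v1/health/ path is an assumption about the server's health endpoint, so adjust it if your Letta version exposes a different one:

```python
import time
import urllib.error
import urllib.request

def wait_for_letta(base_url: str = "http://localhost:8283",
                   timeout: float = 30.0) -> bool:
    """Poll the server's health endpoint until it responds or the timeout expires."""
    url = base_url.rstrip("/") + "/v1/health/"  # assumed health endpoint
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                if resp.status == 200:
                    return True
        except (urllib.error.URLError, OSError):
            time.sleep(1)  # server not up yet; retry until the deadline
    return False
```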

See Letta's full quickstart and installation instructions here.

Running ngrok

If you are self-hosting the Letta server locally (at localhost), you will need to use ngrok to expose your Letta server to the internet:

  1. Create an account on ngrok
  2. Create an auth token and add it to your CLI:
ngrok config add-authtoken <YOUR_AUTH_TOKEN>
  3. Point ngrok at your Letta server:
ngrok http http://localhost:8283

Now, you should have a forwarding URL like https://<YOUR_FORWARDING_URL>.ngrok.app, which you can pass in as the base_url to openai.LLM.with_letta(...).
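Before handing the forwarding URL to the plugin, it can be worth validating that it is the https tunnel URL rather than the local address. A small sketch of that check; the subdomain in the example is hypothetical, and the commented-out with_letta call is the one referenced above with the surrounding agent wiring omitted:

```python
from urllib.parse import urlparse

def letta_base_url(forwarding_url: str) -> str:
    """Validate an ngrok forwarding URL for use as the Letta base_url."""
    parsed = urlparse(forwarding_url)
    if parsed.scheme != "https" or not parsed.netloc:
        raise ValueError(f"expected an https ngrok URL, got {forwarding_url!r}")
    return forwarding_url.rstrip("/")

# Usage (hypothetical subdomain; agent setup omitted):
# llm = openai.LLM.with_letta(
#     base_url=letta_base_url("https://my-tunnel.ngrok.app"),
# )
```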
