Description
Describe the bug
The latest version of Letta does not work with Ollama. I cannot use OpenAI because of restrictions on my side, so this is a show stopper for me.
Please describe your setup
How are you running Letta?
- Docker
Describe your setup
- What's your OS (Windows/MacOS/Linux)? macOS
- What is your `docker run ...` command (if applicable)?

```bash
docker run \
  -v ~/.letta/.persist/pgdata:/var/lib/postgresql/data \
  -v /Users/rmuraly/ttk:/tiktoken_cache \
  -p 8283:8283 \
  -e OLLAMA_BASE_URL="http://host.docker.internal:11434" \
  -e TIKTOKEN_CACHE_DIR="/tiktoken_cache" \
  letta/letta:latest
```
ERRORS

```text
  File "/projects/letta/create_agent.py", line 6, in <module>
    genie = client.agents.create(
            ^^^^^^^^^^^^^^^^^^^^^
  File "/.pyenv/versions/3.11.9/lib/python3.11/site-packages/letta_client/agents/client.py", line 458, in create
    raise ApiError(status_code=_response.status_code, body=_response_json)
letta_client.core.api_error.ApiError: status_code: 500, body: {'detail': "Client error '404 Not Found' for url 'http://host.docker.internal:11434/models'\nFor more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/404"}
```
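For reference, create_agent.py is essentially the following minimal script (a sketch, not verbatim: the `Letta(base_url=...)` construction and the `ollama/<model>` handle format are my best reading of the letta_client usage, and the exact `create` keyword arguments may vary by version):

```python
# create_agent.py -- minimal reproduction (sketch; exact kwargs may vary by letta_client version)
from letta_client import Letta

# Letta server started by the docker run command above
client = Letta(base_url="http://localhost:8283")

# This is the call that raises ApiError 500 (upstream 404 on .../models)
genie = client.agents.create(
    model="ollama/mistral:latest",               # assumed handle format for Ollama models
    embedding="ollama/nomic-embed-text:latest",  # assumed handle format for embeddings
)
```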
Additional context
- LLM model: Mistral:latest
- Embedding model: nomic-embed-text:latest
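The 404 suggests Letta is requesting `/models` directly under OLLAMA_BASE_URL, while Ollama serves its model list at `/api/tags` (native API) or `/v1/models` (OpenAI-compatible API). A quick check from the host shows which paths the server actually answers (the bare `/models` path is the one from the error above):

```python
# check_ollama.py -- probe the paths Ollama actually serves on this machine
import urllib.error
import urllib.request

BASE = "http://localhost:11434"  # same server the container reaches as host.docker.internal

for path in ("/models", "/api/tags", "/v1/models"):
    try:
        with urllib.request.urlopen(BASE + path) as resp:
            print(path, "->", resp.status)
    except urllib.error.HTTPError as e:
        print(path, "->", e.code)
```

If `/api/tags` answers 200 while `/models` is 404, Ollama itself is healthy and the failing request path is built on the Letta side.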
Agent File (optional)
Please attach your .af file, as this helps with reproducing issues.
No .af file is attached: the failure happens during agent creation, before any agent exists to export.
If you're not using OpenAI, please provide additional information on your local LLM setup:
Local LLM details
If you are trying to run Letta with local LLMs, please provide the following information:
- The exact model you're trying to use (e.g. dolphin-2.1-mistral-7b.Q6_K.gguf): Mistral:latest, with nomic-embed-text:latest for embeddings
- The local LLM backend you are using (web UI? LM Studio?): Ollama
- Your hardware for the local LLM backend (local computer? operating system? remote RunPod?): local Mac; Letta runs in Docker and Ollama runs on the host