Feature/ollama support #98

Open: wants to merge 9 commits into base branch main.

Conversation

@hridesh-net (Contributor) commented May 2, 2025

Ollama Support

This PR adds Ollama support for running models on a local system alongside OpenAI.

Note: this PR includes a code-structure change in backend/core/.


Summary by mrge

Added support for using Ollama as a local LLM provider alongside OpenAI, allowing users to choose between providers in the backend.

  • New Features
    • Added Ollama client and provider selection logic.
    • Updated environment variables to support Ollama configuration.
    • Refactored backend code to support multiple LLM providers.

@orhanrauf (Contributor) commented:

Awesome @hridesh-net! This looks great. Can you check what's going on with the Ruff error? I'll figure out the issues in the backend tests meanwhile.

@orhanrauf (Contributor) commented:

Hey @hridesh-net! We introduced a merge conflict as we added some configs as well. Just solved them.

The backend tests will continue failing as we're working on an overhaul (prioritizing speed of delivery over these tests for the coming 2-3 days).

Let me know once you've taken a look at the Ruff errors and we can try to move this forward!

@hridesh-net (Contributor, Author) commented:

@orhanrauf sure, I'll resolve these Ruff errors today.

@luca-de-dominicis commented Jun 11, 2025

Any updates on this?

3 participants