Feature/ollama support #98
base: main
Conversation
Awesome @hridesh-net! This looks great. Can you check what's going on in the Ruff error? I'll figure out the issues in the backend tests meanwhile.
Hey @hridesh-net! We introduced a merge conflict as we added some configs as well; just solved it. The backend tests will continue failing as we're working on an overhaul (prioritizing speed of delivery over these tests for the coming 2-3 days). Let me know when you've taken a look at the Ruff errors and we can try to move this forward!
@orhanrauf Sure, I'll resolve these Ruff errors today.
Any updates on this?
Ollama Support
This PR adds Ollama support for local systems alongside OpenAI.
Note: This PR includes a code structure change in backend/core/.
Summary by mrge
Added support for using Ollama as a local LLM provider alongside OpenAI, allowing users to choose between providers in the backend.
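For readers wondering what "choosing between providers" could look like, here is a minimal sketch, not the PR's actual implementation: it assumes the `openai` Python SDK and Ollama's OpenAI-compatible endpoint, and the environment variable names (`LLM_PROVIDER`, `OLLAMA_BASE_URL`, `OLLAMA_MODEL`) and the `get_llm_client` helper are illustrative, not taken from this codebase.

```python
# Hypothetical sketch, not this PR's code. Assumes the `openai` SDK (>=1.0)
# and a local Ollama server exposing its OpenAI-compatible API.
import os

from openai import OpenAI


def get_llm_client() -> OpenAI:
    """Return a chat client for the configured provider.

    LLM_PROVIDER, OLLAMA_BASE_URL, and OLLAMA_MODEL are illustrative
    environment variable names, not ones defined by this PR.
    """
    provider = os.getenv("LLM_PROVIDER", "openai").lower()
    if provider == "ollama":
        # Ollama serves an OpenAI-compatible API, so the same SDK works;
        # the SDK requires an api_key, but Ollama ignores its value.
        return OpenAI(
            base_url=os.getenv("OLLAMA_BASE_URL", "http://localhost:11434/v1"),
            api_key="ollama",
        )
    return OpenAI(api_key=os.environ["OPENAI_API_KEY"])


if __name__ == "__main__":
    client = get_llm_client()
    response = client.chat.completions.create(
        model=os.getenv("OLLAMA_MODEL", "llama3"),
        messages=[{"role": "user", "content": "Hello!"}],
    )
    print(response.choices[0].message.content)
```

Routing both providers through one OpenAI-compatible client keeps a single code path in the backend, which is one plausible way to let users switch providers as the summary describes.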