Pull requests: containers/ramalama

Open pull requests

fix: vLLM serving and model mounting
#1571 opened Jun 20, 2025 by kush-gupt

Adds the ability to pass files to ramalama run
#1570 opened Jun 19, 2025 by ieaves

vllm containers
#1560 opened Jun 18, 2025 by afazekas

Remove libexec files
#1504 opened Jun 11, 2025 by ericcurtin

Start process of moving python-ramalama to ramalama
#1498 opened Jun 10, 2025 by smooge

cpu type rpc worker
#1485 opened Jun 9, 2025 by afazekas

Turn on GGML_VULKAN by default
#1434 opened May 22, 2025 by ericcurtin (Draft)

python package fixes
#1411 opened May 15, 2025 by ericcurtin (Draft)

Some initial proxy code
#1337 opened May 4, 2025 by ericcurtin

[PoC] RAG support for ramalama client command
#1228 opened Apr 19, 2025 by Tojaj

Add support for kserve
#877 opened Feb 24, 2025 by rhatdan