cpu-only
Here are 7 public repositories matching this topic...
🦙 chat-o-llama: A lightweight yet powerful web interface for Ollama with markdown rendering, syntax highlighting, and intelligent conversation management. Zero external dependencies, perfect for privacy-focused local AI development.
Updated Jun 16, 2025 - Shell
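Web front ends like this one typically sit on top of Ollama's local HTTP API. Below is a minimal sketch of that call in Python, assuming an Ollama server on its default port (11434) with a model such as llama3 already pulled; the function and prompt are illustrative, not code from chat-o-llama. The standard library's urllib keeps the sketch dependency-free, in the spirit of the repo's zero-dependency claim.

```python
import json
import urllib.request

# Ollama's local generate endpoint; assumes a server running on the default port.
OLLAMA_URL = "http://localhost:11434/api/generate"

def ask_ollama(prompt: str, model: str = "llama3") -> str:
    """Send one prompt to a local Ollama server and return the full response text."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

print(ask_ollama("Explain CPU-only inference in one sentence."))
```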
An LLM-based content moderator: a Firefox extension that blocks webpages unrelated to work, based on page title and URL. Local LLMs via Ollama and LangChain ensure your browsing history never leaves your device, for complete privacy; Google Gemini is also supported.
Updated Dec 12, 2024 - Python
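The core moderation step is a small classification prompt against a local model. Here is a hedged sketch using the langchain-ollama package; the prompt, model name, and is_work_related helper are assumptions for illustration, not the extension's actual code.

```python
from langchain_ollama import ChatOllama  # pip install langchain-ollama

# Local model served by Ollama, so the page title/URL never leave the machine.
llm = ChatOllama(model="llama3", temperature=0)

def is_work_related(title: str, url: str) -> bool:
    """Ask the local LLM to classify a page; hypothetical prompt, not the extension's own."""
    prompt = (
        "You are a strict content moderator. Answer YES or NO only.\n"
        f"Is this page work-related?\nTitle: {title}\nURL: {url}"
    )
    reply = llm.invoke(prompt).content
    return reply.strip().upper().startswith("YES")

print(is_work_related("Top 10 cat videos", "https://example.com/cats"))
```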
Image classification with on-device inference, built with Flutter; the AI model runs on the mobile CPU.
Updated Jan 29, 2025 - Dart
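The repository itself is Flutter/Dart, but the underlying CPU-only inference pattern matches TensorFlow Lite's interpreter, sketched here in Python with a placeholder model file rather than the repo's own model.

```python
import numpy as np
import tensorflow as tf

# Load a TFLite model; "model.tflite" is a placeholder path, not from the repo.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Dummy image batch shaped to the model's input; real code would preprocess a photo.
image = np.zeros(inp["shape"], dtype=inp["dtype"])
interpreter.set_tensor(inp["index"], image)
interpreter.invoke()  # runs on the CPU by default, with no accelerator delegate
scores = interpreter.get_tensor(out["index"])
print("predicted class:", int(np.argmax(scores)))
```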
A new one-shot face swap approach for image and video domains, in a version tailored to run on CPU.
Updated Aug 20, 2024 - Python
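Tailoring a model like this to CPU mostly means pinning device placement and running inference without gradients. A generic PyTorch sketch of that idea follows, with a placeholder network standing in for the actual face-swap model.

```python
import torch
import torch.nn as nn

device = torch.device("cpu")  # pin everything to CPU, even if a GPU is present

# Placeholder network standing in for the face-swap model.
model = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU()).to(device)
model.eval()

frame = torch.rand(1, 3, 256, 256, device=device)  # dummy video frame
with torch.no_grad():  # inference only: no gradients, lower memory use
    output = model(frame)
print(output.shape)
```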
Face detection service with fast inference using a nano model.
Updated Jan 26, 2025 - Python
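Fast CPU inference with a small model is commonly served through ONNX Runtime's CPU execution provider. A sketch under that assumption, with a placeholder model file and input shape, not this service's actual code:

```python
import numpy as np
import onnxruntime as ort  # pip install onnxruntime (CPU-only build)

# "face_nano.onnx" is a placeholder; the provider list pins inference to the CPU.
session = ort.InferenceSession("face_nano.onnx", providers=["CPUExecutionProvider"])

input_name = session.get_inputs()[0].name
dummy = np.random.rand(1, 3, 320, 320).astype(np.float32)  # assumed input shape

detections = session.run(None, {input_name: dummy})
print("outputs:", [d.shape for d in detections])
```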
Chat-O-Llama is a user-friendly web interface for managing conversations with Ollama, featuring persistent chat history. Easily set up and start your chat sessions with just a few commands. 🐙💻
Updated Jun 20, 2025 - HTML