# Ollama

⭐ 168.6k stars

| Field      | Value                                       |
| ---------- | ------------------------------------------- |
| Repository | ollama/ollama                               |
| Category   | infra                                       |
| Difficulty | beginner                                    |
| Status     | active                                      |
| Tags       | local, serving, open-source, llama, mistral |
| Website    | https://ollama.com                          |
## Review

One of the easiest ways to run open-source LLMs locally: a single command downloads and runs Llama, Mistral, Qwen, and other models. Best suited to local development, privacy-sensitive workloads, and quick experimentation.
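Beyond the CLI (`ollama run llama3` pulls and starts a model in one command), the local Ollama server also exposes a REST API on port 11434. A minimal sketch of calling its `/api/generate` endpoint, assuming the daemon is running and the model name (`llama3` here) has already been pulled:

```python
import json
import urllib.request

# Ollama's local server listens on port 11434 by default.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> dict:
    # "stream": False requests a single JSON response instead of
    # newline-delimited streaming chunks.
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    # POSTs the request to the locally running Ollama server and
    # returns the generated text from the "response" field.
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_request(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

payload = build_request("llama3", "Why is the sky blue?")
```

Call `generate("llama3", "...")` while `ollama serve` is running; because everything stays on localhost, no prompt data leaves the machine, which is the point for privacy-sensitive use.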
## Use Cases
- local-development
- serving