Ollama

⭐ 168.6k stars
Repository: ollama/ollama
Category: infra
Difficulty: beginner
Status: active
Tags: local, serving, open-source, llama, mistral
Website: https://ollama.com

Review

The easiest way to run open-source LLMs locally. One command to download and run Llama, Mistral, Qwen, and more. Best for local development, privacy-sensitive use cases, and experimentation.
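A minimal sketch of the one-command workflow, assuming a default install and model tags from the Ollama model library (llama3 and mistral are used as examples):

    # Download and start an interactive chat with a model
    # (the model is pulled automatically on first run)
    ollama run llama3

    # Pull a model ahead of time without starting a session
    ollama pull mistral

    # List the models available locally
    ollama list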

Use Cases

  • local-development
  • serving
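For the serving use case, a minimal sketch of calling the local REST API, assuming Ollama is running on its default port 11434 and the mistral model has already been pulled:

    # Request a completion from the local HTTP server
    curl http://localhost:11434/api/generate -d '{
      "model": "mistral",
      "prompt": "Why is the sky blue?"
    }'

The response streams back as newline-delimited JSON objects; set "stream": false in the request body to receive a single JSON response instead.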
