Ollama

Ollama makes running LLMs locally as easy as `docker run`. Updates in 2025 added Windows and AMD GPU support, multimodal input, and tool calling.
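A minimal terminal session looks like this (the model name `llama3` is an example; any model from the Ollama library works):

```shell
# Download a model from the Ollama library
ollama pull llama3

# Start an interactive chat in the terminal
ollama run llama3

# One-shot prompt instead of an interactive session
ollama run llama3 "Explain mutexes in one paragraph."

# Show models installed locally
ollama list
```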

When to Use

  • Local Development: Coding without Wi-Fi access or API costs.
  • Privacy: Processing sensitive documents on-device.
  • Integration: Works with LangChain, LlamaIndex, and Obsidian natively.

Core Concepts

Modelfile

A Dockerfile-like text file that defines a custom model: a base model plus a system prompt and parameters.
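A sketch of a Modelfile that layers a system prompt on a base model (the base model `llama3`, the parameter value, and the prompt text are illustrative assumptions):

```
# Modelfile — start from a base model
FROM llama3

# Sampling parameter baked into the custom model
PARAMETER temperature 0.7

# Persistent system prompt for every session
SYSTEM "You are a concise code reviewer. Answer in bullet points."
```

Build it with `ollama create reviewer -f Modelfile`, then chat with `ollama run reviewer`.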

Installs: 3 · GitHub stars: 7 · First seen: Feb 10, 2026