# Ollama Local Inference

Run LLMs locally for cost savings, privacy, and offline development.

## Quick Start

```bash
# Install Ollama
curl -fsSL https://ollama.ai/install.sh | sh

# Pull models
ollama pull deepseek-r1:70b      # Reasoning (GPT-4 level)
ollama pull qwen2.5-coder:32b    # Coding
ollama pull nomic-embed-text     # Embeddings

# Start server
ollama serve
```
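Once `ollama serve` is running, it exposes an HTTP API on `localhost:11434`. A minimal sketch of calling the `/api/generate` endpoint from Python's standard library (the model name is whichever model you pulled above; `build_request` is a hypothetical helper, not part of Ollama):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint


def build_request(model: str, prompt: str) -> dict:
    # Payload shape for Ollama's /api/generate endpoint.
    # stream=False returns a single JSON object instead of streamed chunks.
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str) -> str:
    # POST the JSON payload and return the model's completion text.
    payload = json.dumps(build_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Example (requires `ollama serve` running and the model pulled):
# print(generate("qwen2.5-coder:32b", "Write a hello world in Go"))
```

Setting `"stream": False` is the simplest way to get a single response; leaving streaming on (the default) returns newline-delimited JSON chunks that you would need to accumulate yourself.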