Ollama Integration

Integrate Ollama for local LLM inference in TypeScript applications. Ollama provides a simple API for running language models locally.
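As a concrete starting point, here is a minimal sketch of calling Ollama's `/api/generate` endpoint from TypeScript with plain `fetch` (Node 18+). It assumes Ollama is running locally on its default port, 11434; the model name `llama3.2` in the usage note is only an example and must be pulled first.

```typescript
// Minimal sketch: non-streaming text generation against a local Ollama server.
// Assumes Ollama is listening on its default address, http://localhost:11434.

interface GenerateRequest {
  model: string;
  prompt: string;
  stream: boolean;
}

interface GenerateResponse {
  response: string;
  done: boolean;
}

function buildGenerateRequest(model: string, prompt: string): GenerateRequest {
  // stream: false makes Ollama return one JSON object
  // instead of newline-delimited streaming chunks.
  return { model, prompt, stream: false };
}

async function generate(model: string, prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGenerateRequest(model, prompt)),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = (await res.json()) as GenerateResponse;
  return data.response;
}

// Usage (requires a pulled model, e.g. `ollama pull llama3.2`):
// const text = await generate("llama3.2", "Why is the sky blue?");
```

Keeping the request builder separate from the network call makes the payload easy to unit-test without a running server.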

When to Apply

Use this skill when:

  • Running LLMs locally without cloud APIs
  • Generating text or embeddings with Ollama
  • Building AI features that need to work offline
  • Implementing RAG pipelines with local models
  • Testing AI applications without API costs
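For the RAG and embeddings use cases above, a sketch of generating embeddings via Ollama's `/api/embeddings` endpoint and comparing them with cosine similarity may help. Assumptions: Ollama is running locally on port 11434, and an embedding-capable model (e.g. `nomic-embed-text`, named here only as an example) has been pulled.

```typescript
// Sketch: embed text with a local Ollama model and rank by cosine similarity,
// the core retrieval step of a simple RAG pipeline.

async function embed(model: string, text: string): Promise<number[]> {
  const res = await fetch("http://localhost:11434/api/embeddings", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, prompt: text }),
  });
  if (!res.ok) throw new Error(`Embedding request failed: ${res.status}`);
  const data = (await res.json()) as { embedding: number[] };
  return data.embedding;
}

function cosineSimilarity(a: number[], b: number[]): number {
  // dot(a, b) / (|a| * |b|); vectors must have the same length.
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Usage: embed a query and each document, then keep the documents
// with the highest cosineSimilarity to the query embedding.
// const q = await embed("nomic-embed-text", "local LLM inference");
```

Cosine similarity is a common default for embedding comparison because it ignores vector magnitude and compares direction only.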

Prerequisites

Install Ollama

# macOS (via Homebrew; a direct download is also available from ollama.com)
brew install ollama

# Linux (official install script)
curl -fsSL https://ollama.com/install.sh | sh