local-llm-privacy


Local LLM Privacy Skill

Handle AI tasks that involve private or sensitive data by routing them to a local Ollama model instead of a cloud API. This protects user data because nothing is ever sent to an external service.
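As a concrete sketch, a task can be routed through Ollama's local REST API on its default port, 11434. The model name `llama3.2` and the prompt below are illustrative assumptions, not part of the skill; any locally pulled model works.

```shell
# Route a prompt to a local Ollama model; nothing leaves the machine.
# Assumes the Ollama server is running on its default port (11434)
# and a model has been pulled ("llama3.2" here is just an example).
if curl -sf http://localhost:11434/api/tags >/dev/null 2>&1; then
  result=$(curl -s http://localhost:11434/api/generate \
    -d '{"model": "llama3.2", "prompt": "Redact all names from: ...", "stream": false}')
else
  result="Ollama server not reachable; cannot process locally"
fi
echo "$result"
```

If the server is unavailable, the fallback message lets the caller decline the task rather than silently fall back to a cloud model.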


Step 1 — Confirm the Privacy Requirement

Before doing anything, acknowledge why local processing matters here. Say something like:

"Since this data is sensitive, I'll try to handle it using a local model on your machine so nothing gets sent to the cloud."

Then proceed to Step 2.


Step 2 — Detect Ollama and Available Models

Run bash commands to check that the Ollama CLI is installed, that the server is running, and which models are available locally.
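A minimal sketch of what these checks might look like, assuming Ollama's default local port (11434):

```shell
# Check whether the ollama CLI is on PATH
if command -v ollama >/dev/null 2>&1; then
  cli_status="installed"
else
  cli_status="missing"
fi
echo "ollama CLI: $cli_status"

# Check whether the Ollama server is running (default port 11434)
# and, if so, list the locally available models
if curl -sf http://localhost:11434/api/tags >/dev/null 2>&1; then
  echo "server: running"
  ollama list
else
  echo "server: not reachable"
fi
```

If the CLI is missing or the server is not reachable, report that to the user instead of falling back to a cloud model.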

Installs: 1 · GitHub Stars: 43 · First Seen: Mar 14, 2026