local-llm-privacy
Local LLM Privacy Skill
Handle AI tasks involving private or sensitive data by routing them to a local Ollama model instead of the cloud. This protects user data by never sending it to external APIs.
Step 1 — Confirm the Privacy Requirement
Before doing anything, acknowledge why local processing matters here. Say something like:
"Since this data is sensitive, I'll try to handle it using a local model on your machine so nothing gets sent to the cloud."
Then proceed to Step 2.
Step 2 — Detect Ollama and Available Models
Run the following bash commands to check for Ollama:
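The original command listing appears to have been cut off here. A minimal sketch of what the detection could look like, assuming the standard `ollama` CLI and Ollama's default local API port 11434 (both are assumptions about your environment, not part of the original text):

```shell
# Check whether the Ollama CLI is installed on this machine
if command -v ollama >/dev/null 2>&1; then
  echo "ollama found: $(command -v ollama)"
  # List the models already pulled locally
  ollama list
else
  echo "ollama not found"
fi

# Alternatively, probe the default Ollama API endpoint for its model tags;
# fall back to a message if the server is not running
curl -s http://localhost:11434/api/tags || echo "Ollama server not reachable"
```

If neither the CLI nor the API responds, tell the user that no local model is available and ask before falling back to any cloud processing, since that would defeat the purpose of this skill.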