RLAMA - Local RAG System

RLAMA (Retrieval-Augmented Language Model Adapter) provides fully local, offline RAG for semantic search over your documents.

When to Use This Skill

  • Building knowledge bases from local documents
  • Searching personal notes, research papers, or code documentation
  • Document-based Q&A without sending data to the cloud
  • Indexing project documentation for quick semantic lookup
  • Creating searchable archives of PDFs, markdown, or code files

Prerequisites

RLAMA requires a local Ollama instance, which it uses for both embeddings and text generation:

# Verify Ollama is running
ollama list
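
Quick Start

With Ollama running, a typical workflow looks like the sketch below. The command forms follow the rlama CLI (create a RAG, query it, list, delete); the model name, RAG name, and folder path are placeholders — substitute your own, and confirm the exact syntax with rlama --help.

```shell
# Create a RAG named "docs" from a folder, embedding with a local Ollama model
rlama rag llama3 docs ./project-docs

# Open an interactive Q&A session against the indexed documents
rlama run docs

# List the RAG systems on this machine
rlama list

# Remove a RAG when it is no longer needed
rlama delete docs
```

Indexing happens once at creation time; subsequent queries run entirely against the local index, so no document content leaves the machine.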