research-yt
LLM-driven research over a YouTube channel. The skill scripts handle deterministic I/O (listing, fetching, transcribing); the synthesis is done by you (Claude) reading the script outputs.
Inputs (collect before starting)
Always ask the user for:
- Channel(s) — one or more handles, URLs, or @names (e.g. @veritasium, @3blue1brown). Multiple channels are encouraged when the user wants comparison or broader coverage.
- Topic — research focus (e.g. "quantum computing"). Used as the topic-file slug.
- Specific video (optional) — single URL/ID to scope to one video instead of the channel(s).
- Output format — markdown (default) or PDF, asked at runtime.
- How many videos per channel to consider — defaults to the newest 50 per channel; ask if the user wants more or fewer. Be explicit that the limit is per-channel so the user can budget run time accordingly.
Don't guess any of these. If the user names multiple channels but not the topic (or vice versa), ask before running anything — multi-channel runs cost N× the metadata fetch time.
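The channel and topic inputs above need light normalization before any scripts run. A minimal sketch of that step — the URL pattern and slug scheme here are assumptions for illustration, not the skill's actual scripts:

```python
import re

def channel_videos_url(channel: str) -> str:
    """Normalize a handle or @name to its /videos URL (assumed pattern)."""
    handle = channel.lstrip("@")
    return f"https://www.youtube.com/@{handle}/videos"

def topic_slug(topic: str) -> str:
    """Hypothetical slug scheme: lowercase, non-alphanumeric runs become '-'."""
    return re.sub(r"[^a-z0-9]+", "-", topic.lower()).strip("-")

print(channel_videos_url("@veritasium"))  # https://www.youtube.com/@veritasium/videos
print(topic_slug("quantum computing"))    # quantum-computing
```

Full channel URLs pasted by the user would also need handling in practice; the sketch only covers handles and @names.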
Workspace + artifact layout
Two distinct things live in the current working directory: