web-search -- DuckDuckGo Search via ddgr
ddgr is a command-line utility that searches DuckDuckGo from the terminal. It's privacy-aware (no user data collection, Do Not Track enabled by default) and requires only Python 3.8+.
Installation
If ddgr is not found on the system, install it before searching.
macOS (Homebrew):
brew install ddgr
pip (cross-platform):
pip3 install ddgr
To check whether it is installed:
ddgr --version
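Once installed, a search can be run non-interactively. A minimal sketch using ddgr's documented flags (`--np` disables the interactive prompt, `--json` prints results as JSON, `-n` limits the result count; the query string is an example):

```shell
# One-shot search: print the first 5 results as JSON and exit
ddgr --np --json -n 5 "duckduckgo command line search"
```

Run `ddgr --help` for the full list of options.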