algo-seo-crawl

Web Crawler

Overview

A web crawler systematically traverses web pages by discovering URLs, fetching content, parsing HTML for new links, and storing results. The frontier of pending URLs is managed with BFS or a priority queue. Performance is I/O-bound, typically limited by politeness constraints (rate limits, robots.txt) rather than compute.

When to Use

Trigger conditions:

  • Building a site audit tool to discover all pages and their link structure
  • Collecting structured data from websites at scale
  • Mapping site architecture for SEO analysis

When NOT to use:

  • When you need data from a single API endpoint (use HTTP client directly)
  • When a sitemap.xml provides all needed URLs (parse sitemap instead)
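When a sitemap already enumerates every page, extracting its URLs is far cheaper than crawling. A minimal sketch using only the standard library, assuming the sitemap follows the standard sitemaps.org schema (the `sitemap_urls` helper name is illustrative, not part of this skill):

```python
import xml.etree.ElementTree as ET

# Standard sitemap XML namespace, per the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return the <loc> URLs listed in a sitemap.xml document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")]
```

Note this reads a single sitemap document; a real site may publish a sitemap index that points to several child sitemaps, each of which would need the same treatment.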

Algorithm
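
The BFS traversal described in the overview can be sketched as follows. This is a minimal illustration, not the skill's implementation: `fetch` is injected (so the frontier logic is testable without network I/O), link extraction uses the standard-library HTML parser, and politeness controls (rate limiting, robots.txt) are omitted:

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urldefrag

class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, fetch, max_pages=100):
    """BFS crawl from start_url; fetch(url) must return an HTML string.

    Returns {url: [outgoing links]} for every page visited.
    """
    frontier = deque([start_url])   # FIFO queue -> breadth-first order
    seen = {start_url}              # dedup before enqueueing
    pages = {}                      # url -> discovered outgoing links
    while frontier and len(pages) < max_pages:
        url = frontier.popleft()
        parser = LinkExtractor()
        parser.feed(fetch(url))
        links = []
        for href in parser.links:
            # Resolve relative links and drop #fragments before dedup.
            absolute, _ = urldefrag(urljoin(url, href))
            links.append(absolute)
            if absolute not in seen:
                seen.add(absolute)
                frontier.append(absolute)
        pages[url] = links
    return pages
```

Swapping the `deque` for a priority queue keyed on page importance turns the same loop into a priority-based frontier.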
