# crawl4ai
High-performance web crawler with intelligent chunking. Crawls web pages and extracts content as markdown using LLM-based skeleton planning.
## Commands
### crawl_url (alias: webCrawl)
Crawl a web page with native workflow execution and LLM-based intelligent chunking.
Parameters:
| Parameter | Type | Default | Description |
|---|---|---|---|
| url | str | - | Target URL to crawl (required) |
| action | str | "smart" | Action mode: "smart", "skeleton", "crawl" |
| fit_markdown | bool | true | Clean and simplify markdown output |
| max_depth | int | 0 | Maximum crawling depth (0 = single page) |
| return_skeleton | bool | false | Also return document skeleton (TOC) |
| chunk_indices | list[int] | - | List of section indices to extract |
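The parameters above map directly onto the tool-call arguments. Below is a minimal sketch of assembling a `crawl_url` request payload; the `build_crawl_request` helper and the `{"name": ..., "arguments": ...}` envelope are illustrative assumptions, while the parameter names and defaults come from the table:

```python
import json

def build_crawl_request(url, action="smart", fit_markdown=True,
                        max_depth=0, return_skeleton=False,
                        chunk_indices=None):
    """Assemble a crawl_url tool-call payload using the documented defaults."""
    args = {
        "url": url,
        "action": action,
        "fit_markdown": fit_markdown,
        "max_depth": max_depth,
        "return_skeleton": return_skeleton,
    }
    # chunk_indices is optional: omit it entirely unless specific
    # sections are requested.
    if chunk_indices is not None:
        args["chunk_indices"] = chunk_indices
    return {"name": "crawl_url", "arguments": args}

# First pass: ask for the document skeleton (TOC) only.
req = build_crawl_request("https://example.com/docs",
                          action="skeleton", return_skeleton=True)
print(json.dumps(req, indent=2))

# Second pass: extract only the sections of interest by index.
req2 = build_crawl_request("https://example.com/docs",
                           chunk_indices=[0, 2])
print(json.dumps(req2, indent=2))
```

This two-pass pattern (skeleton first, then targeted `chunk_indices`) keeps responses small on large pages instead of returning the full markdown at once.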
## More from tao3k/omni-dev-fusion

- **software_engineering**: Use when analyzing project architecture, exploring codebase structure, understanding system design, reviewing code patterns, or navigating modular components.
- **python_engineering**: Use when linting Python code, formatting with ruff/black, running pytest tests, type checking with pyright, or modernizing Python 3.12+ standards.
- **code_tools**: Use when searching code by structure or meaning, analyzing code patterns, finding class or function definitions, or exploring codebase architecture.
- **rust_engineering**: Use when analyzing Rust project structure, managing Cargo dependencies, building and testing Rust projects, or generating Rust code.
- **git**: Use when committing code, managing branches, pushing to remote, creating pull requests, or performing version control operations. Conforms to docs/reference/skill-routing-value-standard.md.
- **researcher**: Use when analyzing repositories, conducting deep research on codebases, performing architecture reviews, or exploring large projects from a git URL.