Hallucinated Packages Anti-Pattern

Severity: Critical

Summary

AI models hallucinate non-existent software packages at rates of 5-21%. Attackers exploit this through slopsquatting: registering the hallucinated names on public registries and publishing malicious code under them. Developers who install AI-suggested packages without verification execute attacker code, leading to malware execution, credential theft, and system compromise. This AI-specific supply chain attack exploits the trust gap between AI suggestions and package verification.

The Anti-Pattern

Never install AI-suggested packages without verifying existence, legitimacy, and reputation in official registries.

BAD Code Example

# An AI model generates the following code snippet and instruction:
# "To handle advanced image processing, you should use the `numpy-magic` library.
# First, install it using pip:"
#
#     pip install numpy-magic
#
# The developer runs the command as-is. `numpy-magic` is hallucinated; if an
# attacker has registered that name (slopsquatting), the install executes the
# attacker's package code with the developer's privileges.