# AI Bias Assessment for Special Category Data

## Overview
AI systems can amplify, perpetuate, or introduce bias against groups defined by the GDPR Art. 9 special categories (racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, genetic data, biometric data, health data, and sex life or sexual orientation) and by EU equality law (gender, age, disability). AI Act Art. 10 requires data governance practices that address bias in training, validation, and testing data, while Art. 5 prohibits AI-based social scoring. This skill provides a methodology for detecting, measuring, and mitigating bias in AI systems that process or infer special category data, with documentation that meets both GDPR and AI Act obligations.
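As a minimal sketch of the "measuring" step, the snippet below computes per-group selection rates and a disparate impact ratio (demographic parity). All names and data here are hypothetical illustrations, not part of the skill itself; a ratio below the commonly cited four-fifths (0.8) threshold is one conventional trigger for closer review.

```python
from collections import defaultdict

def selection_rates(outcomes):
    """Per-group positive-outcome rates from (group, selected) pairs."""
    pos, tot = defaultdict(int), defaultdict(int)
    for group, selected in outcomes:
        tot[group] += 1
        pos[group] += int(selected)
    return {g: pos[g] / tot[g] for g in tot}

def disparate_impact_ratio(rates, reference):
    """Each group's selection rate relative to the reference group's rate."""
    ref = rates[reference]
    return {g: r / ref for g, r in rates.items()}

# Hypothetical CV-screening outcomes: (group label, shortlisted?)
outcomes = (
    [("A", True)] * 60 + [("A", False)] * 40
    + [("B", True)] * 30 + [("B", False)] * 70
)
rates = selection_rates(outcomes)            # {"A": 0.6, "B": 0.3}
ratios = disparate_impact_ratio(rates, "A")  # B/A = 0.5, below the 0.8 threshold
```

Demographic parity is only one fairness metric; depending on the use case, equalised odds or calibration may be the appropriate measure, and the choice should itself be documented.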
## Art. 9 Special Categories and AI Bias

### Direct Processing of Special Category Data
When AI systems directly process Art. 9 data:
| Category | AI Bias Risk | Example |
|---|---|---|
| Racial or ethnic origin | Discrimination in hiring, credit, policing | CV screening penalising names associated with ethnic minorities |
| Political opinions | Political profiling, content suppression | News recommendation amplifying or suppressing political viewpoints |
| Religious beliefs | Service denial, discriminatory targeting | Insurance pricing varying by religious affiliation |
| Trade union membership | Employment discrimination | Performance scoring penalising union activity |
| Genetic data | Genetic discrimination in insurance/employment | Health insurance pricing based on genetic predisposition |
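Several of the examples above involve proxy features (e.g. names standing in for ethnic origin) rather than the Art. 9 attribute itself. A crude first check, sketched below with hypothetical field names and data, is to measure how unevenly a candidate proxy feature is distributed across protected groups: a large gap suggests the feature leaks the attribute and can drive indirect discrimination.

```python
from collections import defaultdict

def proxy_leakage(rows, proxy_key, group_key):
    """Share of records where the proxy feature is set, per protected group.

    A large gap between groups suggests the proxy encodes the protected
    attribute and should be treated as special category data for bias review.
    """
    hit, tot = defaultdict(int), defaultdict(int)
    for row in rows:
        g = row[group_key]
        tot[g] += 1
        hit[g] += int(bool(row[proxy_key]))
    return {g: hit[g] / tot[g] for g in tot}

# Hypothetical audit sample linking a name-derived feature to ethnic origin
rows = (
    [{"minority_assoc_name": True, "ethnic_origin": "B"}] * 8
    + [{"minority_assoc_name": False, "ethnic_origin": "B"}] * 2
    + [{"minority_assoc_name": True, "ethnic_origin": "A"}] * 1
    + [{"minority_assoc_name": False, "ethnic_origin": "A"}] * 9
)
shares = proxy_leakage(rows, "minority_assoc_name", "ethnic_origin")
# shares["B"] = 0.8 vs shares["A"] = 0.1: the name feature tracks origin
```

In practice a proper audit would use an association measure such as mutual information or Cramér's V and control for sample size, but even this simple rate comparison is enough to flag obvious proxies for documentation.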
## Related skills