# Kafka Schema Registry Skill
Scan a project to identify Kafka applications, extract schemas, generate Terraform for Schema Registry registration, and produce a comprehensive analysis report.
## When to Use
Invoke this skill when:
- A user asks to analyze a project for Kafka usage in order to add event schemas or integrate Schema Registry
- A user wants to extract schemas from Kafka producers
- A user wants Terraform to register schemas to Schema Registry
- A user wants to audit Kafka producer/consumer configurations
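As a sketch of what schema extraction might produce, here is a hypothetical Avro schema for a customer-created event. The record and field names are illustrative, and the `confluent:tags` field property follows the convention Confluent Schema Registry data contracts use for tagging sensitive fields; the skill's actual tagging format may differ.

```json
{
  "type": "record",
  "name": "CustomerCreated",
  "namespace": "com.example.events",
  "fields": [
    { "name": "customer_id", "type": "string" },
    { "name": "email", "type": "string", "confluent:tags": ["PII"] },
    { "name": "created_at", "type": { "type": "long", "logicalType": "timestamp-millis" } }
  ]
}
```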
## Deliverables
This skill produces three outputs in the target project:

- `schema-report.md` — full analysis report with findings, risks, and upgrade recommendations
- `schemas/` — extracted schema files (Avro, JSON Schema, Protobuf) with PII tagging
- `terraform/` — Terraform configs using the Confluent provider to register schemas
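A generated Terraform config registers each extracted schema under a subject via the Confluent provider's `confluent_schema` resource. The sketch below shows the general shape; the subject name, schema file path, and variable names are assumptions for illustration, not fixed outputs of the skill.

```hcl
# Hypothetical example: register an extracted Avro schema under the
# "orders-value" subject (TopicNameStrategy for the "orders" topic).
resource "confluent_schema" "order_created" {
  schema_registry_cluster {
    id = data.confluent_schema_registry_cluster.main.id
  }
  rest_endpoint = data.confluent_schema_registry_cluster.main.rest_endpoint

  subject_name = "orders-value"
  format       = "AVRO"
  schema       = file("${path.module}/../schemas/order_created.avsc")

  credentials {
    key    = var.schema_registry_api_key
    secret = var.schema_registry_api_secret
  }
}
```

Credentials are passed in as variables so the generated config can be applied against any Schema Registry cluster without editing the resource blocks.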