technological-pedagogical-content-knowledge-developer
Technological Pedagogical Content Knowledge Developer
What This Skill Does
Takes a description of what a teacher is teaching, the technology they are integrating, and their background, then diagnoses their technological pedagogical content knowledge gaps and produces a development plan. TPACK (Mishra & Koehler, 2006) extends Shulman's PCK framework to account for technology: just as knowing a subject and knowing how to teach it are distinct capabilities, knowing how a technology works and knowing how to use it to teach a specific subject well to specific students is a third, distinct capability. A teacher who is technically proficient with an AI tool may still not know whether that tool's representation of historical causation is epistemically accurate, or whether using it for student writing undermines the metacognitive development the writing task was designed to produce. This skill addresses those intersections.

It is most powerful when run after the pedagogical-content-knowledge-developer — TPACK gaps are harder to diagnose without first understanding PCK gaps, because the technology question is always "does this tool help or hinder the teaching of this specific content to these specific students?" and that question requires PCK to answer.

The skill includes specific guidance for AI tools, which present distinct challenges: AI outputs may be fluent but epistemically incorrect, AI assistance may create dependency rather than capability, and students using AI for thinking tasks may perform the task without doing the thinking the task was designed to develop. These are TPACK questions, not just technology questions, and they require the teacher to understand both the content and the pedagogy to navigate well.
Evidence Foundation
Mishra & Koehler (2006) proposed TPACK as a framework for understanding the knowledge teachers need to integrate technology effectively, building on Shulman's (1986) PCK. They identified seven knowledge domains arising from content (C), pedagogy (P), and technology (T) and their intersections: the three base domains — content knowledge, pedagogical knowledge, and technological knowledge — and the four intersections — PCK, TCK (technological content knowledge), TPK (technological pedagogical knowledge), and the full TPACK (the intersection of all three). The critical insight is that technology integration is not a generic skill: the right use of a simulation for teaching photosynthesis requires different knowledge than the right use of a simulation for teaching market dynamics, even if the simulation platform is identical. Technology integration that is content-blind — "use this tool for engagement" — is pedagogically empty.
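The seven-domain structure above can be made concrete as a small data-structure sketch. This is purely illustrative — the encoding, dictionary name, and helper function are our own invention, not an official representation from Mishra & Koehler (2006):

```python
# Illustrative sketch (hypothetical encoding): each TPACK construct
# modelled as the set of base knowledge types it combines.
TPACK_DOMAINS = {
    "CK":    {"content"},
    "PK":    {"pedagogy"},
    "TK":    {"technology"},
    "PCK":   {"pedagogy", "content"},
    "TCK":   {"technology", "content"},
    "TPK":   {"technology", "pedagogy"},
    "TPACK": {"technology", "pedagogy", "content"},
}

def intersections_of(size: int) -> list[str]:
    """Return the construct names that combine exactly `size` base domains."""
    return [name for name, parts in TPACK_DOMAINS.items() if len(parts) == size]
```

The point the encoding makes visible is the framework's claim: the two- and three-way combinations are distinct domains in their own right, not reducible to their components — `intersections_of(2)` yields the three pairwise constructs and `intersections_of(3)` yields only the full TPACK.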
Koehler & Mishra (2009) extended the framework, arguing that effective technology integration requires understanding the "wicked problem" of how technology, content, and pedagogy interact in specific contexts. There are no general solutions — only specific solutions for specific intersections. A teacher who has learned to use an AI tool effectively for writing scaffolding in English does not automatically know how to use the same tool for scientific explanation in biology, because the content demands, the epistemic standards, and the learning goals are different.
Voogt et al.'s (2013) review found consistent evidence that TPACK is a distinct and teachable construct, but noted significant variation in how it is measured and developed across studies. Chai, Koh & Tsai (2013) reviewed quantitative TPACK measures and found that self-report instruments often overestimate teacher TPACK — teachers rate their technology integration confidence higher than their actual ability to make content-specific technology decisions in practice. This suggests that TPACK development requires practice-based feedback, not just self-assessment.
Angeli & Valanides (2009) argued that TPACK should be treated as a unique body of knowledge that is more than the sum of its parts — not just the intersection of three separate domains but a qualitatively distinct form of knowing that emerges from experience with specific technology-content-pedagogy combinations. This reinforces the need for topic-specific TPACK development rather than generic technology training.
Hattie's (2009) meta-analysis found that technology in education has highly variable effects — effect sizes range from strongly negative to strongly positive depending on implementation. The meta-analytic average (d = 0.31) is modest, but the variation is enormous. This is precisely the TPACK insight: it is not the technology that determines outcomes but the teacher's knowledge of how to deploy it for specific content with specific students. A technology used well for the right content at the right time produces strong learning gains; the same technology used without TPACK may produce no gain or active harm.
Selwyn (2016) provides a necessary critical counterweight to technology enthusiasm in education. Many claims about educational technology are made by vendors rather than researchers, and the evidence base for specific tools is often thin or conflicted. Selwyn argues that the burden of proof that a technology improves learning for this content with these students should sit with the teacher using it, not with the marketing literature. This critical stance is part of TPACK: the disposition to ask "does this actually help my students learn this specific content?" rather than assuming technology is beneficial by default.