Reimagining AI in Social Studies: Four Educator Archetypes and the Path Forward
Andy Szeto
“The notion that having a laptop computer or hand-held device for every student will make her or him smarter, or even more knowledgeable is pedagogically vapid.”
Michael Fullan (2011, p. 15)
Michael Fullan’s 2011 paper Choosing the Wrong Drivers for Whole System Reform offered a powerful caution that still rings true today. Schools often rush to adopt new technology without the deeper instructional shifts needed to make it meaningful. Early in my teaching career, I saw this firsthand with the rollout of interactive whiteboards. The promise was exciting and the investment was significant, but the implementation fell short. Without the right training, support, and connection to instructional goals, many of those boards became little more than digital display tools. They were not used the way they were intended, and the opportunity to transform teaching practice was largely missed.
We are at another crossroads. Just as interactive whiteboards once promised transformation but too often delivered status quo, AI now arrives with the potential to reshape how students think, write, and engage with civic life. Fullan reminds us that real, lasting change does not come from devices or tools alone. It comes from building instructional capacity, strengthening relationships, and creating coherent systems. In the age of AI, his warning is more relevant than ever. If we adopt these tools without clear purpose or thoughtful pedagogy, we risk repeating old mistakes with even more powerful technology.
A new chapter: AI in the social studies classroom
In 2025, two major federal initiatives signaled a nationwide commitment to integrating generative artificial intelligence into education and educator development. In April, the White House issued Executive Order 14277, Advancing Artificial Intelligence Education for American Youth, which focused on expanding educator capacity and increasing student access to AI tools (The White House, 2025a). Just months later, it released Winning the Race: America’s AI Action Plan, a strategy outlining more than 90 actions focused on expanding AI education, supporting teacher training, and ensuring equitable integration across learning environments (The White House, 2025b). While neither document names social studies directly, their emphasis on “fostering a culture of innovation and critical thinking” (The White House, 2025a) has clear implications for K–12 social studies classrooms. Guidance from organizations such as the National Council for the Social Studies (NCSS), the International Society for Technology in Education (ISTE), and Common Sense Media reinforces the need for educators to critically evaluate tools, protect student data, and promote responsible use of generative AI. However, national ambitions alone won’t shape daily classroom practice; teachers will. And to support teachers effectively, we must start by understanding where each educator is on the journey. Some are skeptical of AI’s role in civic learning. Others are experimenting with basic tools. A few are already transforming their practice in bold, creative ways.
This article introduces a four-archetype framework (Skeptic, Novice, Designer, and Trailblazer) to capture the diverse ways social studies educators are engaging with AI. Each archetype reflects a distinct mindset and stage of instructional readiness, and each is grounded in practical, student-facing classroom examples designed to support critical thinking, historical inquiry, and civic reasoning in an AI-powered world.
Cautious AI skeptic
“I want students to wrestle with complexity—not rely on shortcuts. AI worries me because it might undercut the deep analysis and civic responsibility we’re trying to teach.”
Skeptics approach AI with deep caution, grounded in a firm belief that students should be thinkers, not just content consumers. They worry that AI tools may undercut historical reasoning, obscure authorship, or dilute opportunities for authentic civic learning. For these educators, AI is not neutral. They raise valid questions about equity, surveillance, and how easily confident-sounding misinformation can circulate unchecked. Their hesitation often draws on research showing how students misinterpret digital content and confuse fluency with accuracy, a concern amplified in recent studies on AI-generated misinformation (Wineburg & Ziv, 2024). Yet even skeptical educators recognize the importance of engaging with these tools critically, so students are not left unprepared.
These activities emphasize critique, caution, and civic responsibility, helping students question AI rather than accept it at face value:
- Facilitate an activity where students fact-check AI-generated historical claims using vetted primary sources.
- Guide students to verify an AI-generated historical claim using lateral reading—opening new tabs to cross-check with trusted sources—and reflect on how polished responses can still be misleading (Wineburg & Ziv, 2024).
- Have students use AI to generate a fake historical image or event description, then analyze it using Common Sense Media’s AI literacy principles to identify signs of manipulation and discuss real-world implications (Common Sense Media, 2025).
- Use ChatGPT’s Study Mode to help students unpack a dense primary source, then lead a discussion critiquing how the AI framed key ideas and what it overlooked (Sawchuk, 2025).
Curious AI novice
“I’ve tested a few AI tools, but I’m still figuring out how to connect them to real learning, especially sourcing, analysis, and classroom discussion.”
Novices are intrigued by AI and willing to try it, but they’re still figuring out where it fits. Their experimentation often centers on one-off tasks, like generating an image for a warm-up or asking ChatGPT to summarize a reading. While eager to explore, they haven’t yet connected AI use to core social studies practices like sourcing, historical inquiry, or civic discourse. According to Hernholm (2025), even teachers who express curiosity about AI still need structured support, especially when it comes to tools, time, and training. As AI for Education (2024) notes, starting with small activities, like brainstorming prompts or using generative tools for warm-ups, helps build confidence without overwhelming teachers new to AI. These early successes lay the foundation for deeper exploration and help novices envision how AI might eventually align with their instructional goals. Structured tools like MagicSchool AI, Claude, Adobe Express, and NotebookLM give these teachers a way to test ideas in real classrooms while building the capacity to move from occasional use to intentional design. When AI is framed as a way to enhance, not replace, core learning goals, novices begin to shift from curiosity to confidence.
These entry points offer low-risk ways to explore AI tools while building confidence and connection to core social studies practices:
- Use AI tools like NotebookLM to reorganize historical sources into thematic clusters, then have students analyze how the AI grouped them and evaluate the accuracy and bias of those groupings (Wasik, 2025).
- Prompt students to use Claude.ai or ChatGPT to generate differing perspectives on a historical event, then evaluate them for bias and omissions.
- Facilitate a role-play simulation using Character.AI, where students question historical figures and fact-check the responses.
- Use Adobe Express to co-create civic posters or infographics with AI-generated draft text, then revise for accuracy and tone.
With the right support, including tools, time, and professional learning, these educators begin moving from curiosity to confidence.
Intentional AI designer
“AI gives us new ways to simulate civic life, reimagine debate, and engage students in building—and challenging—systems of power and justice.”
Designers integrate AI with purpose. They go beyond surface-level use to embed it in thoughtful lessons that support historical reasoning, civic writing, and student discourse. These educators treat AI as a tool to elevate, not replace, student learning: clear about their instructional goals, they maintain instructional control and design experiences where students use AI to revise, question, and deepen their understanding of content. Designers are neither dismissive nor blindly enthusiastic. They see the promise of AI, but they also understand its limits.
Recent research supports this balanced mindset. Clark and van Kessel (2024) found that AI-generated lesson materials often reflect embedded assumptions or miss opportunities for meaningful inquiry. They encourage educators to treat AI as a collaborator that needs to be questioned and shaped, not a neutral source. Similarly, Klein (2025) reported that many AI-generated civics lessons lack depth and fail to promote the kind of student thinking social studies demands. Designers are aware of these limitations. That’s why they stay close to their pedagogical aims and use AI as a tool for design, not a substitute for it.
In the classroom, Designers guide students to use AI purposefully: drafting historical arguments, analyzing civic texts, or refining written responses. They help students question AI outputs and compare them to disciplinary thinking models. They use AI to scaffold participation for multilingual learners or struggling writers, while still expecting students to revise, debate, and cite. In short, Designers make AI useful by keeping it anchored in student learning.
These practices use AI intentionally to deepen historical reasoning, support civic discourse, and elevate student writing:
- Use NotebookLM to create a video overview from source documents, then have students critique its accuracy and revise it to reflect stronger historical thinking (TechCrunch, 2025).
- Use AI to model civic writing, like letters to elected officials or op-eds, followed by analysis of argument strength and tone.
- Support multilingual learners by using AI to generate sentence starters, vocabulary scaffolds, or translated prompts (Szeto, 2024a).
- Ask students to use AI to generate multiple historical perspectives on an event, then evaluate how each aligns with available primary sources and disciplinary thinking (Szeto, 2024b).
Bold AI trailblazer
“AI lets us simulate debates, test civic arguments, and rethink how students engage with the past and present.”
Trailblazers are reimagining what’s possible with AI. They don’t just use tools; they create new experiences where students build, critique, and explore ideas at the intersection of technology and civic life. Their classrooms are laboratories for inquiry, civic action, and reflection. Trailblazers lead boldly but with intention, staying grounded in social studies goals like justice, democracy, and historical thinking.
These educators often lead professional learning, collaborate across content areas, and pilot new strategies. They guide students in building with AI, critiquing its limitations, and using it to examine democracy, memory, and power. They are not reckless with innovation; they’re intentional, equity-focused, and transparent about what AI can and cannot do.
Trailblazers also recognize that students must learn how to ask hard questions of systems, not just generate answers. Projects in their classrooms often blend social studies content with algorithmic thinking, civic action, and ethical reflection. While some of their work pushes the boundaries of what’s typical in a classroom, it remains rooted in the goals of social studies education: inquiry, citizenship, and justice.
These projects invite students to co-create with AI, interrogate systems, and use emerging tools for civic innovation and justice:
- Lead an AI-powered civic simulation where bots draft policy proposals and students must revise or defend them using constitutional principles.
- Guide students to train their own lightweight LLMs on curated primary sources and analyze how outputs differ from general models.
- Have students investigate algorithmic bias or digital redlining using AI-generated maps or predictive tools and connect their findings to environmental justice or civil rights issues.
- Have students use AI and local datasets, such as NYC Open Data, to take informed action by proposing policy solutions to real community issues, aligned to social studies standards.
Supporting all educators on the AI journey: A path forward
While archetypes offer a useful lens, sustainable integration of AI in social studies requires system-level support that recognizes where educators are and helps them move forward with clarity and confidence. Below are five key actions for leaders, curriculum teams, and policymakers to consider:
- Differentiate professional learning by archetype
Not every teacher needs the same workshop. Cautious Skeptics benefit from open dialogue on ethics, misinformation, and surveillance. Novices need hands-on time with tools and low-pressure modeling. Designers thrive when they can co-plan, test, and reflect. Trailblazers need the freedom to innovate and the platforms to share. As Guskey (2014) emphasizes, effective professional learning begins with clear goals and a deep understanding of educator readiness and context. Meeting teachers where they are is the foundation of any successful AI implementation plan.
- Center instruction, not tools
Educators must be encouraged to treat AI not as a flashy add-on, but as a tool for strengthening existing practices. That means aligning AI integration to instructional goals like sourcing, argument writing, and civic reasoning, not engagement alone. As Fullan (2011) emphasized, meaningful change happens when schools prioritize the right drivers, such as capacity building, collaborative work, and instructional improvement, instead of superficial fixes. Technology becomes a distraction when it is disconnected from purpose. Tools must serve pedagogy, not drive it. This also means evaluating AI-generated materials with the same critical lens students are taught to apply to historical sources.
- Leverage Professional Learning Communities (PLCs)
Some of the most powerful shifts in practice emerge through sustained, peer-driven collaboration. Districts and schools can embed AI integration into existing PLC structures by identifying and supporting Designers and Trailblazers as lead learners who model and share instructional strategies. Within these communities, Novices can build confidence through co-planning and reflection, while Skeptics are invited to engage in inquiry without pressure. PLCs foster collective efficacy, promote shared responsibility for innovation, and ensure that professional learning remains rooted in classroom practice.
- Provide tools, time, and trust
Teachers won’t use what they don’t understand or don’t have time to explore. Access to quality AI tools, along with dedicated time to explore them meaningfully, is essential. As Hernholm (2025) reminds us, capacity grows when schools invest not just in technology, but in the people using it.
- Focus on student thinking, not just use
Rather than measuring AI adoption in terms of tool usage, districts should evaluate how it supports disciplinary thinking, civic engagement, and student growth. AI that helps students revise a DBQ, analyze bias, or debate constitutional issues is more impactful than AI used to generate generic content. The goal isn’t AI integration; it’s better thinking.
Final thought: Human teaching still wins the day
Across all four archetypes, whether skeptical, curious, intentional, or trailblazing, one truth holds: AI is only as powerful as the pedagogy behind it. As Michael Fullan (2011) warned more than a decade ago, technology alone doesn’t drive meaningful change. Real impact comes from purposeful design, skilled teaching, and systems that support both.
Used thoughtfully, AI can scaffold reasoning, simplify complex texts, and provide fast, iterative feedback. It can lower the barrier to entry for drafting and help students engage with challenging sources they might otherwise avoid. For multilingual learners and struggling writers, it can act as a helpful drafting partner: not a shortcut, but a springboard.
But the risks are real. Without intentional framing, students may bypass the intellectual heavy lifting that defines social studies. AI can hallucinate facts, misrepresent sources, or mask bias in confident tones. As Dan Meyer (2024) reminds us, AI can do the heavy lifting of generating and organizing, but “we have to help teachers go the last mile.” That last mile is where historical thinking, civic reasoning, and disciplinary literacy live. It’s where students learn to evaluate claims, wrestle with complexity, and build arguments from evidence.
Each archetype contributes to that journey. Skeptics ground us in ethical questions. Novices push us to offer practical supports. Designers model how to integrate tools with intention. Trailblazers show what’s possible when innovation meets purpose.
AI can support great teaching, but it cannot replace it. We are not preparing students to use AI for trivia. We are preparing them to ask hard questions of systems, sources, and society.
That is the heart of social studies.
References
AI for Education. (2024, March 12). Getting started with AI: A guide for educators. https://www.aiforeducation.io/blog/getting-started-with-ai
Clark, C. H., & van Kessel, C. (2024). “I, for one, welcome our new computer overlords”: Using artificial intelligence as a lesson planning resource for social studies. Contemporary Issues in Technology and Teacher Education, 24(2). https://citejournal.org/volume-24/issue-2-24/social-studies/i-for-one-welcome-our-new-computer-overlords-using-artificial-intelligence-as-a-lesson-planning-resource-for-social-studies/
Common Sense Media. (2025, June 26). Deepfakes can be a crime: Teaching AI literacy can prevent it. Retrieved August 3, 2025, from https://www.commonsensemedia.org/kids-action/articles/deepfakes-can-be-a-crime-teaching-ai-literacy-can-prevent-it
Fullan, M. (2011). Choosing the wrong drivers for whole system reform. Centre for Strategic Education. https://theeta.org/wp-content/uploads/2011/11/eta-articles-110711.pdf
Guskey, T. R. (2014). Planning professional learning. Educational Leadership, 71(8), 10–16. Retrieved August 3, 2025, from https://tguskey.com/wp-content/uploads/Professional-Learning-2-Planning-Professional-Learning.pdf
Hernholm, S. (2025, June 19). AI in education: Why teachers need tools, time, and training. Forbes. https://www.forbes.com/sites/sarahhernholm/2025/06/19/ai-in-education-why-teachers-need-tools-time-and-training/
Klein, A. (2025, June 30). Why AI may not be ready to write your lesson plans. Education Week. https://www.edweek.org/technology/why-ai-may-not-be-ready-to-write-your-lesson-plans/2025/06
Meyer, D. (2024, May 3). The difference between great AI and great teaching [Video]. YouTube. https://www.youtube.com/watch?v=iH4Pn4bpOfQ
Sawchuk, S. (2025, July). What teachers should know about ChatGPT’s new Study Mode feature. Education Week. https://www.edweek.org/technology/what-teachers-should-know-about-chatgpts-new-study-mode-feature/2025/07
Szeto, A. (2024a). AI and social studies: Supporting multilingual learners with generative tools. Teaching Social Studies. https://teachingsocialstudies.org/tag/english/
Szeto, A. (2024b). Enhancing student learning with AI-powered image features. Teaching Social Studies. https://teachingsocialstudies.org/tag/historical-perspectives/
TechCrunch. (2025, July 29). Google’s NotebookLM rolls out video overviews. https://techcrunch.com/2025/07/29/googles-notebooklm-rolls-out-video-overviews/
The White House. (2025a, April 23). Executive Order 14277 of April 23, 2025: Advancing artificial intelligence education for American youth. Federal Register, 90, 17519–17523. https://www.govinfo.gov/content/pkg/FR-2025-04-28/pdf/2025-07368.pdf
The White House. (2025b, July 23). Winning the race: America’s AI action plan [PDF]. Office of the President of the United States. https://www.whitehouse.gov/wp-content/uploads/2025/07/Americas-AI-Action-Plan.pdf
Wasik, B. (2025, June 16). A.I. is poised to rewrite history. Literally. The New York Times Magazine. https://www.nytimes.com/2025/06/16/magazine/ai-history-historians-scholarship.html
Wineburg, S., & Ziv, N. (2024, October 25). What makes students (and the rest of us) fall for AI misinformation? Education Week. https://www.edweek.org/technology/opinion-what-makes-students-and-the-rest-of-us-fall-for-ai-misinformation/2024/10
