
From Uncertainty to Action: How Districts Are Preparing Teachers for AI

Media coverage can sometimes give the impression that every district is racing to become the next ‘Alpha School’, fully AI-powered and reinventing instruction overnight. But conversations with real public school leaders tell a different story. For most, progress is steady and strategic: adopting an AI tool, piloting a few trainings, and learning as they go. In fact, districts taking even these initial steps may already be among the top third of adopters. Most are prioritizing understanding over acceleration, balancing innovation with practicality, and ensuring teachers feel supported before expanding further. 

Setting the stage: Why we asked districts about AI learning 

Artificial intelligence has dominated edtech conversations for the past few years, often surrounded by uncertainty, curiosity, and plenty of debate. While headlines tend to spotlight futuristic examples, many district leaders are quietly asking more practical questions about adoption, safety, and training. 

At Frontline Education, we’re in the business of equipping K-12 administrators with tools and insights that make their work simpler, so they can focus on what really matters: giving students a path to long-term purpose. Each year, for our K-12 Lens Report, we survey administrators nationwide to take the pulse of emerging priorities and persistent pressures. 

We also check in informally throughout the year with quick polls on timely topics to understand what’s top of mind for educators. AI continues to surface as one of the most consistently charged conversations in education.  

For the past two years, we’ve asked school administrators a simple but revealing question: 

“Have you introduced AI professional learning or support in your district?”  

Here’s what they told us: 

District Adoption of AI Professional Learning: 2024 vs. 2025

While implementation of AI-related professional development more than doubled in one year, from 17% to 36%, the reality is that a full one-third of districts still haven’t taken concrete steps toward professional learning on AI. This tension between momentum and hesitation is exactly what we wanted to explore further. 

A Conversation with a Curriculum Leader

To unpack what’s driving both the enthusiasm and the hesitation, we sat down with Timothy Smith, Assistant Superintendent of Curriculum and Instruction at Stonington Public Schools in Connecticut.

Smith has led his district through a balanced and pragmatic approach to AI in education: equipping teachers and students with opportunities to explore AI safely, while setting clear guardrails and keeping the human side of learning at the forefront.

Our conversation with him sheds light on how districts can move forward with confidence, without overcorrecting or moving too fast.

Why so many still hesitate

As Smith explains, hesitation persists for several understandable reasons, including complexity, confusion, and legitimate concern, not just resistance to innovation.

He points to four friction points every district leader can relate to:

  1. Staff skepticism. Teachers fear losing the human craft of thinking and writing. In some cases, that fear shows up as resistance to student use of AI, especially in subjects where original writing and analysis are central to instruction.
  2. Choice overload. Smith explains, “Because there are so many AI options out there, it’s just too diffuse for folks to get their hands on.” With so many tools, it’s hard to know where to start, or which are safe for students. Teachers need clearer guidance on what’s approved, what’s effective, and how to integrate these tools without overwhelming their workflow.
  3. Outdated training. Pre-recorded modules can’t keep up with AI’s pace of change, leaving many teachers unsure how to adapt new tools to real classroom needs.
  4. Real-world concerns. Not every teacher is captivated by AI. Some approach it with caution, raising ethical questions about its purpose and the environmental cost of large-scale data systems. Energy use and sustainability frequently surface in faculty conversations. Smith explains, “Some staff members have legitimate concerns of contributing to the environmental cost of AI development with data centers and energy consumption.”

Stonington’s approach: Guardrails first, experimentation next

“At the District Office, we’re very committed to making sure all of our students have an experience with AI, not just as a classroom activity, but as part of their workforce and personal development. It’s a major facet of employment across all occupations, and we feel compelled to ensure students encounter it meaningfully in middle and high school.”

– Timothy Smith

At Stonington Public Schools, leadership took a structured approach: narrow the field, protect data, and give teachers confidence to explore. Smith said, “We made the decision to go with a single AI platform and to limit usage to that because telling staff and students that they could use whatever AI they want, whether it’s Claude, ChatGPT, Gemini…it would have been impossible to provide clear direction across so many choices.”

They chose a popular AI app designed specifically for schools, not because it’s the flashiest, but because it’s safe, compliant, and transparent.

The control matters. It lets the district lead with guardrails, not restrictions, offering a practical balance between innovation and accountability. By embedding thoughtful policies into how AI is used, the district connects its broader vision, including student readiness and responsible experimentation, with day-to-day teaching decisions.

Building capacity through meaningful professional learning

“Overall, I think the staff response has been very positive, and realistic. Some teachers have said, ‘I had it do something for me, and you know what? It wasn’t any better than what I could have done myself.’ That kind of honest feedback is healthy. It shows staff are experimenting, but still thinking critically about where AI adds value and where it doesn’t.”

– Timothy Smith

Before launching the PD series, Stonington distributed a districtwide survey to gauge baseline knowledge and perceptions of AI among staff. It asked how familiar teachers were with AI, how they felt about its use in schools, and where they saw potential benefits or risks. The results helped leadership tailor their sessions to meet staff where they were.

Stonington invested in several rounds of PD, tailored to different needs and perspectives across the district. The first session covered the broader landscape: what AI is, what tools are emerging, and how teachers might use them responsibly. Four sessions, co-led with a popular AI edtech app, focused on practical classroom use, from assignment design to permission settings. Another session, led by Smith himself, explored prompt building and critical evaluation using Google Gemini, since Stonington is a Google-based district. Additionally, the middle school principal facilitated a hands-on session to help staff translate these concepts into day-to-day practice.

Analytics from their AI app now shape next steps. If usage spikes around lesson creation in a subject with a stable curriculum, that’s a signal to reexamine alignment. If differentiation tools are trending, that’s evidence teachers are seeking ways to meet diverse needs.

Lessons learned: Start small, learn fast, stay human

Rolling out AI in a school district is as much about culture as it is about technology. Smith emphasized that progress depends on trust, communication, and curiosity more than on any particular platform or tool. For him, success comes from modeling openness rather than mandating compliance.

Smith’s biggest insight was that sustainable change doesn’t come from mandates. It comes from motivation.

It’s a strategy that’s working. Stonington’s science teachers, who are naturally curious and data-driven, became early adopters. Their success stories are sparking interest elsewhere and change is starting to spread organically.

And through it all, Smith stays grounded.

For district leaders ready to move from talk to traction

Start with purpose, not products. Define what you want teachers and students to learn from AI, not just what you want them to use.

Pick one tool to pilot and make sure it aligns with your goals and priorities. You may not need the biggest, strongest, or fastest model. Maybe, like Stonington, data security, analytics, and guardrails matter most to you.

Make PD iterative. Train, test, reflect, repeat. If possible, use analytics to see where learning sticks. If not, check in with your staff regularly to learn what they’re using, how they’re using it, and whether it’s helping.

Empower your early adopters. Showcase real teacher examples instead of enforcing compliance.

Keep it human. The goal isn’t to automate teaching, it’s to amplify the human connection at the heart of learning.

To sum up his perspective, Smith shared a vision that balances innovation with caution. He described the district’s next phase as a measured, student-focused expansion that stays grounded in human connection, adaptability, and a sense of perspective.

The bottom line

The data tells two stories: progress (more districts than ever are implementing AI PD) and hesitation (a third are still waiting on the sidelines). Change in public education is never instant. But as Stonington shows, even modest steps, grounded in values and guardrails, can create meaningful momentum.

“Change doesn’t happen through mandates. It happens when people feel safe to try, reflect, and share what works.”

– Timothy Smith


Ellen Agnello

Ellen is a graduate assistant at the University of Connecticut. She is a former high school English language arts teacher and holds a Master’s Degree in literacy education. She is working on a dissertation toward a Ph.D. in Educational Curriculum and Instruction.
