
A Product Owner’s Guide to How Generative AI is Changing Education

When we first surveyed educators about their use of generative AI tools in 2024, teachers were lukewarm about the technology.

At the time, only 37% of our Teacher Council members used AI in their day-to-day work. 

Just a year later, the landscape has shifted dramatically.

Thanks to increasing AI investment and partnerships like the one announced by the American Federation of Teachers, educators are now experimenting more frequently with AI tools.

They see services like ChatGPT, Claude, and Magic School as tools for:

  • cutting down on administrative work
  • differentiating learning content for their students
  • and compiling resources from across the web

As educators become more familiar with AI workflows, edTech product owners can meet their emerging needs: streamlining administrative tasks, supporting student learning outcomes, and protecting student privacy.

In this article, we’ll share new studies on educator use of AI, outline remaining challenges with the technology, and offer design recommendations you can apply to your learning tools.

Let’s dive in.

As educator attitudes toward AI shift, new needs emerge for edTech

According to Carnegie Learning, 40-50% of teachers now use AI tools weekly and view the technology more positively than they did in 2024.

In fact, educator use of AI tools has only increased in the past year. 

For example, the number of educators using Magic School jumped 25%. Canva and Google Translate had even bigger leaps in adoption.

With increasing adoption comes new user needs.

The biggest needs for your product’s users?

  • More professional development resources about genAI use
  • User flows that reduce administrative workloads
  • Features that support learning content creation

How to support professional development and training needs in your learning tool

Educators are hungry for professional development and training when it comes to using genAI.

EdTech product owners can support this need by embedding professional development resources directly into their tools.

Strong onboarding flows, professional learning content, and prompt libraries will all support educators as they become more comfortable using generative AI and discover new ways to apply it to their work.
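As a rough illustration of the prompt-library idea, here is a minimal TypeScript sketch of how a single library entry might be modeled, with a short professional-learning note attached to each prompt. The field names and categories are hypothetical assumptions for this sketch, not drawn from any particular product.

```typescript
// Hypothetical sketch: a prompt-library entry that pairs a reusable prompt
// template with a short, just-in-time professional-learning tip.
interface PromptLibraryEntry {
  id: string;
  title: string;
  task: "lesson-planning" | "differentiation" | "feedback" | "family-communication";
  gradeBand: "K-2" | "3-5" | "6-8" | "9-12";
  promptTemplate: string; // placeholders the educator fills in before sending to the model
  guidanceNote: string;   // a brief professional-learning tip shown next to the prompt
}

const examplePrompt: PromptLibraryEntry = {
  id: "diff-reading-001",
  title: "Differentiate a reading passage",
  task: "differentiation",
  gradeBand: "3-5",
  promptTemplate:
    "Rewrite the following passage at a {{readingLevel}} reading level, keeping the key vocabulary words: {{passage}}",
  guidanceNote:
    "Review the rewritten passage for accuracy before sharing it with students.",
};
```

Pairing each prompt with a guidance note turns the library into lightweight, embedded professional learning rather than a separate training module.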

How to help educators streamline administrative work

Technology may change, but one thing has stayed the same throughout our time working with educators: they never have enough time in the day.

In our recent report on the edTech landscape, educators reported that their administrative workloads have skyrocketed.

Tasks like filling out forms, entering grades, and conducting family outreach eat into their lesson planning and teaching time.

New research published this summer shows that generative AI tools help educators reduce the amount of admin on their plate.

According to a Gallup-Walton Family Foundation study, teachers who use AI tools weekly now save an average of 5.9 hours per week.

This adds up to six weeks of time saved over the course of the school year!

As you prioritize AI integration, consider how AI-powered features can reduce the administrative workloads of educators.

EdTech products that help educators automate repetitive tasks or streamline clunky workflows will stand out in a crowded market.

How to help educators use genAI to personalize content for students

In addition to saving educators time, genAI tools are also helping educators provide more personalized content for their students.

Educators are:

  • creating differentiated content to support students at different levels
  • standardizing grading by automating and reusing common pieces of feedback
  • and translating materials into different languages for students and families

From translation integrations to differentiated content production, generative AI can support educators’ accessibility needs, streamline their workflows, and provide more personalization.

As generative AI technology improves, product owners who prioritize these offerings in their learning tools will also be making a strong investment in accessibility.

Because of ongoing privacy and accuracy concerns, however, you should also consider ways to help educators maintain manual oversight of AI-generated content. 

By building features that support educator review of AI-generated materials, you can give users desired levels of control over both AI inputs and outputs.
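As one way to picture this, here is a minimal, hypothetical TypeScript sketch of a human-in-the-loop review flow in which nothing generated by AI reaches students until an educator has approved it. The type and function names are illustrative assumptions, not a reference to any existing API.

```typescript
// Hypothetical sketch: AI-generated material must pass educator review
// before it can be published to students.
type ReviewStatus = "draft" | "in-review" | "approved" | "rejected";

interface GeneratedMaterial {
  id: string;
  prompt: string;        // the educator's input, kept visible for transparency
  generatedText: string; // the raw AI output
  editedText?: string;   // the educator's revised version, if any
  status: ReviewStatus;
  reviewedBy?: string;   // the educator who approved or rejected the material
}

// Only approved materials are ever shown to students; the educator's edits
// take precedence over the raw model output.
function publishToStudents(material: GeneratedMaterial): string | null {
  if (material.status !== "approved") {
    return null; // block anything an educator hasn't signed off on
  }
  return material.editedText ?? material.generatedText;
}
```

Keeping the original prompt and the raw output visible alongside the educator's edits supports the transparency users say they want over both AI inputs and outputs.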

How to address educator concerns about AI in your edTech product

As educators have experimented with AI over the past year, their attitudes toward the technology have become more positive.

However, educators remain concerned about issues like:

  • Data privacy and transparency
  • Student use of AI tools 
  • AI literacy
  • And widening equity gaps

EdTech companies must address these issues with clarity and transparency if they wish to earn the trust of their user base.

Here’s how you can do it.

Address data privacy needs

According to the Carnegie Learning survey, 76% of educators have privacy concerns about AI tools—a figure that has stayed roughly the same over the past year.

Given the hurdles district leaders face when it comes to measuring and maintaining student data privacy at school, this level of concern makes sense.

A report published by the Consortium for School Networking identified a range of internal challenges when it comes to managing student data privacy. These include:

  • Lack of privacy policies
  • Inability to manage employee privacy practices
  • Inability to manage technologies introduced by teachers
  • And a need for guidance on federal and state privacy laws

Although policies and regulations are rapidly changing at both the school and state level, edTech companies can be proactive about addressing privacy concerns.

Connect with your development team early and often about how AI-powered features align with federal and state privacy guidelines.

As you gain clarity internally, communicate clearly and transparently with your users about how your tools use generative AI, where they source their data, and how your company is working to safeguard student privacy.

Help students engage in productive struggle

Last year, our survey on teacher attitudes toward AI found that educators worried about the following when it came to students using AI tools:

  • misuse/cheating (80%)
  • a decrease in critical thinking skills (63%)
  • and dependency on technology (60%)

In fact, we discovered that 70% of our Teacher Council members discouraged any student use of AI.

While AI detectors or other features designed to expose cheating may put a band-aid on the problem, they’re not necessarily the best solutions.

Instead, consider ways you can help students think critically and productively struggle with assignments, even as they develop AI skillsets.

This might look like allowing educators to set assignment requirements within your tool, so students can’t access AI-powered features until they’ve developed their own ideas about a question or assignment.
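As a concrete (and purely hypothetical) sketch of that idea, the settings below gate AI assistance behind a student’s own first attempt. The names are illustrative, not an existing schema.

```typescript
// Hypothetical assignment-level settings an educator might control,
// gating AI assistance until the student has drafted their own ideas.
interface AssignmentAISettings {
  aiAssistanceAllowed: boolean;      // master switch set by the educator
  requireInitialResponse: boolean;   // student must submit their own draft first
  reflectionPromptRequired: boolean; // ask students to reflect on AI-assisted work
}

interface StudentAttempt {
  hasSubmittedInitialResponse: boolean;
}

// AI features unlock only after the student has done their own thinking.
function canUseAIAssistance(
  settings: AssignmentAISettings,
  attempt: StudentAttempt
): boolean {
  if (!settings.aiAssistanceAllowed) return false;
  if (settings.requireInitialResponse && !attempt.hasSubmittedInitialResponse) {
    return false;
  }
  return true;
}
```

A reflection requirement like the one sketched above pairs naturally with the reflection prompts described next.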

Content engineers might also consider developing prompts that ask students to reflect on AI-assisted work, helping students to think critically about their own interactions with these new tools.

Help users develop AI literacy skills

While there are still many unknowns about how AI will impact the workforce of the future, we do know generative AI isn’t going away.

As it becomes more sophisticated, it’s crucial for students to develop AI literacy skills. 

This includes being able to tell the difference between content generated by AI and content generated by humans, especially video and image-based content.

When it comes to written AI output, students need practice evaluating writing for accuracy and meaning. 

They also need practice editing drafts to infuse their own point of view, sourcing and adding pieces of evidence, and interrogating the clarity of AI-generated work.

EdTech products can play an important role in helping students distinguish authentic content from sophisticated fakes and develop the evaluative skills that will help them succeed in the future.

Address equity issues in training & adoption

According to research conducted by The Center on Reinventing Public Education at Arizona State University, suburban, majority-white, and higher-income school districts are about twice as likely to offer AI training for teachers compared to urban, rural, or high-poverty districts.

Equity gaps in training and technology access have long affected student access to many edTech products—not just those with AI features.

In our own research and experience designing learning tools, we’ve found that it pays to invest in addressing accessibility and equity gaps through product design.

More equitable learning tools often generate word-of-mouth support from teachers who become super users, improving engagement, fidelity, and product adoption.

After all, if you’re only designing tools that a small number of students can use, then you’re cutting yourself off from a larger user base—and higher adoption rates.

By researching the technology of your user base, prioritizing mobile-first designs, and testing frequently in classroom settings, you can develop more accessible AI-powered edTech products.

Final thoughts

Generative AI has changed how learning content is created and how students tackle assignments, but it’s also introduced new and evolving concerns about student privacy, academic integrity, and content accuracy.

edTech product owners will play a special role in helping their product teams integrate AI technology without losing sight of the unique needs and challenges educators face. 

This will likely require updating your user research regularly, in order to keep pace with changing views and usage habits.

After all, when you can tie technology investments and feature priorities directly to verified user needs, your improvements will be more successful—for users, and for the long-term sustainability of your learning tools.

Whether you need support researching educators’ experience needs, their views about technology, or AI use cases for your product, Backpack Interactive’s team of experts can help you uncover actionable user data and identify strategic priorities for your product team.

Contact us below to learn more about how we can help.

Monica Sherwood

UX Research Lead

Prior to entering the UX field, Monica was a special educator at public schools in Brooklyn and Manhattan. Her experience as a teacher has allowed her to develop a deep appreciation for research, and the ability to empathize with the unique needs of every user. She is also a strong advocate for inclusion and accessibility in design.

Monica obtained her undergraduate degree at NYU’s School of Individualized Study, and her master’s in Special Education at Hunter College. In her free time, she enjoys traveling, painting, and reading.

