Two New Reports Urge 'Human-Centered' School AI Adoption
Tue, 03 Mar 2026

Two new reports caution that if schools make missteps implementing AI, the results could haunt them for years, locking them into a future largely written by big tech instead of those closest to kids.

The reports, both the results of small, intensive gatherings of educators, policymakers, researchers, tech officials and students last year, share a common warning: AI in schools must serve human-centered learning that doesn't simply push for more efficiency. To do anything else risks creating a generation of young people ill-equipped for the future.


The findings come as young people say they're turning to generative AI more than ever: A Pew Research Center survey released last week found that more than half of teens ages 13 to 17 use chatbots to search for information or get help with schoolwork. About four in ten report using AI to summarize articles, books or videos, or to create or edit images or videos. And about one in five say they use chatbots to get news.

For the first report, a group of 18 people met in July in Phoenix. Brought together by AI for Education, a training and policy organization, and a digital curriculum company, the report treats the question of how schools should view AI as a literal "Choose-Your-Own-Adventure" story: The authors lay out three possible scenarios in which educators in an imaginary school district make radically different decisions about the technology.

In the first scenario, the district retreats from AI altogether after a data breach, abandoning a previously created "Innovation Lab," while teachers return to traditional instruction and testing.

The restrictions soon backfire. Students continue using AI at home but, without guidance, take shortcuts on homework, developing a kind of survival mechanism they privately call "school brain." Seeing how irrelevant most lessons are, they do just enough to get by, offloading their thinking to AI tools. When tested, they show shallow understanding and poor foundational skills.

Test scores plummet, college acceptances drop and 40% of graduates land on academic probation. Employers report that graduates can neither work independently nor collaborate effectively with AI. Teachers begin departing in waves.

Retreating from AI, the authors find, creates "the worst of both worlds" – students who can neither think independently nor use AI effectively.

In the second scenario, the district, facing competition from AI-driven private schools, goes all-in, adopting a comprehensive, district-wide AI platform for automated instruction. The platform promises greater efficiency via AI tutors, automated grading and behavioral monitoring. And while it initially lowers costs and produces higher test scores, teachers find that students are soon gaming the algorithms rather than learning. The auto-grader penalizes valid but unconventional answers and unfairly marks down multilingual learners for non-standard responses on tests.

Teachers find themselves defending grades they didn't assign and can't fully explain, while families that challenge grades are stopped by "proprietary algorithms" that even administrators can't review. The system delivers "a black box" that removes human judgment: "Students could feel the difference between being evaluated by an algorithm and being understood by a teacher."

Before long, graduates struggle with collaboration, creativity and adaptability – skills employers and colleges increasingly value.

In the report's third choice, the district, via its Innovation Lab, redesigns its offerings to prepare students for an AI-driven future while keeping a focus on "human-centered" education. Rather than focusing solely on technology, it develops a "graduate profile" that emphasizes critical thinking, ethical reasoning and human-AI collaboration, among other indicators.

The lab shifts to flexible, project-based learning, and students soon learn to use AI as a tool that supports but doesn't replace their thinking. While the district continues to satisfy state accountability through testing, it also pursues federal innovation grants to fund portfolio-based assessment systems based on the graduate profile.

All is not rosy, though. The redesign is expensive and hard on teachers. Enrollment suffers as political resistance builds steam. But graduates soon demonstrate an ability to critically evaluate AI tools, adapt quickly to workplace changes and develop a "learn how to learn" mindset that serves them in the long term.

Alumni soon report that their "robust" portfolios of work are a huge advantage in competitive job markets, and employers say they are the only new hires who critically evaluate AI's recommendations, spotting hallucinations and biases.

Amanda Bickerstaff, AI for Education's co-founder and CEO, said the first two scenarios are what educators at the July convening said they were seeing most often in schools.

"There was a strong recognition from everyone, including the students, the two high schoolers, that the traditional methods have not worked – for decades," she said. "But it feels safer."

As for going "all in" on AI, she said, that point of view is inevitable in many places, given the aggressive efforts of tech giants like Google, which are "pushing into schools," going direct to students.

鈥淭here’s this real pressure from both ed tech and AI itself, because it’s such a big market that’s never really been figured out,鈥 she said.

What makes it worse is that few tech firms employ enough teachers to ensure that their products work well for students. "They don't have hundreds of education people," Bickerstaff said. Their education teams are "fractions of their headcount, working on tools that are instantly in students' hands."

The third path, in which the district redesigns its offerings, is "the most human" of the three, she said, and the most intentional. "The third path is the one that trusts humans and educators and students and families," Bickerstaff said.

'Explicitly ambidextrous' schooling

The second report, by the Center on Reinventing Public Education (CRPE), a think tank at Arizona State University, also calls for a new approach to schools' decisions about AI, saying the technology "should be a catalyst for human-centered learning, not a replacement."

The CRPE report, the result of another gathering in November, asserts that schools are at a pivotal moment. Their AI policies could go one of two ways: They can either entrench outdated educational models or help bring about a fundamental transformation of schooling.

"One of the big things that came out of those discussions was a strong feeling among the group that AI is currently being thought of as a productivity tool for the education system that we have, rather than a tool to radically improve teaching and learning and outcomes for kids," said Robin Lake, CRPE's executive director.

During its meeting, the group repeatedly discussed an "efficiency paradox" that could make schools faster and cheaper without addressing students' actual needs. To protect against it, the members call for a more coherent, human-centered approach that is "explicitly ambidextrous," improving current practices while intentionally building toward new learning models.

The problem with AI, the report alleges, is that it could simply improve the efficiency of outdated educational models. It notes that the Scantron, a time-saving testing technology, for decades reinforced low-level standardized assessments, often at the expense of improved learning.

Instead of serving as a new kind of Scantron, it says, AI could make way for several innovations, including new assessments that capture real-time performance as students work. It could even measure key non-academic indicators such as belonging, confidence, curiosity and relationship quality.

Lake said the report's idea of an "ambidextrous" approach to AI came from the group's acknowledgement that "we have to attend to the kids who are in our schools right now – and the teachers. We have to use whatever technologies are available to make things better, but we also have to make investments in big, really different whole-school designs."

Those could include not just better assessments but ways to help teachers provide "rigorous personalization grounded in the science of learning."

Districts could create classrooms with multiple adults working in teams based on their expertise. And AI could enable schools to match students to internships and other experiences, handling administrative tasks so humans can focus on relationships.

Lake said the group that met in November kept coming back to one idea: keeping an eye on both the future of school and the reality of the schools we already have.

"A lot of times when we have these conversations about AI and the future of schooling, it feels very floaty and abstract," she said. "So I really appreciated that the fellows had a vision to connect the here-and-now to what kids need to know and [should] be able to do in the future. That feels really important for us all right now."
