Office of Science and Technology Policy – 社区黑料, America's Education News Source

White House Cautions Schools Against 'Continuous Surveillance' of Students

Oct. 4, 2022 | Updated Oct. 5

The Biden administration on Tuesday urged school districts nationwide to refrain from subjecting students to "continuous surveillance" if the use of digital monitoring tools, already accused of targeting at-risk youth, is likely to trample students' rights.

The White House recommendation was included in an in-depth but non-binding white paper, dubbed the Blueprint for an AI Bill of Rights, that seeks to rein in the potential harms of rapidly advancing artificial intelligence technologies, from smart speakers featuring voice assistants to campus surveillance cameras with facial recognition capabilities.

The blueprint, which was released by the White House Office of Science and Technology Policy and extends far beyond the education sector, lays out five principles: Tools that rely on artificial intelligence should be safe and effective, avoid discrimination, ensure reasonable privacy protections, be transparent about their practices and offer the ability to opt out "in favor of a human alternative."

Though the blueprint lacks enforcement, schools and education technology companies should expect greater federal scrutiny soon. Alongside the blueprint, the White House announced that the Education Department would release recommendations by early 2023 on schools' use of artificial intelligence that "define specifications for the safety, fairness and efficacy of AI models used within education" and introduce "guardrails that build on existing education data privacy regulations."

Education Secretary Miguel Cardona said officials at the department "embrace utilizing Ed Tech to enhance learning" but recognize "the need for us to change how we do business." The future guidance, he said, will focus on student data protections, ensuring that digital tools are free of bias and incorporating transparency so parents know how their children's information is being used.

"This has to be baked into how we do business in education, starting with the systems that we have in our districts but also teacher preparation and teacher training as well," he said.

Amelia Vance, president and founder of Public Interest Privacy Consulting, said the document amounts to a "massive step forward for the advocacy community, the scholars who have been working on AI and have been pressuring the government and companies to do better."

The blueprint, which offers a harsh critique of systems that predict student success based on factors like poverty, follows in-depth reporting by 社区黑料 on schools' growing use of digital surveillance and the technology's impact on student privacy and civil rights.

But local school leaders should ultimately decide whether to use digital student monitoring tools, said Noelle Ellerson Ng, associate executive director of advocacy and governance at AASA, The School Superintendents Association. Ellerson Ng opposes "unilateral federal action to prohibit" the software.

"That's not the appropriate role of the federal government to come and say this cannot happen," she said. "But smart guardrails that allow for good practices, that protect students' safety and privacy, that's a more appropriate role."

The nonprofit Center for Democracy and Technology praised the report. The group recently released a survey highlighting the potential harms of student activity monitoring on at-risk youth, who are already disproportionately disciplined and referred to the police as a result. In a statement Tuesday, it said the blueprint makes clear "the ways in which algorithmic systems can deepen inequality."

"We commend the White House for considering the diverse ways in which discrimination can occur, for challenging inappropriate and irrelevant data uses and for lifting up examples of practical steps that companies and agencies can take to reduce harm," CEO Alexandra Reeve Givens said in a media release.

The document also highlights several areas where artificial intelligence has been beneficial, including improved agricultural efficiency and algorithms used to identify diseases. But the technologies, which have grown rapidly with few regulations, have also introduced significant harm, it notes, including algorithms that screen job applicants and flawed facial recognition technology.

After the pandemic shuttered schools nationwide in early 2020 and pushed students into makeshift remote learning, companies that sell digital activity monitoring software to schools saw an increase in business. But the tools have faced significant backlash for subjecting students to relentless digital surveillance. 

In April, Massachusetts Sens. Elizabeth Warren and Ed Markey warned in a report that the technology could carry significant risks, particularly for students of color and LGBTQ youth, and promoted a "need for federal action to protect students' civil rights, safety and privacy." Such concerns have become particularly acute as states implement new anti-LGBTQ laws and abortion bans and advocates warn that digital surveillance tools could expose youth to legal peril.

Vance said that she and others focused on education and privacy "had no idea this was coming," or that it would focus so heavily on schools. Over the last year, the White House sought input from civil rights groups and technology companies, but Vance said that education groups had lacked a meaningful seat at the table.

The lack of engagement was apparent, she said, in the document's failure to highlight areas where artificial intelligence has been beneficial to students and schools. For example, the document discusses a tool used by universities to predict which students were likely to drop out. Because it considered students' race as a predictive factor, it raised discrimination fears. But she noted that if implemented equitably, such tools can be used to improve student outcomes.

"Of course there are a lot of privacy and equity and ethical landmines in this area," Vance said. "But we also have schools who have done this right, who have done a great job in using some of these systems to assist humans in counseling students and helping more students graduate."

Ellerson Ng, of the superintendents association, said her group is still analyzing the blueprint's on-the-ground implications, but that student data privacy efforts present schools with "a balancing act."

"You want to absolutely secure the privacy rights of the child while understanding that the data that can be generated, or is generated, has a role to play, too, in helping us understand where kids are, what kids are doing, how a program is or isn't working," she said. "Sometimes that's broader than just a pure academic indicator."

Others have dismissed the document as little more than a compilation of recommendations from civil rights groups and tech companies. Some of the most outspoken privacy proponents and digital surveillance critics, such as Albert Fox Cahn, founder and executive director of the Surveillance Technology Oversight Project, argued it falls short of a critical policy move: outright bans.

As Cahn and other activists mount campaigns against student surveillance tools, they've highlighted how student data can wind up in the hands of the police.

"When police and companies are rolling out new and destructive forms of AI every day, we need to push pause across the board on the most invasive technologies," he said in a media release. "While the White House does take aim at some of the worst offenders, they do far too little to address the everyday threats of AI, particularly in police hands."
