Center for Democracy & Technology – The 74, America's Education News Source

'Distrust, Detection & Discipline:' New Data Reveals Teachers' ChatGPT Crackdown
Tue, 02 Apr 2024

New survey data puts hard numbers behind the steep rise of ChatGPT and other generative AI chatbots in America's classrooms — and reveals a big spike in student discipline as a result.

As artificial intelligence tools become more common in schools, most teachers say their districts have adopted guidance and training for both educators and students, according to a new survey by the nonprofit Center for Democracy and Technology. What this guidance lacks, however, are clear instructions on how teachers should respond if they suspect a student used generative AI to cheat.


"Though there has been positive movement, schools are still grappling with how to effectively implement generative AI in the classroom — making this a critical moment for school officials to put appropriate guardrails in place to ensure that irresponsible use of this technology by teachers and students does not become entrenched," report co-authors Maddy Dwyer and Elizabeth Laird write.

Among the middle and high school teachers who responded to the online survey, which was conducted in November and December, 60% said their schools permit the use of generative AI for schoolwork — double the number who said the same just five months earlier on a similar survey. And while a resounding 80% of educators said they have received formal training about the tools, including on how to incorporate generative AI into assignments, just 28% said they've received instruction on how to respond if they suspect a student has used ChatGPT to cheat.

That doesn't mean, however, that students aren't getting into trouble. Among survey respondents, 64% said they were aware of students who were disciplined or faced some form of consequences — including not receiving credit for an assignment — for using generative AI on a school assignment. That represents a 16 percentage-point increase from August.

The tools have also affected how educators view their students, with more than half saying they've grown distrustful of whether their students' work is actually theirs.

Fighting fire with fire, a growing share of teachers say they rely on digital detection tools to sniff out students who may have used generative AI to plagiarize. Sixty-eight percent of teachers — and 76% of licensed special education teachers — said they turn to generative AI content detection tools to determine whether students' work is actually their own.

The findings carry significant equity concerns for students with disabilities, researchers concluded, especially in the face of evidence that AI detection tools are ineffective.

Biden Order on AI Tackles Tech-Enabled Discrimination in Schools
Tue, 31 Oct 2023 (Updated Nov. 1)

As artificial intelligence rapidly expands its presence in classrooms, President Biden signed an executive order Monday requiring federal education officials to create guardrails that prevent tech-driven discrimination. 

The order, which the White House called "the most sweeping actions ever taken to protect Americans from the potential risks of AI systems," offers several directives specific to the education sector. It also directs the Justice Department to coordinate with federal civil rights officials on ways to investigate discrimination perpetuated by algorithms.


Within a year, the education secretary must release guidance on the ways schools can use the technology equitably, with a particular focus on the tools' effects on "vulnerable and underserved communities." Meanwhile, an Education Department "AI toolkit" released within the next year will offer guidance on how to implement the tools so that they enhance trust and safety while complying with federal student privacy rules.

For civil rights advocates who have decried AI's potentially unintended consequences, the order was a major step forward.

The order's focus on civil rights investigations "aligns with what we've been advocating for over a year now," said Elizabeth Laird, the director of equity and civic technology at the nonprofit Center for Democracy and Technology. Her group has called on the Education Department's Office for Civil Rights to open investigations into the ways AI-enabled tools in schools could have a disparate impact on students based on their race, disability, sexual orientation and gender identity.

"It's really important that this office, which has been focused on protecting marginalized groups of students for literally decades, is more involved in conversations about AI and can bring that knowledge and skill set to bear on this emerging technology," Laird told The 74.

In guidance to federal agencies on Wednesday, the Office of Management and Budget spelled out the types of AI education technologies that pose civil rights and safety risks. They include tools that detect student cheating, monitor students' online activities, project academic outcomes, make discipline recommendations or facilitate surveillance online and in person.

An Education Department spokesperson didn't respond to a request for comment Monday on how the agency plans to respond to Biden's order.

Schools nationwide have adopted artificial intelligence in divergent ways, including tools that provide students individualized lessons and the growing use of chatbots like ChatGPT by both students and teachers. The technology has also generated heated debates over its role in exacerbating harms to at-risk youth, including educators' use of early warning systems that mine data about students — including their race and disciplinary records — to predict their odds of dropping out of school.

"We've heard reported cases of using data to predict who might commit a crime, so very Minority Report," Laird said. "The bar that schools should be meeting is that they should not be targeting students based on protected characteristics unless it meets a very narrowly defined purpose that is within the government's interests. And if you're going to make that argument, you certainly need to be able to show that this is not causing harm to the groups that you're targeting."

AI and student monitoring tools

An unprecedented degree of student surveillance has also been facilitated by AI, including online activity monitoring tools, remote proctoring software to detect cheating on tests and campus security cameras with facial recognition capabilities. 

Beyond its implications for schools, the Biden order requires certain technology companies to conduct AI safety testing before their products are released to the public and to provide their results to the government. It also orders new regulations to ensure AI won't be used to produce nuclear weapons, recommends that AI-generated photos and videos be transparently identified as such with watermarks and calls on Congress to pass federal data privacy rules "to protect all Americans, especially kids."

In September, the Center for Democracy and Technology released a report warning that schools' use of AI-enabled digital monitoring tools, which track students' behaviors online, could have a disparate impact on students — particularly LGBTQ+ youth and those with disabilities — in violation of federal civil rights laws. As teachers punish students for allegedly using ChatGPT to cheat on classroom assignments, a survey suggested that children in special education were more likely to face discipline than their general education peers. They also reported higher levels of surveillance and subsequent discipline as a result.

In response to the report, a coalition of Democratic lawmakers penned a letter urging the Education Department鈥檚 civil rights office to investigate districts that use digital surveillance and other AI tools in ways that perpetuate discrimination. 

Education technology companies that use artificial intelligence could come under particular federal scrutiny as a result of the order, said consultant Amelia Vance, an expert on student privacy regulations and president of the Public Interest Privacy Center. The order notes that the federal government plans to enforce consumer protection laws and enact safeguards "against fraud, unintended bias, discrimination, infringements on privacy and other harms from AI."

"Such protections are especially important in critical fields like healthcare, financial services, education, housing, law and transportation," the order notes, "where mistakes by or misuse of AI could harm patients, cost consumers or small businesses or jeopardize safety or rights."

Schools rely heavily on third-party vendors like education technology companies to provide services to students, and those companies are subject to Federal Trade Commission rules against deceptive and unfair business practices, Vance noted. The order's focus on consumer protections, she said, "was sort of a flag for me that maybe we're going to see not only continuing interest in regulating ed tech, but more specifically regulating ed tech related to AI."

While the order was "pretty vague when it came to education," Vance said it was important that it did acknowledge AI's potential benefits in education, including for personalized learning and adaptive testing.

"As much as we keep talking about AI as if it showed up in the past year, it's been there for a while and we know that there are valuable ways that it can be used," Vance said. "It can surface particular content, it can facilitate better connections to people when they need certain content."

AI and facial recognition cameras

As school districts pour billions of dollars into school safety efforts in the wake of mass school shootings, security vendors have heralded the promises of AI. Yet civil rights groups have warned that facial recognition and other AI-driven technology in schools could perpetuate biases — and could miss serious safety risks.

Just last month, the gun-detection company Evolv Technology, which pitches its hardware to schools, acknowledged it was the subject of a Federal Trade Commission inquiry into its marketing practices. The agency is reportedly probing whether the company employs artificial intelligence in the ways that it claims. 

In September, New York became the first state to ban facial recognition in schools, a move that followed outcry when an upstate school district announced plans to roll out a surveillance camera system that tracked students' biometric data.

A new Montana law bans facial recognition statewide with one notable exception — schools. Citing privacy concerns, the law adopted this year prohibits government agencies from using facial recognition, but with a specific carveout for schools. One rural education system, the 250-student Sun River School District, employs a 30-camera security system from Verkada that uses facial recognition to track the identities of people on its property. As a result, the district has roughly one camera for every eight students.

In an email on Wednesday, a Verkada spokesperson said the company is in the process of reviewing Biden’s order to understand its implications on the company.

Verkada offers a cautionary tale about the potential security vulnerabilities of campus surveillance systems. In 2021, the company suffered a massive data breach and hackers claimed to expose the live feeds of 150,000 surveillance cameras — including those in place at Sandy Hook Elementary School in Newtown, Connecticut, the site of a mass shooting in 2012. A review conducted on behalf of the company found the breach was more limited, affecting some 4,500 cameras.

Hikvision has similarly made inroads in the school security market with its facial recognition surveillance cameras — including during a pandemic-era push to enforce face mask compliance. Yet the company, owned in part by the Chinese government, has also faced significant allegations of civil rights abuses and in 2019 was placed on a U.S. trade blacklist after being implicated in the country's "campaign of repression, mass arbitrary detention and high-technology surveillance" against Muslim ethnic minorities.

Though multiple U.S. school districts continue to use Hikvision cameras, a recent investigation found the company's software still referenced ethnicity-detection capabilities despite Hikvision claiming for years it had ended the practice.

In an email, a Hikvision spokesperson didn't comment on how Biden's executive order could affect its business, including in schools, but offered a letter the company shared with its customers in response to the investigation, saying an outdated reference to ethnic detection appeared on its website erroneously.

"It has been a longstanding Hikvision policy to prohibit the use of minority recognition technology," the letter states. "As we have previously stated, that functionality was phased out and completely prohibited by the company in 2018."

Data scientist David Riedman, who built a national database to track school shootings dating back decades, said that artificial intelligence is at "the forefront" of the school safety conversation and emerging security technologies can be built in ways that don't violate students' rights.

Riedman became a figure in the national conversation about school shootings as the creator of the K12 School Shooting Database but has since taken on an additional role as director of industry research and content for ZeroEyes, a surveillance software company that uses security cameras to ferret out guns. Instead of using facial recognition, the ZeroEyes algorithm was trained to identify and notify law enforcement within seconds of spotting a firearm. 

The company says its object-detection approach — as opposed to facial recognition — can "evade privacy and bias concerns that plague other AI models," and its internal research found that "only 0.06546% of false positives were humans detected as guns."

"The simplicity" of ZeroEyes' technology, Riedman said, puts the company in good standing as far as the Biden order is concerned.

"ZeroEyes isn't looking for people at all," he said. "It's only looking for objects and the only objects it is trying to find, and it's been trained to find, are images that look like guns. So you're not getting student records, you're not getting student demographics, you're not getting anything related to people or even a school per se. You just have an algorithm that is constantly searching for images to see if there is something that looks like a firearm in them."

However, false positives remain a concern. Just last week at a high school in Texas, a false alert from ZeroEyes prompted a campus lockdown that set off student and parent fears of an active shooting. The company said the false alarm was triggered by an image of a student outside who the system believed was armed based on shadows and the way his arm was positioned.

Trevor Project Severs Ties with Surveillance Company Accused of LGBTQ Youth Bias
Fri, 30 Sep 2022 (Updated 3:15 p.m. ET)

Hours after the publication of this article Friday, The Trevor Project announced in a tweet it would return a $25,000 donation from the student surveillance company Gaggle, acknowledging widespread concerns about the monitoring tool's "role in negatively impacting LGBTQ students."

"Our philosophy is that having a seat at the table enables us to positively influence how companies engage with LGBTQ young people, and we initially agreed to work with Gaggle because we saw an opportunity to have a meaningful impact to better protect LGBTQ students," the nonprofit said in the statement. "We hear and understand the concerns, and we hope to work alongside schools and institutions to ensure they are appropriately supporting LGBTQ youth and their mental health."

The move came after widespread condemnation on social media, with multiple supporters threatening to pull their donations to The Trevor Project moving forward. 

In a Friday statement, Gaggle spokesperson Paget Hetherington said the company wanted The Trevor Project's "guidance on how to do what we do better." The company also removed language from its website where it previously touted the partnership.

"We're disappointed that The Trevor Project has decided to pause our collaboration," she said. "However, we are grateful for the opportunity we have had to learn and work with them and will continue with our mission of protecting all students regardless of how they identify."

Original report below:

Amid warnings from lawmakers and civil rights groups that digital surveillance tools could discriminate against at-risk students, a leading nonprofit devoted to the mental well-being of LGBTQ youth has formed a financial partnership with a tech company that subjects them to persistent online monitoring. 

Recently, The Trevor Project, a high-profile nonprofit focused on suicide prevention among LGBTQ youth, began to list Gaggle as a corporate partner on its website, disclosing that the controversial surveillance company had given it between $25,000 and $50,000 in support. Meanwhile Gaggle, which uses artificial intelligence and human content moderators to sift through billions of student chat messages and homework assignments each year in search of students who may harm themselves or others, touted the relationship on its own site, noting the two were collaborating to "improve mental health outcomes for LGBTQ young people."

Though the precise contours of the partnership remain unclear, a Trevor Project spokesperson said it aims to have a positive influence on the way Gaggle navigates privacy concerns involving LGBTQ youth while a Gaggle representative said the company sees the relationship as a learning opportunity.

Both groups maintain that the partnership was forged in the interests of LGBTQ students, but student privacy advocates argue the relationship could undermine The Trevor Project's work while allowing Gaggle to use the donation to counter criticism about its potential harms to LGBTQ students. The collaboration comes at a particularly perilous time for many students as a rash of states implement new anti-LGBTQ laws that could erode their privacy and expose them to legal jeopardy.

Teeth Logsdon-Wallace, a 14-year-old student from Minneapolis with first-hand experience of Gaggle's surveillance dragnet, said the deal could eliminate any motivation for Gaggle to change its business practices.

"It really does feel like a 'We paid you, now say we're fine,' kind of thing," said Logsdon-Wallace, who is transgender. Without any real incentives to implement reforms, he said that Gaggle's "seal of approval" from The Trevor Project could offer the privately held company reputational cover amid growing concerns that such surveillance tech is disproportionately harmful to LGBTQ youth.

"People who want to defend Gaggle can just point to their little Trevor Project thing and say, 'See, they have the support of "The Gays" so it's fine actually,' and all it does is make it easier to deflect and defend actual issues with Gaggle."

A screenshot showing that Gaggle is a corporate partner of The Trevor Project
Student surveillance company Gaggle is listed among "Corporate Partners" on The Trevor Project's website (screenshot)

The 74 previously investigated Gaggle's monitoring practices. Gaggle's algorithm relies on keyword matching to compare students' online communications against a dictionary of thousands of words the company believes could indicate potential trouble, including references to violence, drugs and sex. Among the keywords are "gay" and "lesbian," verbiage the company maintains is necessary because LGBTQ youth are more likely than their straight and cisgender peers to consider suicide.
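Keyword matching of the kind described above is simple to illustrate. The sketch below is purely hypothetical — the watchlist, tokenization and flagging logic are illustrative assumptions, not Gaggle's actual implementation, which reportedly spans thousands of terms plus human review.

```python
import re

# Hypothetical watchlist; a real system reportedly uses thousands of terms.
WATCHLIST = {"overdose", "suicide", "gun"}

def flag_terms(text: str) -> set[str]:
    """Return the watchlist terms that appear in a message as whole words."""
    tokens = set(re.findall(r"[a-z']+", text.lower()))
    return tokens & WATCHLIST
```

A bare word list has no sense of context: "gun" in a history essay matches exactly like "gun" in a threat, which is one reason flagged messages get routed to human reviewers, and why critics argue that terms like "gay" and "lesbian" inevitably sweep in innocuous writing.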

But privacy and civil rights advocates have accused the company of discrimination by subjecting LGBTQ youth to heightened surveillance — a concern that has taken on new meaning this year as states like Florida adopt laws that ban classroom discussions about sexuality and could require schools to out LGBTQ youth to their parents.

A survey by the nonprofit Center for Democracy and Technology found that while Gaggle and similar student monitoring tools are designed to keep students safe, teachers reported that the tools were more often used to discipline students. LGBTQ youth were disproportionately affected.

In a statement, a Trevor Project spokesperson said it's important that digital monitoring tools keep students safe without invading their privacy and that the collaboration was built on Gaggle's "desire to identify and address privacy and safety concerns that their product could cause for LGBTQ students."

"It's true that LGBTQ youth are among the most vulnerable to the misuse of this kind of safety monitoring — many worry that these tools could out them to teachers or parents against their will," the statement continued. "It is because of that very real concern that we have worked in a limited capacity with digital safety companies — to play an educational role and have a seat at the table so they can consider these potential risks while they design their products and develop policies."

But it remains unclear what policy changes have occurred at Gaggle as a result of the deal. Without offering any specifics, Gaggle spokesperson Paget Hetherington said in a statement the company is "honored to be able to align with The Trevor Project to better serve LGBTQ youth," and that the company is "always looking for ways to learn and to improve upon what we do to better support students and keep them safe."

'Faceless bureaucracy'

At its core, the partnership between Gaggle and The Trevor Project makes sense because both work to prevent youth suicides, said Amelia Vance, the founder and president of the Public Interest Privacy Center. But their approaches to solving the problem, she said, are fundamentally different.

By combing through digital materials on students' school-issued Microsoft and Google accounts, Gaggle seeks to alert educators — and in some cases the police — to students' online behaviors that suggest they might harm themselves or others.

"It really is about collecting details that kids may not be voluntarily sharing — information that they may be looking up to learn, to explore their identities, to otherwise help them in their day-to-day lives," Vance said. At The Trevor Project, "you have proactive outreach from youth who know that they need help or they need a community."

Katy Perry smiles in front of a Trevor Project background, holding a poster that says "Be proud of who you are."
Katy Perry poses for a photograph during a fundraising event for The Trevor Project in 2012. (Mark Davis/Getty Images for Trevor Project)

The West Hollywood-based Trevor Project, which receives funding from corporate donors including Macy's and AT&T, was founded in 1998. Gaggle, founded in 1999, does not publicly report its finances. The Dallas-based company says it monitors the digital communications of more than 5 million students across more than 1,500 school districts nationally.

The Trevor Project uses AI of its own to train volunteer crisis counselors and to assess the risk levels of people who reach out to it for help. If counselors with The Trevor Project believe a student is at imminent suicide risk, they may decide to call the police. But it's ultimately up to youth to decide which information they share with adults.

It's important for LGBTQ students to have trusted adults in whom they can confide their experiences, Vance said, rather than a system where "some faceless bureaucracy is finding out and informing your parents" about information they intended to keep private.

A recent survey by The Trevor Project offers troubling data about the realities of the youth suicide crisis. Nearly half of LGBTQ youth said they seriously considered attempting suicide in the past year, and 14% said they made a suicide attempt.

This isn't the first time The Trevor Project has faced scrutiny in recent months for its ties to companies that could have detrimental effects on LGBTQ youth. In July, a HuffPost investigation revealed that CEO and Executive Director Amit Paley previously worked as a consultant and helped create a strategic plan to boost opioid sales amid an addiction epidemic — one that's been linked to a rise in suicide attempts among LGBTQ youth.

The group knows firsthand how data can be weaponized. Just last month, online trolls who target the transgender community launched a campaign to clog up The Trevor Project's suicide prevention hotline.

Persistent student surveillance could exacerbate the challenges that LGBTQ youth face by subjecting them to disproportionate discipline and erroneously flagging their online communications as threats, Democratic Sens. Elizabeth Warren and Ed Markey warned in an April report.

Nearly a third of LGBTQ students say they or someone they know has experienced the nonconsensual disclosure of their sexual orientation or gender identity — typically called "outing" — due to student activity monitoring, according to research by the nonprofit Center for Democracy and Technology. They were also more likely than their straight and cisgender peers to report getting into trouble at school and being contacted by the police about having committed a crime.

A bar chart showing LGBTQ+ students are more likely to get in trouble for visiting a website or saying something inappropriate online; were more likely to be contacted by counselors or other adults at school about their mental health; and were more likely to be contacted by a police officer or other adult due to concerns about them committing a crime.
A recent survey by the nonprofit Center for Democracy and Technology found that student monitoring tools have disproportionate negative effects on LGBTQ youth. (Center for Democracy and Technology) 

In response to the survey results, a coalition of civil rights groups called on the U.S. Education Department to condemn the use of activity monitoring tools that violate students' civil liberties and to state its intent "to take enforcement action against violations that result in discrimination." The letter argues that using the tools to out LGBTQ students or to subject them to disproportionate discipline and criminal investigations could violate Title IX, the federal law prohibiting sex-based discrimination in schools.

Among the letter's signatories is the nonprofit LGBT Tech, which has warned about the harms of digital surveillance on LGBTQ people. Christopher Wood, the group's co-founder and executive director, said The Trevor Project's partnership with Gaggle could be positive if it's used to ensure that LGBTQ youth who are struggling have access to help. But once Gaggle gives student information to school administrators, the company can no longer control how those records are used, he said.

A screenshot from Gaggle's website. Gray box with text that says Gaggle is a Proud Sponsor of The Trevor Project.
Gaggle says on its website that the student surveillance company "is proud to collaborate with The Trevor Project and improve mental health outcomes for LGBTQ young people." (Screenshot)

"If that information is provided to someone who is not accepting, who has very different views and who willfully brings their political, personal or religious views into the school system, and they are not supportive of LGBTQ youth, then what they've done is harm the student," Wood said.

Yet as schools increasingly turned to student activity monitoring software during the pandemic, The Trevor Project portrayed the tools' growth as an inevitable result of districts seeking "to avoid liability issues."

"It is our stance that since these tools are not going anywhere, we think it's important to do our part to offer our expertise around LGBTQ experiences," the spokesperson said.

A student holds up a peace sign with one hand and has the other wrapped around his dog
Minneapolis student Teeth Logsdon-Wallace poses with his dog Gilly. (Photo courtesy Alexis Logsdon)

The power of trust

In interviews, students flagged by Gaggle said their trust in adults suffered as a result. Among them is Logsdon-Wallace, the 14-year-old transgender student. Before the Minneapolis school district stopped using Gaggle this summer and state lawmakers put strict limits on digital surveillance in schools, the tool alerted district security when he used a classroom assignment to reflect on a previous suicide attempt and how music therapy helped him cope. That same assignment, which included references to his gender identity, was flagged to his parents. 

And while his parents are affirming, he has friends who live in less supportive environments.                                                                                                       

"I have friends who are queer and/or trans who are out at school but not to their parents," he said. "If they want to be open with teachers, Gaggle can create a bad or even dangerous situation for these kids if their parents were contacted about what they were saying."

In The Trevor Project's recent survey, nearly three-quarters of LGBTQ youth reported that they have endured discrimination based on their sexual orientation or gender identity. Just 37% said their homes are affirming, and 55% said the same about their schools.

Given that reality, many LGBTQ youth reported sharing information about their sexual orientation with teachers or guidance counselors.

While Gaggle has maintained that keywords like "gay" and "lesbian" can also prevent bullying, Logsdon-Wallace said the company's approach is out of touch with how students generally interact. At school, he said he's been called just about every "slur for a queer or a trans person that isn't from like 80 years ago." While slurs are common, terms like "lesbian" are not.

"As an actual teenager going to an actual public school, those words are not being used to bully people," he said. "They're just not."

Senate Inquiry Warns About Harms of Digital School Surveillance Tools
Mon, 04 Apr 2022 (Updated, April 5)

Democratic Sens. Elizabeth Warren and Ed Markey are calling on the Federal Communications Commission to clarify how schools should monitor students' online activities, warning that educators' widespread use of digital surveillance tools could trample students' civil rights.

They also want the U.S. Education Department to start collecting data on the tools that could highlight whether they have disproportionate — and potentially harmful — effects on certain student groups.

In October, the senators asked four education technology companies that keep tabs on the online activity of millions of students across the country — often 24 hours a day, seven days a week — to provide information on how they use artificial intelligence to glean information about students.

Based on their responses, the senators said:

  • The companies' software may be misused to identify students who are violating school disciplinary rules. They cited a recent survey where 43% of teachers reported their schools employ the monitoring systems for this purpose, potentially increasing contact between police and students and worsening the school-to-prison pipeline.
  • The companies have not attempted to determine whether their products disproportionately target students of color, who already face harsher and more frequent school discipline, or other vulnerable groups, like LGBTQ youth.
  • Schools, parents and communities are not being appropriately informed of the use – and potential misuse – of the data. Three of the four companies indicated they do not directly alert students and guardians of their surveillance.

Warren and Markey concluded there is a dire "need for federal action to protect students' civil rights, safety and privacy."

"While the intent of these products, many of which monitor students' online activity around the clock, may be to protect student safety, they raise significant privacy and equity concerns," the lawmakers wrote. "Studies have highlighted unintended but harmful consequences of student activity monitoring software that fall disproportionately on vulnerable populations."

An FCC spokesperson said the agency is reviewing the senators' report, and an Education Department spokesperson said they "look forward to corresponding with the senators" about its findings.

Lawmakers' inquiry into the business practices of school security companies Gaggle, GoGuardian, Securly and Bark Technologies is the first congressional investigation into student surveillance tools, whose use grew dramatically during the pandemic when learning shifted online.

It follows on the heels of investigative reporting by The 74 into Gaggle, which uses artificial intelligence and a team of human content moderators to track the online behaviors of more than 5 million students. The 74 used public records to expose how Gaggle's algorithm and its hourly-wage workers sift through billions of student communications each year in search of references to violence and self-harm, subjecting youth to constant digital surveillance with steep implications for their privacy. Gaggle, whose tools track students on their school-issued Google and Microsoft accounts, reported a surge in business during the pandemic.

Bark didn't respond to requests for comment. Securly spokesman Josh Mukai said in a statement that the company is reviewing the senators' March 30 report and looks forward "to continuing our dialogue with Senators Warren and Markey on the important topics they have raised."

"Parents expect that schools will keep children safe while in the classroom, on a field trip or while riding on a bus," GoGuardian spokesman Jeff Gordon said in a statement. "Schools also have a responsibility to keep students safe in digital spaces and on school-issued devices."

Gaggle Founder and CEO Jeff Patterson submitted a statement after this article was published. He said the company is reviewing the lawmakers' recommendations "to assess how we can further strengthen our work to better protect students."

"We want to ensure our technology is effectively supporting student safety without creating unintended risks or harms," Patterson continued. "We have taken steps over the years to ensure effective privacy protections and mitigate bias in our platform, but welcome continued dialogue that will help make sure tools like Gaggle can continue to be used to support students and educators."

Bark Technologies CEO Brian Bason wrote in a letter to lawmakers that AI-driven technology could be used to solve the country's "terrible history of bias in school discipline" by removing the decisions of individual teachers and administrators.

"While any system, including AI-based solutions, inherently have some bias, if implemented correctly AI-based solutions can substantially reduce the bias that students face," Bason wrote.

As to the question of whether their surveillance exacerbates the school-to-prison pipeline, the companies' letters acknowledge that in certain cases they contact police to conduct welfare checks on students. Securly noted in its letter that in some instances, education leaders "prefer that we contact public safety agencies directly in lieu of a district contact."

Under the Clinton-era Children's Internet Protection Act, passed in 2000, public schools and libraries are required to filter and monitor students' internet use to ensure they don't access material "harmful to minors," such as pornography. Districts have cited the law to justify the adoption of AI-driven surveillance tools that have proliferated in recent years. Student privacy advocates argue the tools go far beyond the federal mandate and have called on the FCC to clarify the law's scope. Meanwhile, advocates have questioned whether schools' use of digital surveillance tools to monitor students at home violates Fourth Amendment protections against unreasonable searches and seizures.

In a recent survey by the nonprofit Center for Democracy and Technology, 81 percent of teachers said they used software to track students' computer activity, including to block obscene material or monitor their screens in real time. A majority of parents said they worried about student data getting shared with the police, and more than half of students said they decline to share their "true thoughts or ideas because I know what I do online is being monitored."

Elizabeth Laird, the group's director of equity in civic technology, said the organization has been calling on student surveillance companies to be more transparent about their business practices, but that it's "disappointing that it took a letter from Congress to get this information." She said she hopes the FCC and Education Department adopt lawmakers' recommendations.

"None of these companies have researched whether their products are biased against certain groups of students," she said in an email while questioning their justification for holding off on such an inquiry. "They cite privacy as the reason for not doing so while simultaneously monitoring students' messages, documents and sites visited 24 hours a day, seven days a week."

The 74's investigation, which used data on Gaggle's foothold in Minneapolis Public Schools, was unable to determine whether the tool's algorithm disproportionately targeted Black students, who are more often subjected to school discipline than their white classmates. However, it highlighted instances in which keywords like "gay" and "lesbian" were flagged, potentially subjecting LGBTQ youth to heightened surveillance for discussing their sexual orientation.

Amelia Vance, an attorney and student privacy expert, said she was intrigued that the companies pushed back on the idea that their tools are used to discipline students, since the federal monitoring requirement was meant to keep kids from consuming inappropriate content online, and students would likely face consequences for viewing violent or sexually explicit materials. She agreed the companies should research their algorithms for potential biases and would benefit from additional transparency.

However, Vance said in an email that FCC clarification "would do little at best and may provide counterproductive guidance at worst." Many schools, she said, are likely to use the tools regardless of the federal rules.

"Schools aren't required to monitor social media, and many have chosen to do so anyway," said Vance, the co-founder and president of Public Interest Privacy Consulting. Some school safety advocates are actively lobbying lawmakers to expand student monitoring requirements, she said.

Asking the FCC to issue guidance, she said, "could actually be counterproductive to the goal of limiting monitoring and ensuring more privacy protections for students since it is possible that the FCC could require a higher level of monitoring."

