New Report: School Shootings Spawned 'Digital Dystopia' of Student Surveillance
/article/new-report-school-shootings-spawned-digital-dystopia-of-student-surveillance/
Tue, 03 Oct 2023 18:48:00 +0000

Updated, Oct. 4

Reeled in by deceptive, fear-based marketing and an influx of federal cash, school leaders have purchased and pervasively deployed student surveillance tools while failing to consider the detrimental consequences for young people's civil rights, a new ACLU report concludes.

In a youth survey accompanying the report, a majority of students expressed worries that the tools, designed to keep them safe, could actually cause harm, and a third said they "always feel" like they're being watched.

The 61-page report, titled "Digital Dystopia," also offers an in-depth look at the rise of schools' reliance on surveillance technology over the last few decades, arguing the tools have failed to improve campus safety while subjecting students (particularly students of color and those who are undocumented, LGBTQ or from low-income households) to discrimination.




"The ed tech surveillance companies, after fanning the flames of fear, were making these broad statements about the efficacy of their products, about their ability to keep students safe" from threats like school shootings and suicide, despite a lack of evidence to back up their claims, report lead author and ACLU senior policy counsel Chad Marlow told 社区黑料.

Rather than making kids safe, Marlow said, the tools could be damaging to their development and well-being. "The harm is actually significant and, by not acknowledging the harms that are caused, there's less incentive to look at other interventions," he said.

(Chart: ACLU)

Three-quarters of students worry about at least one negative consequence of student surveillance, which includes the widespread proliferation of digital tools that monitor their online communications for references to sex, drugs, violence or self-harm, according to the online survey. Commissioned by the ACLU, the polling firm YouGov queried 502 teens throughout the country in October 2022. Nearly a quarter of respondents said that digital monitoring tools limit the resources they feel they can access online while a similar percentage worried the information collected about them could be shared with the police or be used against them in the future by a college or an employer. Some 27% feared the tools could be used for disciplinary purposes.

As a result, students alter their behaviors due to fears that "deviating from expectations is punishable in the world that they're growing up in," Marlow said. "What does that tell them about innovation or exploring new ideas?"

The survey findings echo a report released last month by the nonprofit Center for Democracy and Technology, which found that while a majority of parents and students still embrace digital tools that monitor students' online behaviors, their support has dwindled over the last year.

Both reports identified detrimental effects of digital surveillance that researchers said run counter to federal civil rights laws that protect students from discrimination based on race, disability, sexual orientation or gender identity. 

In the student survey conducted by the Center for Democracy and Technology, researchers found that while districts bought digital monitoring tools to keep students safe, the tools are used regularly for discipline and routinely bring youth in contact with the police. LGBTQ+ youth and those with disabilities were significantly more likely to experience the harms of surveillance. For example, 65% of LGBTQ+ youth said they or someone they knew got into trouble due to online activity monitoring, compared to 56% of their straight and cisgender peers. Meanwhile, nearly a third of LGBTQ+ students said that they or someone they know has been "outed" by the technology.

In the absence of rigorous, independent research on the efficacy of school surveillance tools to improve campus safety, the ACLU report argues that schools are left to make purchasing decisions based on what the group called fear-based marketing tactics. Security companies hype the risks of school violence and student self-harm while overstating the utility of their products, the report says. Security industry lobbying efforts, meanwhile, have successfully steered hundreds of millions of dollars in government school safety spending toward unproven technologies. 

"It would be like going to buy a car and the only source of information is the car salesperson," Marlow said. "That's probably not the best way to make a car purchasing decision, but that's what's happening with student surveillance."

The Security Industry Association, a trade group that represents security companies and lobbies on their behalf, didn't immediately respond to a request for comment.

The ACLU survey results suggest, however, that students have a complicated relationship with school surveillance: While recognizing its potential harms, many also believe it serves its intended purpose. Specifically, 40% of students reported that surveillance technology makes them feel "safe" and 43% said it makes them feel "protected." Meanwhile, just 14% said it makes them feel "anxious" and a fraction of respondents, 7%, said the tools made them feel "unsafe."

Marlow said this support may be the result, at least in part, of successful marketing and a belief that few other options exist. 

"When you talk about keeping students safe, I think students are smart enough to realize that in too many places in this country, gun control is off the table," he said. "Because of the dominance of money and power of the ed tech surveillance industry" that's used in marketing and lobbying, "the discussion is almost entirely centered around, 'Do we use or do we not use student surveillance technologies?'" while alternatives like mental health screenings fail to receive similar consideration. "In that option, between a highly questionable, harmful protection or nothing at all, no one wants to pick nothing at all."

While the report focuses largely on digital tools that monitor students' behaviors online, it also questions the efficacy of surveillance cameras in creating physical safety for students in schools. Cameras had become nearly ubiquitous by the 2019-20 school year, according to the most recent data included in a U.S. Department of Education report released last month.

Meanwhile, just 55% of schools offered students mental health assessments, according to the most recent federal data, and 42% offered mental health treatment services. 

Despite a sharp rise in schools' reliance on surveillance and other tools in the last two decades, the number of school shootings has grown.

There were a record 188 school shootings resulting in injuries or deaths in the 2021-22 school year, according to the federal report. That's twice as many shootings on campus as the previous record, which was set just one year earlier. Placing security cameras in schools, Marlow argues, has failed to deter the very crimes they were installed to prevent. In an ACLU analysis of the 10 deadliest school shootings of the last two decades, for example, researchers found that surveillance cameras were present for eight, including in Parkland, Florida, and Uvalde, Texas.

Along with scrutiny from researchers and civil rights groups, schools' use of digital monitoring tools has led to several lawsuits alleging they're ineffective and violate students' civil liberties.

In one class-action lawsuit, filed this year in California, the parents of two students claim the student surveillance company Securly collected their children's data and sold the information to targeted advertising vendors without their knowledge or consent.

A separate federal negligence lawsuit, filed in 2021 in Oklahoma, accuses the student surveillance company Gaggle of being ineffective at keeping kids safe from self-harm. The lawsuit, filed by the parents of a 15-year-old boy who died by suicide, accuses the company and the state's third-largest school district of failing to act on warning signs that could have prevented the teenager's 2019 death.

The student submitted a "personal odyssey" essay in his freshman English class that was riddled with references to self-harm and suicide, but his teacher failed to act, the complaint alleges, giving him a grade of 100%. The district used Gaggle to identify and flag troubling student digital communications, including references to self-harm and suicide. Yet the lawsuit alleges the company "failed to notify school administration" about the student's warning signs, including the essay titled "Running Out of Reasons" and an email with a classmate in which the two contemplated a plan to "go out at the same time."

A Gaggle spokesperson didn't immediately respond to a request for comment. Securly spokesperson Josh Mukai called the lawsuit "baseless and uninformed."

"Securly has never sold student data to third parties, nor have we ever used student data to target advertisements," Mukai said in an email. "Securly's suite of student safety solutions upholds the highest standards for student data privacy and complies with all international, federal and state privacy regulations."

ChatGPT Is Landing Kids in the Principal's Office, Survey Finds
/article/chatgpt-is-landing-kids-in-the-principals-office-survey-finds/
Wed, 20 Sep 2023 04:01:00 +0000

Ever since ChatGPT burst onto the scene last year, a heated debate has centered on its potential benefits and pitfalls for students. As educators worry students could use artificial intelligence tools to cheat, a new survey makes clear its impact on young people: They're getting into trouble.

Half of teachers say they know a student at their school who was disciplined or faced negative consequences for using, or being accused of using, generative artificial intelligence like ChatGPT to complete a classroom assignment, according to a new survey from the Center for Democracy and Technology, a nonprofit think tank focused on digital rights and expression. The proportion was even higher, at 58%, for those who teach special education.

Cheating concerns were clear, with survey results showing that teachers have grown suspicious of their students. Nearly two-thirds of teachers said that generative AI has made them "more distrustful" of students and 90% said they suspect kids are using the tools to complete assignments. Yet students themselves who completed the anonymous survey said they rarely use ChatGPT to cheat, but are turning to it for help with personal problems.




"The difference between the hype cycle of what people are talking about with generative AI and what students are actually doing, there seems to be a pretty big difference," said Elizabeth Laird, the group's director of equity in civic technology. "And one that, I think, can create an unnecessarily adversarial relationship between teachers and students."

Indeed, 58% of students, and 72% of those in special education, said they've used generative AI during the 2022-23 academic year, just not primarily for the reasons that teachers fear most. Among youth who completed the nationally representative survey, just 23% said they used it for academic purposes and 19% said they've used the tools to help them write and submit a paper. Instead, 29% reported having used it to deal with anxiety or mental health issues, 22% for issues with friends and 16% for family conflicts.

Part of the disconnect dividing teachers and students, researchers found, may come down to gray areas. Just 40% of parents said they or their child were given guidance on ways they can use generative AI without running afoul of school rules. Only 24% of teachers say they've been trained on how to respond if they suspect a student used generative AI to cheat.

(Chart: Center for Democracy and Technology)

The results on ChatGPT's educational impacts were included in the Center for Democracy and Technology's broader annual survey analyzing the privacy and civil rights concerns of teachers, students and parents as tech, including artificial intelligence, becomes increasingly ingrained in classroom instruction. Beyond generative AI, researchers observed a sharp uptick in digital privacy concerns among students and parents over the last year.

Among parents, 73% said they're concerned about the privacy and security of student data collected and stored by schools, a considerable increase from the 61% who expressed those reservations last year. A similar if less dramatic trend was apparent among students: 62% had data privacy concerns tied to their schools, compared with 57% just a year earlier.

(Chart: Center for Democracy and Technology)

Those rising levels of anxiety, researchers theorized, are likely the result of the growing frequency of cyberattacks on schools, which have become a primary target for ransomware gangs. High-profile breaches, including in Los Angeles and Minneapolis, have compromised a massive trove of highly sensitive student records. Exposed records, investigative reporting by 社区黑料 has found, include student psychological evaluations, reports detailing campus rape cases, student disciplinary records, closely guarded files on campus security, employees鈥 financial records and copies of government-issued identification cards. 

Survey results found that students in special education, whose records are among the most sensitive that districts maintain, and their parents were significantly more likely than the general education population to report school data privacy and security concerns. As attacks ratchet up, 1 in 5 parents say they've been notified that their child's school experienced a data breach. Such breach notices, Laird said, led to heightened apprehension.

"There's not a lot of transparency" about school cybersecurity incidents "because there's not an affirmative reporting requirement for schools," Laird said. But in instances where parents are notified of breaches, "they are more concerned than other parents about student privacy."

Parents and students have also grown increasingly wary of another set of education tools that rely on artificial intelligence: digital surveillance technology. Among them are student activity monitoring tools, such as those offered by the for-profit companies Gaggle and GoGuardian, which rely on algorithms in an effort to keep students safe. The surveillance software employs artificial intelligence to sift through students' online activities and flag school administrators, and sometimes the police, when it discovers materials related to sex, drugs, violence or self-harm.

Among parents surveyed this year, 55% said they believe the benefits of activity monitoring outweigh the potential harms, down from 63% last year. Among students, 52% said they're comfortable with academic activity monitoring, a decline from 63% last year.

Such digital surveillance, researchers found, frequently has disparate impacts on students based on their race, disability, sexual orientation and gender identity, potentially violating longstanding federal civil rights laws. 

The tools also extend far beyond the school realm, with 40% of teachers reporting their schools monitor students鈥 personal devices. More than a third of teachers say they know a student who was contacted by the police because of online monitoring, the survey found, and Black parents were significantly more likely than their white counterparts to fear that information gleaned from online monitoring tools and AI-equipped campus surveillance cameras could fall into the hands of law enforcement. 

(Chart: Center for Democracy and Technology)

Meanwhile, as states nationwide pull literature from school library shelves amid a conservative crusade against LGBTQ+ rights, the nonprofit argues that digital tools that filter and block certain online content "can amount to a digital book ban." Nearly three-quarters of students, and disproportionately LGBTQ+ youth, said that web filtering tools have prevented them from completing school assignments.

The nonprofit highlights how disproportionalities identified in the survey could run counter to federal laws that prohibit discrimination based on race and sex, and those designed to ensure equal access to education for children with disabilities. In a letter sent Wednesday to the White House and Education Secretary Miguel Cardona, the Center for Democracy and Technology was joined by a coalition of civil rights groups urging federal officials to take a harder tack on ed tech practices that could threaten students鈥 civil rights. 

"Existing civil rights laws already make schools legally responsible for their own conduct, and that of the companies acting at their direction in preventing discriminatory outcomes on the basis of race, sex and disability," the coalition wrote. "The department has long been responsible for holding schools accountable to these standards."


Trevor Project Severs Ties with Surveillance Company Accused of LGBTQ Youth Bias
/article/trevor-project-teams-upith-student-surveillance-company-accused-of-lgbtq-bias/
Fri, 30 Sep 2022 11:00:00 +0000

Updated 3:15 p.m. ET

Hours after the publication of this article Friday, The Trevor Project announced in a tweet it would return a $25,000 donation from the student surveillance company Gaggle, acknowledging widespread concerns about the monitoring tool's "role in negatively impacting LGBTQ students."

"Our philosophy is that having a seat at the table enables us to positively influence how companies engage with LGBTQ young people, and we initially agreed to work with Gaggle because we saw an opportunity to have a meaningful impact to better protect LGBTQ students," the nonprofit said in the statement. "We hear and understand the concerns, and we hope to work alongside schools and institutions to ensure they are appropriately supporting LGBTQ youth and their mental health."

The move came after widespread condemnation on social media, with multiple supporters threatening to pull their donations to The Trevor Project moving forward. 

In a Friday statement, Gaggle spokesperson Paget Hetherington said the company wanted The Trevor Project's "guidance on how to do what we do better." The company also removed language from its website where it previously touted the partnership.

"We're disappointed that The Trevor Project has decided to pause our collaboration," she said. "However, we are grateful for the opportunity we have had to learn and work with them and will continue with our mission of protecting all students regardless of how they identify."

Original report below:

Amid warnings from lawmakers and civil rights groups that digital surveillance tools could discriminate against at-risk students, a leading nonprofit devoted to the mental well-being of LGBTQ youth has formed a financial partnership with a tech company that subjects them to persistent online monitoring. 

The Trevor Project, a high-profile nonprofit focused on suicide prevention among LGBTQ youth, recently began to list Gaggle as a corporate partner on its website, disclosing that the controversial surveillance company had given it between $25,000 and $50,000 in support. Meanwhile Gaggle, which uses artificial intelligence and human content moderators to sift through billions of student chat messages and homework assignments each year in search of students who may harm themselves or others, announced the partnership on its own site, noting the two were collaborating to "improve mental health outcomes for LGBTQ young people."

Though the precise contours of the partnership remain unclear, a Trevor Project spokesperson said it aims to have a positive influence on the way Gaggle navigates privacy concerns involving LGBTQ youth while a Gaggle representative said the company sees the relationship as a learning opportunity.

Both groups maintain that the partnership was forged in the interests of LGBTQ students, but student privacy advocates argue the relationship could undermine The Trevor Project's work while allowing Gaggle to use the donation to counter criticism about its potential harms to LGBTQ students. The collaboration comes at a particularly perilous time for many students as a rash of states implement new anti-LGBTQ laws that could erode their privacy and expose them to legal jeopardy.

Teeth Logsdon-Wallace, a 14-year-old student from Minneapolis with first-hand experience of Gaggle's surveillance dragnet, said the deal could eliminate any motivation for Gaggle to change its business practices.

"It really does feel like a 'We paid you, now say we're fine,' kind of thing," said Logsdon-Wallace, who is transgender. Without any real incentives to implement reforms, he said that Gaggle's "seal of approval" from The Trevor Project could offer the privately held company reputational cover amid growing concerns that such surveillance tech is disproportionately harmful to LGBTQ youth.

"People who want to defend Gaggle can just point to their little Trevor Project thing and say, 'See, they have the support of "The Gays" so it's fine actually,' and all it does is make it easier to deflect and defend actual issues with Gaggle."

A screenshot showing that Gaggle is a corporate partner of The Trevor Project
Student surveillance company Gaggle is listed among "Corporate Partners" on The Trevor Project's website (screenshot)

社区黑料 has previously investigated Gaggle's monitoring practices. Gaggle's algorithm relies on keyword matching to compare students' online communications against a dictionary of thousands of words the company believes could indicate potential trouble, including references to violence, drugs and sex. Among the keywords are "gay" and "lesbian," verbiage the company maintains is necessary because LGBTQ youth are more likely than their straight and cisgender peers to consider suicide.
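The keyword-matching approach described above can be illustrated with a minimal sketch. This is hypothetical code written for this article, not Gaggle's actual dictionary, algorithm or API; the term list and matching rules are invented for illustration.

```python
import re

# Hypothetical sample terms; a real monitoring product reportedly
# maintains a dictionary of thousands of words and phrases.
FLAGGED_TERMS = ["suicide", "self-harm", "overdose"]

def flag_terms(text: str) -> list[str]:
    """Return the flagged terms that appear as whole words in the text."""
    lowered = text.lower()
    hits = []
    for term in FLAGGED_TERMS:
        # Word boundaries (\b) stop the term from matching inside a longer
        # word, but the match is still context-blind: a reflective essay
        # and a crisis message containing the same word flag identically.
        if re.search(r"\b" + re.escape(term) + r"\b", lowered):
            hits.append(term)
    return hits

print(flag_terms("An essay reflecting on a past suicide attempt and recovery"))
```

The sketch makes the critics' core complaint concrete: the comparison is purely lexical, so a student's recovery essay, a health-class assignment and a genuine cry for help all trigger the same flag.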

But privacy and civil rights advocates have accused the company of discrimination for subjecting LGBTQ youth to heightened surveillance, a concern that has taken on new meaning this year as states like Florida adopt laws that ban classroom discussions about sexuality and could force schools to out LGBTQ youth to their parents.

A survey by the nonprofit Center for Democracy and Technology found that while Gaggle and similar student monitoring tools are designed to keep students safe, teachers reported that they were more often used to discipline them. LGBTQ youth were disproportionately affected.

In a statement, a Trevor Project spokesperson said it's important that digital monitoring tools keep students safe without invading their privacy and that the collaboration was built on Gaggle's "desire to identify and address privacy and safety concerns that their product could cause for LGBTQ students."

"It's true that LGBTQ youth are among the most vulnerable to the misuse of this kind of safety monitoring; many worry that these tools could out them to teachers or parents against their will," the statement continued. "It is because of that very real concern that we have worked in a limited capacity with digital safety companies, to play an educational role and have a seat at the table so they can consider these potential risks while they design their products and develop policies."

But it remains unclear what policy changes have occurred at Gaggle as a result of the deal. Without offering any specifics, Gaggle spokesperson Paget Hetherington said in a statement the company is "honored to be able to align with The Trevor Project to better serve LGBTQ youth," and that the company is "always looking for ways to learn and to improve upon what we do to better support students and keep them safe."

'Faceless bureaucracy'

At its core, the partnership between Gaggle and The Trevor Project makes sense because both work to prevent youth suicides, said Amelia Vance, the founder and president of a student privacy organization. But their approaches to solving the problem, she said, are fundamentally different.

By combing through digital materials on students' school-issued Microsoft and Google accounts, Gaggle seeks to alert educators, and in some cases the police, to students' online behaviors that suggest they might harm themselves or others.

"It really is about collecting details that kids may not be voluntarily sharing — information that they may be looking up to learn, to explore their identities, to otherwise help them in their day-to-day lives," Vance said. At The Trevor Project, "you have proactive outreach from youth who know that they need help or they need a community."

Katy Perry smiles in front of a Trevor Project background, holding a poster that says "Be proud of who you are."
Katy Perry poses for a photograph during a fundraising event for The Trevor Project in 2012. (Mark Davis/Getty Images for Trevor Project)

The West Hollywood-based Trevor Project, which draws funding from corporate partners including Macy's and AT&T, was founded in 1998. Gaggle, founded in 1999, does not publicly report its finances. The Dallas-based company says it monitors the digital communications of more than 5 million students across more than 1,500 school districts nationally.

The Trevor Project uses technology to train volunteer crisis counselors and assess the risk levels of people who reach out to it for help. If counselors with The Trevor Project believe a student is at imminent suicide risk, they can call the police. But it's ultimately up to youth to decide which information they share with adults.

It's important for LGBTQ students to have trusted adults in whom they can confide their experiences, Vance said, rather than a system where "some faceless bureaucracy is finding out and informing your parents" about information they intended to keep private.

A recent survey by The Trevor Project offers troubling data about the realities of the youth suicide crisis. Nearly half of LGBTQ youth said they seriously considered attempting suicide in the past year and 14% said they made a suicide attempt.

This isn't the first time The Trevor Project has faced scrutiny in recent months for its ties to companies that could have detrimental effects on LGBTQ youth. In July, a HuffPost investigation revealed that CEO and Executive Director Amit Paley previously worked as a consultant who helped create a strategic plan to boost opioid sales amid an addiction epidemic, one that has been linked to a rise in suicide attempts among LGBTQ youth.

The group knows firsthand how data can be weaponized. Just last month, online groups that target the transgender community launched a campaign to clog up The Trevor Project's suicide prevention hotline.

Persistent student surveillance could exacerbate the challenges that LGBTQ youth face by subjecting them to disproportionate discipline and erroneously flagging their online communications as threats, Democratic Sens. Elizabeth Warren and Ed Markey warned in an April report.

Nearly a third of LGBTQ students say they or someone they know has experienced the nonconsensual disclosure of their sexual orientation or gender identity, typically called "outing," due to student activity monitoring, according to a survey by the nonprofit Center for Democracy and Technology. They were also more likely than their straight and cisgender peers to report getting into trouble at school and being contacted by the police about having committed a crime.

A bar chart showing LGBTQ+ students are more likely to get in trouble for visiting a website or saying something inappropriate online; were more likely to be contacted by counselors or other adults at school about their mental health; and were more likely to be contacted by a police officer or other adult due to concerns about them committing a crime.
A recent survey by the nonprofit Center for Democracy and Technology found that student monitoring tools have disproportionate negative effects on LGBTQ youth. (Center for Democracy and Technology) 

In response to the survey results, a coalition of civil rights groups called on the U.S. Education Department to condemn the use of activity monitoring tools that violate students' civil liberties and to state its intent "to take enforcement action against violations that result in discrimination." The letter argues that using the tools to out LGBTQ students or to subject them to disproportionate discipline and criminal investigations could violate Title IX, the federal law prohibiting sex-based discrimination in schools.

Among the letter signatories is the nonprofit LGBT Tech, which has raised concerns about the harms of digital surveillance on LGBTQ people. Christopher Wood, the group's co-founder and executive director, said The Trevor Project's partnership with Gaggle could be positive if it's used to ensure that LGBTQ youth who are struggling have access to help. But once Gaggle gives student information to school administrators, the company can no longer control how those records are used, he said.

A screenshot from Gaggle's website. Gray box with text that says Gaggle is a Proud Sponsor of The Trevor Project.
Gaggle says on its website that the student surveillance company "is proud to collaborate with The Trevor Project and improve mental health outcomes for LGBTQ young people." (Screenshot)

"If that information is provided to someone who is not accepting, who has very different views and who willfully brings their political, personal or religious views into the school system, and they are not supportive of LGBTQ youth, then what they've done is harm the student," Wood said.

Yet as schools increasingly turned to student activity monitoring software during the pandemic, The Trevor Project portrayed the tools' growth as an inevitable result of districts seeking "to avoid liability issues."

"It is our stance that since these tools are not going anywhere, we think it's important to do our part to offer our expertise around LGBTQ experiences," the spokesperson said.

A student holds up a peace sign with one hand and has the other wrapped around his dog
Minneapolis student Teeth Logsdon-Wallace poses with his dog Gilly. (Photo courtesy Alexis Logsdon)

The power of trust

In interviews, students flagged by Gaggle said their trust in adults suffered as a result. Among them is Logsdon-Wallace, the 14-year-old transgender student. Before the Minneapolis school district stopped using Gaggle this summer and state lawmakers put strict limits on digital surveillance in schools, the tool alerted district security when he used a classroom assignment to reflect on a previous suicide attempt and how music therapy helped him cope. That same assignment, which included references to his gender identity, was flagged to his parents. 

And while his parents are affirming, he has friends who live in less supportive environments.

"I have friends who are queer and/or trans who are out at school but not to their parents," he said. "If they want to be open with teachers, Gaggle can create a bad or even dangerous situation for these kids if their parents were contacted about what they were saying."

In The Trevor Project's recent survey, nearly three-quarters of LGBTQ youth reported that they have endured discrimination based on their sexual orientation or gender identity, while just 37% said their homes are affirming and 55% said the same about their schools.

Given that reality, relatively few reported sharing information about their sexual orientation with teachers or guidance counselors.

While Gaggle has maintained that keywords like "gay" and "lesbian" can also help prevent bullying, Logsdon-Wallace said that approach is out of touch with how students generally interact. At school, he said, he's been called just about every "slur for a queer or a trans person that isn't from like 80 years ago." While slurs are common, terms like "lesbian" are not.

鈥淎s an actual teenager going to an actual public school, those words are not being used to bully people,鈥 he said. 鈥淭hey鈥檙e just not.鈥


Minneapolis Schools to Halt Controversial Student Surveillance Initiative
/article/minneapolis-schools-to-halt-controversial-student-surveillance-initiative/ Mon, 27 Jun 2022 19:56:23 +0000

The Minneapolis school district has announced plans to end its relationship with Gaggle, a controversial digital surveillance tool that monitored students' online behaviors during pandemic-induced remote learning.

The announcement, which follows extensive reporting by The 74 about how the tool subjected the city's youth to pervasive round-the-clock digital surveillance, was outlined last week at the bottom of a newsletter alerting families to changes at the district. Gaggle, which uses artificial intelligence and human content moderators to track students' online activities and notify district officials of "inappropriate behaviors or potential threats to self or others," will no longer be used beginning on July 1, the district announced.

A week after schools went remote in Minneapolis and nationally in March 2020, the district sidestepped typical procurement rules and used federal pandemic relief money to contract with Gaggle, a for-profit company that reported significant business growth when classes went online. The district has spent more than $355,000 on the tool, which monitors student behaviors on school-issued Google and Microsoft accounts, and has a contract with the company through September 2023. 

District officials said the tool saved lives but civil rights advocates and students targeted by the program have questioned its efficacy and accused the company of violating students鈥 privacy rights. 

In an email, district spokesperson Julie Schultz Brown said the change was "made in order to honor the terms of our new contract" with educators. Gaggle founder and CEO Jeff Patterson said the Minneapolis district will stop using the tool at a moment when "students across the United States are suffering." In June, the company alerted Minneapolis officials to 15 "critical incidents" related to suicide, death threats, violence and drug use, Patterson wrote in a statement. Nationally, the pandemic has led to a surge in youth mental health issues.

A recent report by Democratic Sens. Elizabeth Warren and Ed Markey warned that Gaggle and similar services could surveil students inappropriately, compound racial disparities in school discipline and waste tax dollars. Gaggle claims it saved students' lives during the 2020-21 school year, yet independent research on the tool's effectiveness doesn't exist.

Minneapolis student Teeth Logsdon-Wallace poses with his dog Gilly. (Photo courtesy Alexis Logsdon)

Teeth Logsdon-Wallace, a rising freshman in Minneapolis, saw the district's decision to cut ties with Gaggle as a major victory. He became an outspoken Gaggle critic after a homework assignment, which discussed a previous suicide attempt and how he learned important coping skills, got flagged by the tool's surveillance dragnet. Officials at Gaggle and the district said the tool helps identify students who are struggling emotionally and need adult intervention. But 14-year-old Logsdon-Wallace and other critics argue that digital surveillance is an inappropriate way to pinpoint students who need mental health care. Rather than helping, he said, the experience "felt violating and gross."

"When you're spying on kids and their stuff, especially about mental health stuff, they're just going to be more secretive about it," he said. "That can just cause more danger."

While Gaggle relies on technology to ferret out students with issues like depression, Logsdon-Wallace said that he and other students are more likely to share their mental health struggles with adults at school if there's a culture of trust. Monitoring communications through an algorithm and a team of low-paid remote workers who the students don't even know, he said, had the opposite effect and left students more apprehensive about district computers, "which could be positive and negative."

While his peers learned how to better protect their own privacy online "even when it's inherently being violated," he said, he worried that some may have been "bottling up mental health issues because of it."

The district will no longer use Gaggle's student activity monitoring tool or the company's anonymous tip line, SpeakUp for Safety, which allows students to report potential safety threats confidentially. Instead of turning to SpeakUp, concerned parents and students should report issues to police officials with the state Bureau of Criminal Apprehension, the district wrote in its newsletter.

District officials have said the anonymous tip line was central to the district's decision to contract with Gaggle, yet previous reporting by The 74 found that the service was rarely used. Meanwhile, the digital surveillance tool routinely flagged students who made references to sex, drugs and violence on district technology. An analysis of nearly 1,300 alerts found the service flagged Minneapolis students for discussing violent impulses, eating disorders, abuse at home and suicidal plans.

But Gaggle regularly flagged benign student chatter and personal files, including classroom assignments, casual conversations between teens and sensitive journal entries. Gaggle flags students who use keywords related to sexual orientation including "gay" and "lesbian," and on at least one occasion school officials in Minneapolis outed an LGBT student to their parents. The sheer volume of student communications that got flagged by Gaggle was at times overwhelming, the Minneapolis school district's head of security acknowledged, but he also felt like he was able to save students from dying by suicide.

In interviews with The 74, former content moderators at Gaggle, hundreds of whom are paid just $10 an hour on month-to-month contracts, raised serious questions about the company's efficacy, its employment practices and its effect on students' civil rights.

Moderators said they received little training before they were given access to students' sensitive materials and were pressured to prioritize speed over quality. They also reported insufficient safeguards to protect students' sensitive files, including nude selfies. Patterson acknowledged that moderators, who work remotely with little supervision or oversight, could easily save copies of students' nude photographs and share them on the dark web.

As a transgender teenager who believes the school district has done too little to address bullying, Logsdon-Wallace said he already had little trust in district leaders. While Gaggle didn't address the abuse from peers, having his sensitive experiences caught in the company's algorithm made the situation worse.

"The very little trust I had in the administration is just destroyed," he said. "You can't expect students to trust you if you've done nothing to earn that trust."

Philly Schools Are Screening Middle Schoolers for Weapons
/article/philly-schools-are-screening-middle-schoolers-for-weapons/ Sat, 14 May 2022 12:31:00 +0000

The School District of Philadelphia is now periodically screening students for weapons in an effort to combat gun violence.

Effective Monday, 6th through 8th grade students will be subject to the screenings, which will happen in the mornings when students first come to school. The screenings will be done at six schools per day and will be conducted at every middle school and at elementary schools with middle grades. The district was already conducting screenings at high schools.




The measure comes as Philadelphia is experiencing an uptick in gun violence.

"We understand that there are mixed emotions about this," district spokesperson Monica Lewis said of the move.

"There are some who are in favor of it because they know that it's an effort to keep students and staff safe, and we know that there are some who aren't in favor of it because they feel like it's intrusive. We understand that and we respect their opinion, but we hope that they understand that anything that we are doing is with the safety and the well-being of the students and staff in mind," Lewis said.

The screenings will be done throughout the remainder of the school year, which ends on June 11.

The screenings are being conducted by a team of School Safety personnel and will include searches by a hand wand or metal detector and a physical check of all bags, backpacks and personal items.

The district said any student in possession of a firearm will be detained by School Safety and referred to the Philadelphia Police Department.

Lewis said the district will not be commenting on whether any weapons were found during the process.

The district said it is giving students the opportunity to dispose of any illegal or inappropriate items prior to being screened without consequence.

Ayana Jones is a reporter for the Philadelphia Tribune.

Pennsylvania Capital-Star is part of States Newsroom, a network of news bureaus supported by grants and a coalition of donors as a 501c(3) public charity. Pennsylvania Capital-Star maintains editorial independence. Contact Editor John Micek with questions: info@penncapital-star.com.

Meet the Gatekeepers of Students' Private Lives
/article/meet-the-gatekeepers-of-students-private-lives/ Mon, 02 May 2022 11:15:00 +0000

If you are in crisis, please call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255), or contact the Crisis Text Line by texting TALK to 741741.

Megan Waskiewicz used to sit at the top of the bleachers, rest her back against the wall and hide her face behind the glow of a laptop monitor. While watching one of her five children play basketball on the court below, she knew she had to be careful. 

The mother from Pittsburgh didn't want other parents in the crowd to know she was also looking at child porn.

Waskiewicz worked as a content moderator for Gaggle, a surveillance company that monitors the online behaviors of some 5 million students across the U.S. on their school-issued Google and Microsoft accounts. Through an algorithm designed to flag references to sex, drugs and violence, and a team of content moderators like Waskiewicz, the company sifts through billions of students' emails, chat messages and homework assignments each year. Their work is supposed to ferret out evidence of potential self-harm, threats or bullying, incidents that would prompt Gaggle to notify school leaders.

As a result, kids' deepest secrets, like nude selfies and suicide notes, regularly flashed onto Waskiewicz's screen. Though she felt "a little bit like a voyeur," she believed Gaggle helped protect kids. But mostly, the low pay, the fight for decent hours, inconsistent instructions and stiff performance quotas left her feeling burned out. Gaggle's moderators face pressure to review 300 incidents per hour, and Waskiewicz knew she could get fired on a moment's notice if she failed to distinguish mundane chatter from potential safety threats in a matter of seconds. She lasted about a year.

"In all honesty, I was sort of half-assing it," Waskiewicz admitted in an interview with The 74. "It wasn't enough money and you're really stuck there staring at the computer reading and just click, click, click, click."

Content moderators like Waskiewicz, hundreds of whom are paid just $10 an hour on month-to-month contracts, are on the front lines of a company that claims it saved the lives of 1,400 students last school year and argues that the growing mental health crisis makes its presence in students' private affairs essential. Gaggle founder and CEO Jeff Patterson has warned about "a tsunami of youth suicide headed our way" and said that schools have "a moral obligation to protect the kids on their digital playground."

Eight former content moderators at Gaggle shared their experiences for this story. While several believed their efforts in some cases did shield kids from serious harm, they also surfaced significant questions about the company's efficacy, its employment practices and its effect on students' civil rights.

Among the moderators who worked on a contractual basis, none had prior experience in school safety, security or mental health. Instead, their employment histories included retail work and customer service, but they were drawn to Gaggle while searching for remote jobs that promised flexible hours. 

They described an impersonal and cursory hiring process that appeared automated. Former moderators reported submitting applications online and never having interviews with Gaggle managers, whether in person, on the phone or over Zoom, before landing jobs.

Once hired, moderators reported insufficient safeguards to protect students' sensitive data, a work culture that prioritized speed over quality, scheduling issues that sent them scrambling to get hours and frequent exposure to explicit content that left some traumatized. Contractors lacked benefits including mental health care, and one former moderator said he quit after repeated exposure to explicit material that so disturbed him he couldn't sleep and without "any money to show for what I was putting up with."

Gaggle's content moderation workforce encompasses as many as 600 contractors at any given time; just two dozen moderators work as employees with access to benefits and on-the-job training that lasts several weeks. Gaggle executives have sought to downplay contractors' role with the company, arguing they use "common sense" to distinguish false flags generated by the algorithm from potential threats and do "not require substantial training."

While the experiences reported by Gaggle's moderator team resemble those of content reviewers at platforms like Meta-owned Facebook, Patterson said his company relies on "U.S.-based, U.S.-cultured reviewers as opposed to outsourcing that work to India or Mexico or the Philippines." He rebuffed former moderators who said they lacked sufficient time to consider the severity of a particular item.

"Some people are not fast decision-makers. They need to take more time to process things and maybe they're not right for that job," he told The 74. "For some people, it's no problem at all. For others, their brains don't process that quickly."

Executives also sought to minimize the contractors' access to students' personal information; a spokeswoman said they only see "small snippets of text" and lacked access to what's known as students' "personally identifiable information." Yet former contractors described reading lengthy chat logs, seeing nude photographs and, in some cases, coming upon students' names. Several former moderators said they struggled to determine whether something should be escalated as harmful due to "gray areas," such as whether a Victoria's Secret lingerie ad would be considered acceptable or not.

"Those people are really just the very, very first pass," Gaggle spokeswoman Paget Hetherington said. "It doesn't really need training, it's just like if there's any possible doubt with that particular word or phrase it gets passed on."

Molly McElligott, a former content moderator and customer service representative, said management was laser-focused on performance metrics, appearing more interested in business growth and profit than protecting kids.

"I went into the experience extremely excited to help children in need," McElligott wrote in an email. Unlike the contractors, McElligott was an employee at Gaggle, where she worked for five months in 2021 before taking a position at the Manhattan District Attorney's Office in New York. "I realized that was not the primary focus of the company."

Gaggle is part of a burgeoning campus security industry that's seen significant business growth in the wake of mass school shootings as leaders scramble to prevent future attacks. Patterson, who founded the company in 1999, said its focus now is mitigating the youth mental health crisis.

Patterson said the team talks about "lives saved" and child safety incidents at every meeting, and they are open about sharing the company's financial outlook so that employees "can have confidence in the security of their jobs."

Content moderators work at a Facebook office in Austin, Texas. Unlike the social media giant, Gaggle's content moderators work remotely. (Ilana Panich-Linsman / Getty Images)

'We are just expendable'

Facing new federal scrutiny along with three other companies that monitor students online, Gaggle has said it relies on a "highly trained content review team" to analyze student materials and flag safety threats. Yet former contractors, who make up the bulk of Gaggle's content review team, described their training as "a joke" consisting of a slideshow and an online quiz that left them ill-equipped to complete a job with such serious consequences for students and schools.

As an employee on the company's safety team, McElligott said she underwent two weeks of training, but the disorganized instruction meant she and other moderators were "more confused than when we started."

Former content moderators have also flocked to employment websites like Indeed.com to warn job seekers about their experiences with the company, often sharing reviews that resembled the former moderators' feedback to The 74.

"If you want to be not cared about, not valued and be completely stressed/traumatized on a daily basis this is totally the job for you," one reviewer wrote on Indeed. "Warning, you will see awful awful things. No they don't provide therapy or any kind of support either.

"That isn't even the worst part," the reviewer continued. "The worst part is that the company does not care that you hold them on your backs. Without safety reps they wouldn't be able to function, but we are just expendable."

As the first layer of Gaggle's human review team, contractors analyze materials flagged by the algorithm and decide whether to escalate students' communications for additional consideration. Designated employees on Gaggle's Safety Team are in charge of calling or emailing school officials to notify them of troubling material identified in students' files, Patterson said.

Gaggle's staunchest critics have questioned the tool's efficacy and describe it as a student privacy nightmare. In March, Democratic Sens. Elizabeth Warren and Ed Markey pressed Gaggle and similar companies to protect students' civil rights and privacy. In a report, the senators said the tools could surveil students inappropriately, compound racial disparities in school discipline and waste tax dollars.

The information shared by the former Gaggle moderators with The 74 "struck me as the worst-case scenario," said attorney Amelia Vance, the co-founder and president of Public Interest Privacy Consulting. Content moderators' limited training and vetting, as well as their lack of backgrounds in youth mental health, she said, "is not acceptable."

In its response to lawmakers, Gaggle described a two-tiered review procedure but didn't disclose that low-wage contractors were the first line of defense. CEO Patterson told The 74 they "didn't have nearly enough time" to respond to lawmakers' questions about their business practices and didn't want to divulge proprietary information. Gaggle uses a third party to conduct criminal background checks on contractors, Patterson said, but he acknowledged they aren't interviewed before getting placed on the job.

"There's a lot of contractors. We can't do a physical interview of everyone and I don't know if that's appropriate," he said. "It might actually introduce another set of biases in terms of who we hire or who we don't hire."

'Other eyes were seeing it'

In a previous investigation, The 74 analyzed a cache of public records to expose how Gaggle's algorithm and content moderators subject students to relentless digital surveillance long after classes end for the day, extending schools' authority far beyond their traditional powers to regulate speech and behavior, including at home. Gaggle's algorithm relies largely on keyword matching and gives content moderators a broad snapshot of students' online activities including diary entries, classroom assignments and casual conversations between students and their friends.
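To see why keyword matching casts such a wide net, it can be sketched in a few lines. The following is a hypothetical illustration of the general technique only; the flagged terms and function are invented for this example, not Gaggle's actual code or word list:

```python
# Hypothetical sketch of naive keyword-based flagging, the general
# technique described above -- not Gaggle's actual implementation.
FLAGGED_KEYWORDS = {"suicide", "weapon", "drugs"}  # invented example terms

def flag_text(text: str) -> set[str]:
    """Return any flagged keywords appearing in a piece of student text."""
    # Split on whitespace, strip surrounding punctuation, lowercase,
    # then intersect with the watch list.
    words = {w.strip(".,!?;:\"'()").lower() for w in text.split()}
    return FLAGGED_KEYWORDS & words

# A literature essay trips the same filter as a genuine cry for help:
print(flag_text("My essay discusses suicide in Romeo and Juliet."))
```

Because the match is on isolated words with no sense of context, a book report that mentions one of the terms looks identical to the filter as a student in crisis, which is why so much benign material lands in front of human moderators.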

After the pandemic shuttered schools and shuffled students into remote learning, Gaggle saw a surge both in students' online materials and in school districts interested in its services. The company grew as educators scrambled to keep a watchful eye on students whose chatter with peers moved from school hallways to instant messaging platforms like Google Hangouts. One year into the pandemic, Gaggle reported a sharp rise in references to suicide and self-harm, which accounted for more than 40% of all flagged incidents.

Waskiewicz, who began working for Gaggle in January 2020, said that remote learning spurred an immediate shift in students' online behaviors. Under lockdown, students without computers at home began using school devices for personal conversations. Sifting through the everyday exchanges between students and their friends, Waskiewicz said, became a time suck and left her questioning her own principles.

"I felt kind of bad because the kids didn't have the ability to have stuff of their own and I wondered if they realized that it was public," she said. "I just wonder if they realized that other eyes were seeing it other than them and their little friends."

Student activity monitoring software like Gaggle has become ubiquitous in U.S. schools, and 81% of teachers work in schools that use tools to track students' computer activity, according to a recent survey by the nonprofit Center for Democracy and Technology. A majority of teachers said the benefits of using such tools, which can block obscene material and monitor students' screens in real time, outweigh potential risks.

Likewise, students generally recognize that their online activities on school-issued devices are being observed, the survey found, and alter their behaviors as a result. More than half of student respondents said they don't share their true thoughts or ideas online as a result of school surveillance, and 80% said they were more careful about what they search online.

A majority of parents reported that the benefits of keeping tabs on their children's activity exceeded the risks. Yet they may not have a full grasp of how programs like Gaggle work, including the heavy reliance on untrained contractors and weak privacy controls revealed by The 74's reporting, said Elizabeth Laird, the group's director of equity in civic technology.

"I don't know that the way this information is being handled actually would meet parents' expectations," Laird said.

Another former contractor, who reached out to The 74 to share his experiences with the company anonymously, became a Gaggle moderator at the height of the pandemic. As COVID-19 cases grew, he said he felt unsafe continuing his previous job as a caregiver for people with disabilities, so he applied to Gaggle because it offered remote work.

About a week after he submitted an application, Gaggle gave him a key to kids' private lives, including, most alarming to him, their nude selfies. Exposure to such content was traumatizing, the former moderator said, and while the job took a toll on his mental well-being, it didn't come with health insurance.

"I went to a mental hospital in high school due to some hereditary mental health issues and seeing some of these kids going through similar things really broke my heart," said the former contractor, who shared his experiences on the condition of anonymity, saying he feared possible retaliation by the company. "It broke my heart that they had to go through these revelations about themselves in a context where they can't even go to school and get out of the house a little bit. They have to do everything from home, and they're being constantly monitored."

In this screenshot, Gaggle explains its terms and conditions for contract content moderators. The screenshot, which was provided to The 74 by a former contractor who asked to remain anonymous, has been redacted.

Gaggle employees are offered benefits, including health insurance, and can attend group therapy sessions twice per month, Hetherington said. Patterson acknowledged the job can take a toll on staff moderators, but sought to downplay its effects on contractors and said they're warned about exposure to disturbing content during the application process. He said using contractors allows Gaggle to offer the service at a price school districts can afford.

"Quite honestly, we're dealing with school districts with very limited budgets," Patterson said. "There have to be some tradeoffs."

The anonymous contractor said he wasn't as concerned about his own well-being as he was about the welfare of the students under the company's watch. The company lacked adequate safeguards to protect students' sensitive information from leaking outside the digital environment that Gaggle built for moderators to review such materials. Contract moderators work remotely with limited supervision or oversight, and he became especially concerned about how the company handled students' nude images, which are reported to school districts. Nudity and sexual content accounted for about 17% of emergency phone calls and email alerts to school officials last school year.

Contractors, he said, could easily save the images for themselves or share them on the dark web. 

Patterson acknowledged the possibility but said he wasn't aware of any data breaches.

"We do things in the interface to try to disable the ability to save those things," Patterson said, but "you know, human beings who want to get around things can."

'Made me feel like the day was worth it'

Vara Heyman was looking for a career change. After working jobs in retail and customer service, she made the pivot to content moderation, and a contract position with Gaggle was her first foot in the door. She was left feeling baffled by the impersonal hiring process, especially given the high stakes for students.

Waskiewicz had a similar experience. In fact, she said the only time she ever interacted with a Gaggle supervisor was when she was instructed to provide her bank account information for direct deposit. The interaction left her questioning whether the company that contracts with more than 1,500 school districts was legitimate or a scam. 

"It was a little weird when they were asking for the banking information, like 'Wait a minute, is this real or what?'" Waskiewicz said. "I Googled them and I think they're pretty big."

Heyman said that sense of disconnect continued after being hired, with communications between contractors and their supervisors limited to a Slack channel. 

Despite the challenges, several former moderators believe their efforts kept kids safe from harm. McElligott, the former Gaggle safety team employee, recalled an occasion when she found a student's suicide note.

"Knowing I was able to help with that made me feel like the day was worth it," she said. "Hearing from the school employees that we were able to alert about self-harm or suicidal tendencies from a student they would never expect to be suffering was also very rewarding. It meant that extra attention should or could be given to the student in a time of need."

Susan Enfield, the superintendent of Highline Public Schools in suburban Seattle, said her district's contract with Gaggle has saved lives. Earlier this year, for example, the company detected a student's suicide note early in the morning, allowing school officials to spring into action. The district uses Gaggle to keep kids safe, she said, but acknowledged it can be a disciplinary tool if students violate the district's code of conduct.

"No tool is perfect, every organization has room to improve. I'm sure you could find plenty of my former employees here in Highline that would give you an earful about working here as well," said Enfield, one of 23 current or former superintendents from across the country whom Gaggle cited as references in its letter to Congress.

"There's always going to be pros and cons to any organization, any service," Enfield told The 74, "but our experience has been overwhelmingly positive."

True safety threats were infrequent, former moderators said, and most of the content was mundane, in part because the company's artificial intelligence lacked sophistication. They said the algorithm routinely flagged students' papers on the novels To Kill a Mockingbird and The Catcher in the Rye. They also reported being inundated with spam emailed to students, acting as human spam filters for a task that's long been automated in other contexts.

Conor Scott, who worked as a contract moderator while in college, said that "99% of the time" Gaggle's algorithm flagged pedestrian materials, including pictures of sunsets and students' essays about World War II. Valid safety concerns, including references to violence and self-harm, were rare, Scott said. But he still believed the service had value and felt he was doing "the right thing."

McElligott said that managers' personal opinions added another layer of complexity. Though moderators were "held to strict rules of right and wrong decisions," she said they were ultimately "being judged against our managers' opinions of what is concerning and what is not."

"I was told once that I was being overdramatic when it came to a potential inappropriate relationship between a child and adult," she said. "There was also an item that made me think of potential trafficking or child sexual abuse, as there were clear sexual plans to meet up, and when I alerted it, I was told it was not as serious as I thought."

Patterson acknowledged that gray areas exist and that human discretion is a factor in deciding what materials are ultimately elevated to school leaders. But such materials, he said, are not the most urgent safety issues. He said the company's algorithm errs on the side of caution and flags harmless content because district leaders are "so concerned about students."

The former moderator who spoke anonymously said he grew alarmed by the sheer volume of mundane student materials that were captured by Gaggle's surveillance dragnet, and pressure to work quickly didn't offer enough time to evaluate long chat logs between students having "heartfelt and sensitive" conversations. On the other hand, run-of-the-mill chatter offered him a little wiggle room.

"When I would see stuff like that I was like 'Oh, thank God, I can just get this out of the way and heighten how many items per hour I'm getting,'" he said. "It's like 'I hope I get more of those because then I can maybe spend a little more time actually paying attention to the ones that need it.'"

Ultimately, he said he was unprepared for such extensive access to students' private lives. Because Gaggle's algorithm flags keywords like "gay" and "lesbian," for example, it alerted him to students exploring their sexuality online. Hetherington, the Gaggle spokeswoman, said such keywords are included in its dictionary to "ensure that these vulnerable students are not being harassed or suffering additional hardships," but critics have accused the company of subjecting LGBTQ students to disproportionate surveillance.

"I thought it would just be stopping school shootings or reducing cyberbullying but no, I read the chat logs of kids coming out to their friends," the former moderator said. "I felt tremendous power was being put in my hands" to distinguish students' benign conversations from real danger, "and I was given that power immediately for $10 an hour."

Minneapolis student Teeth Logsdon-Wallace, who posed for this photo with his dog Gilly, used a classroom assignment to discuss a previous suicide attempt and explained how his mental health had since improved. He became upset after Gaggle flagged his assignment. (Photo courtesy Alexis Logsdon)

A privacy issue

For years, student privacy advocates and civil rights groups have warned about the potential harms of Gaggle and similar surveillance companies. Fourteen-year-old Teeth Logsdon-Wallace, a Minneapolis high school student, fell under Gaggle's watchful eye during the pandemic. Last September, he used a class assignment to write about a previous suicide attempt and explained how music helped him cope after being hospitalized. Gaggle flagged the assignment to a school counselor, a move the teen called a privacy violation.

He said it's "just really freaky" that moderators can review students' sensitive materials in public places like basketball games, but he ultimately felt bad for the contractors on Gaggle's content review team.

"Not only is it violating the privacy rights of students, which is bad for our mental health, it's traumatizing these moderators, which is bad for their mental health," he said. Relying on low-wage workers with high turnover, limited training and no background in mental health, he said, can have consequences for students.

"Bad labor conditions don't just affect the workers," he said. "It affects the people they say they are helping."

Gaggle cannot prohibit contractors from reviewing students' private communications in public settings, Heather Durkac, the senior vice president of operations, said in a statement.

"However, the contractors know the nature of the content they will be reviewing," Durkac said. "It is their responsibility and part of their presumed good and reasonable work ethic to not be conducting these content reviews in a public place."

Gaggle's former contractors also weighed students' privacy rights. Heyman said she "went back and forth" on those implications for several days before applying for the job. She ultimately decided that Gaggle was acceptable since it is limited to school-issued technology.

"If you don't want your stuff looked at, you can use Hotmail, you can use Gmail, you can use Yahoo, you can use whatever else is out there," she said. "As long as they're being told and their parents are being told that their stuff is going to be monitored, I feel like that is OK."

Logsdon-Wallace and his mother said they didn't know Gaggle existed until his classroom assignment got flagged to a school counselor.

Meanwhile, the anonymous contractor said that chat conversations between students picked up by Gaggle's algorithm helped him understand the effects that surveillance can have on young people.

"Sometimes a kid would use a curse word and another kid would be like, 'Dude, shut up, you know they're watching these things,'" he said. "These kids know that they're being looked in on," even if they don't realize their observer is a contractor working from the couch in his living room. "And to be the one that is doing that — that is basically fulfilling what these kids are paranoid about — it just felt awful."

If you are in crisis, please call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255), or contact the Crisis Text Line by texting TALK to 741741.

Disclosure: Campbell Brown is the head of news partnerships at Facebook. Brown co-founded The 74 and sits on its board of directors.

Gaggle Surveils Millions of Kids in the Name of Safety. Targeted Families Argue it's 'Not That Smart'
/article/gaggle-surveillance-minnesapolis-families-not-smart-ai-monitoring/ Tue, 12 Oct 2021 11:15:00 +0000

In the midst of a pandemic and a national uprising, Teeth Logsdon-Wallace was kept awake at night last summer by the constant sounds of helicopters and sirens.

For the 13-year-old from Minneapolis, who lives close to where George Floyd was murdered in May 2020, the pandemic-induced isolation and social unrest amplified his transgender dysphoria, emotional distress that occurs when someone's gender identity differs from their sex assigned at birth. His billowing depression landed him in the hospital after an attempt to die by suicide. During that dark stretch, he spent his days in an outpatient psychiatric facility, where therapists embraced music therapy. There, he listened on loop to a punk song that promised things would get better.

Eventually they did.




Logsdon-Wallace, a transgender eighth-grader who chose the name Teeth, has since "graduated" from weekly therapy sessions and found a better headspace, but that didn't stop school officials from springing into action after he wrote about his mental health. In a school assignment last month, he reflected on his suicide attempt and how the punk rock anthem by the band Ramshackle Glory helped him cope — intimate details that wound up in the hands of district security.

In a classroom assignment last month, Minneapolis student Teeth Logsdon-Wallace explained how the Ramshackle Glory song "Your Heart is a Muscle the Size of Your Fist" helped him cope after an attempt to die by suicide. In the assignment, which was flagged by the student surveillance company Gaggle, Logsdon-Wallace wrote that the song was "a reminder to keep on loving, keep on fighting and hold on for your life." (Photo courtesy Teeth Logsdon-Wallace)

The classroom assignment was one of thousands of Minneapolis student communications flagged by Gaggle, a digital surveillance company that saw rapid growth after the pandemic forced schools into remote learning. In an earlier investigation, The 74 analyzed nearly 1,300 public records from Minneapolis Public Schools to expose how Gaggle subjects students to relentless digital surveillance 24 hours a day, seven days a week, raising significant privacy concerns for the more than 5 million young people across the country who are monitored by the company's algorithm and human content moderators.

But technology experts and families with first-hand experience of Gaggle's surveillance dragnet have raised a separate issue: The service is not only invasive, it may also be ineffective.

While the system flagged Logsdon-Wallace for referencing the word "suicide," context was never part of the equation, he said. Two days later, in mid-September, a school counselor called his mom to let her know what officials had learned. The meaning of the classroom assignment — that his mental health had improved — was seemingly lost in the transaction between Gaggle and the school district. He felt betrayed.

"I was trying to be vulnerable with this teacher and be like, 'Hey, here's a thing that's important to me because you asked,'" Logsdon-Wallace said. "Now, when I've made it clear that I'm a lot better, the school is contacting my counselor and is freaking out."

Jeff Patterson, Gaggle's founder and CEO, said in a statement that his company does not "make a judgement on that level of the context," and while some districts have requested to be notified about references to previous suicide attempts, it's ultimately up to administrators to "decide the proper response, if any."

'A crisis on our hands'

Minneapolis Public Schools first contracted with Gaggle in the spring of 2020 as the pandemic forced students nationwide into remote learning. Through AI and its content moderator team, Gaggle tracks students' online behavior every day by analyzing materials on their school-issued Google and Microsoft accounts. The tool scans students' emails, chat messages and other documents, including class assignments and personal files, in search of keywords, images or videos that could indicate self-harm, violence or sexual behavior. Remote moderators evaluate flagged materials and notify school officials about content they find troubling.

In Minneapolis, Gaggle flagged students for keywords related to pornography, suicide and violence, according to six months of incident reports obtained by The 74 through a public records request. The private company also captured their journal entries, fictional stories and classroom assignments.

Gaggle executives maintain that the system saves lives, citing students it flagged during the 2020-21 school year, though those figures have not been independently verified. Minneapolis school officials make similar assertions. Though the pandemic's effect on suicide rates remains fuzzy, suicide has been a leading cause of death among teenagers for years. Patterson, who has watched his business grow during COVID-19, said Gaggle could be part of the solution. Though not part of its contract with Minneapolis schools, the company recently launched a service that connects students flagged by the monitoring tool with teletherapists.

"Before the pandemic, we had a crisis on our hands," he said. "I believe there's a tsunami of youth suicide headed our way that we are not prepared for."

Schools nationwide have increasingly relied on technological tools that purport to keep kids safe, yet there's little evidence to back up their claims.

Minneapolis student Teeth Logsdon-Wallace poses with his dog Gilly. (Photo courtesy Alexis Logsdon)

Like many parents, Logsdon-Wallace's mother, Alexis Logsdon, didn't know Gaggle existed until she got the call from his school counselor. Luckily, the counselor recognized that Logsdon-Wallace was discussing events from the past and offered a measured response. His mother was still left baffled.

"That was an example of somebody describing really good coping mechanisms, you know, 'I have music that is one of my soothing activities that helps me through a really hard mental health time,'" she said. "But that doesn't matter because, obviously, this software is not that smart — it's just like 'Woop, we saw the word.'"

'Random and capricious'

Many students have accepted digital surveillance as an inevitable reality at school, according to a new survey by the Center for Democracy and Technology in Washington, D.C. But some youth are fighting back, including Lucy Dockter, a 16-year-old junior from Westport, Connecticut. On multiple occasions over the last several years, Gaggle has flagged her communications — an experience she described as "really scary."

"If it works, it could be extremely beneficial. But if it's random, it's completely useless."
Lucy Dockter, 16, Westport, Connecticut student mistakenly flagged by Gaggle

On one occasion, Gaggle sent her an email notification of "Inappropriate Use" while she was walking to her first high school biology midterm, and her heart began to race as she worried about what she had done wrong. Dockter is an editor of her high school's literary journal and, according to her, Gaggle had ultimately flagged profanity in students' fictional article submissions.

"The link at the bottom of this email is for something that was identified as inappropriate," Gaggle warned in its email, pointing to one of the fictional articles. "Please refrain from storing or sharing inappropriate content in your files."

Gaggle emailed a warning to Connecticut student Lucy Dockter for profanity in a literary journal article. (Photo courtesy Lucy Dockter)

But Gaggle doesn't catch everything. Even as she got flagged when students shared documents with her, the articles' authors weren't receiving similar alerts, she said. Nor did Gaggle's AI pick up when she wrote about the discrepancy in an article in which she included a four-letter swear word to make a point. In the article, which Dockter wrote in Google Docs, she argued that Gaggle's monitoring system is "random and capricious," and could be dangerous if school officials rely on its findings to protect students.

Her experiences left the Connecticut teen questioning whether such tracking is even helpful. 

"With such a seemingly random service, that doesn't seem to — in the end — have an impact on improving student health or actually taking action to prevent suicide and threats," she said in an interview. "If it works, it could be extremely beneficial. But if it's random, it's completely useless."

Lucy Dockter

Some schools have asked Gaggle to email students about the use of profanity, but Patterson said the system has an error that he blamed on the tech giant Google, which at times "does not properly indicate the author of a document and assigns a random collaborator."

"We are hoping Google will improve this functionality so we can better protect students," Patterson said.

Back in Minneapolis, attorney Cate Long said she became upset when she learned that Gaggle was monitoring her daughter on her personal laptop, which 10-year-old Emmeleia used for remote learning. She grew angrier when she learned the district didn't notify her that Gaggle had identified a threat.

This spring, a classmate used Google Hangouts, the chat feature, to send Emmeleia a death threat, warning she'd shoot her "puny little brain with my grandpa's rifle."

Minneapolis mother Cate Long said a student used Google Hangouts to send a death threat to her 10-year-old daughter Emmeleia. Officials never informed her about whether Gaggle had flagged the threat. (Photo courtesy Cate Long)

When Long learned about the chat, she notified her daughter's teacher but was never told whether Gaggle had picked up on the disturbing message as well. Missing warning signs could be detrimental to both students and school leaders; districts can face legal liability if they fail to act on credible threats.

"I didn't hear a word from Gaggle about it," she said. "If I hadn't brought it to the teacher's attention, I don't think that anything would have been done."

The incident, which occurred in April, fell outside the six-month period for which The 74 obtained records. A Gaggle spokesperson said the company picked up on the threat and notified district officials an hour and a half later, but it "does not have any insight into the steps the district took to address this particular matter."

Julie Schultz Brown, the Minneapolis district spokeswoman, said that officials "would never discuss with a community member any communication flagged by Gaggle."

"That unrelated but concerned parent would not have been provided that information, nor should she have been," she wrote in an email. "That is private."

Cate Long poses with her 10-year-old daughter Emmeleia. (Photo courtesy Cate Long)

'The big scary algorithm'

When identifying potential trouble, Gaggle's algorithm relies on keyword matching that compares student communications against a dictionary of thousands of words the company believes could indicate potential issues. The company scans student emails before they're delivered to their intended recipients, said Patterson, the CEO. Files within Google Drive, including Docs and Sheets, are scanned as students write in them, he said. In one instance, the technology led to the arrest of a 35-year-old Michigan man who tried to send pornography to an 11-year-old girl in New York. Gaggle prevented the file from ever reaching its intended recipient.

Though the company allows school districts to alter the keyword dictionary to reflect local contexts, less than 5 percent of districts customize the filter, Patterson said. 
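The keyword-matching step described above (a fixed dictionary compared against student text, with hits handed off to human moderators) can be sketched in a few lines of Python. This is an illustrative toy, not Gaggle's actual code, and the dictionary entries here are assumptions for demonstration; it shows why context-free matching treats a recovery story the same as a crisis.

```python
# Illustrative toy of context-free keyword flagging -- NOT Gaggle's actual
# code; the dictionary entries are invented for demonstration.
KEYWORD_DICTIONARY = {"suicide", "gay", "lesbian"}

def flag(text: str) -> set[str]:
    """Return every dictionary word that appears in the text,
    with no awareness of the surrounding context."""
    words = {w.strip('.,!?"\'').lower() for w in text.split()}
    return KEYWORD_DICTIONARY & words

# A sentence about past recovery trips the same wire as one in crisis:
print(flag("Music helped me cope after my suicide attempt."))  # {'suicide'}
print(flag("I am thinking about suicide."))                    # {'suicide'}
```

Both sentences produce an identical flag; telling them apart is exactly the judgment call the company says it leaves to moderators and district administrators.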

That's where potential problems could begin, said Sara Jordan, an expert on artificial intelligence and senior researcher at the Future of Privacy Forum in Washington. For example, the language students use to express suicidal ideation could vary between Manhattan and rural Appalachia, she said.

"We're using the big scary algorithm term here when I don't think it applies. This is not Netflix's recommendation engine. This is not Spotify."
Sara Jordan, AI expert and senior researcher, Future of Privacy Forum

Sara Jordan

On the other hand, she noted that false positives are highly likely, especially when the system flags common swear words and fails to understand context.

"You're going to get 25,000 emails saying that a student dropped an F-bomb in a chat," she said. "What's the utility of that? That seems pretty low."

She said Gaggle's utility could be impaired because it doesn't adjust to students' behaviors over time, comparing it to Netflix, which recommends television shows based on users' ever-evolving viewing patterns. "Something that doesn't learn isn't going to be accurate," she said. For example, she said, the program could be more useful if it learned to ignore the profane but harmless literary journal entries submitted to Dockter, the Connecticut student. Gaggle's marketing materials appear to overhype the tool's sophistication to schools, she said.

"We're using the big scary algorithm term here when I don't think it applies," she said. "This is not Netflix's recommendation engine. This is not Spotify. This is not American Airlines serving you specific forms of flights based on your previous searches and your location."
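Jordan's point about a system that learns can be made concrete with a small sketch: a filter that mutes a keyword for a given source once human moderators have repeatedly dismissed its flags as harmless. This is a hypothetical illustration of the general idea, not a description of any vendor's product; the class name and threshold are invented.

```python
# Hypothetical sketch of a filter that adapts to moderator feedback --
# an illustration of Jordan's critique, not how any real product works.
from collections import defaultdict

DISMISS_THRESHOLD = 3  # invented: mute a keyword after 3 dismissed flags

class AdaptiveFilter:
    def __init__(self, keywords):
        self.keywords = set(keywords)
        self.dismissed = defaultdict(int)  # (source, word) -> dismissal count

    def flag(self, source, text):
        """Flag dictionary words, unless moderators have repeatedly
        dismissed that word from this source."""
        words = {w.strip('.,!?').lower() for w in text.split()}
        return {w for w in self.keywords & words
                if self.dismissed[(source, w)] < DISMISS_THRESHOLD}

    def dismiss(self, source, word):
        # Called when a human moderator marks a flag as harmless.
        self.dismissed[(source, word)] += 1

f = AdaptiveFilter({"damn"})
for _ in range(3):                      # journal flagged, dismissed, 3 times
    for w in f.flag("literary-journal", "Well, damn."):
        f.dismiss("literary-journal", w)
print(f.flag("literary-journal", "Well, damn."))  # set() -- learned to ignore
print(f.flag("student-email", "Well, damn."))     # {'damn'} -- still flags
```

A static dictionary, by contrast, re-raises the same dismissed flag indefinitely, which is the "25,000 F-bomb emails" problem Jordan describes.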

"Artificial intelligence without human intelligence ain't that smart."
Jeff Patterson, Gaggle founder and CEO

Patterson said Gaggle's proprietary algorithm is updated regularly "to adjust to student behaviors over time and improve accuracy and speed." The tool monitors "thousands of keywords, including misspellings, slang words, evolving trends and terminologies, all informed by insights gleaned over two decades of doing this work."

Ultimately, the keyword algorithm is used to "narrow down the haystack as much as possible," Patterson said, and Gaggle content moderators review materials to gauge their risk levels.

"Artificial intelligence without human intelligence ain't that smart," he said.

In Minneapolis, officials denied that Gaggle infringes on students' privacy and noted that the tool only operates within school-issued accounts. The district's internet use policy states that students should "expect only limited privacy," and that the misuse of school equipment could result in discipline and "civil or criminal liability." District leaders have also cited compliance with the Clinton-era Children's Internet Protection Act, which became law in 2000 and requires schools to monitor "the online activities of minors."

Patterson suggested that teachers aren't paying close enough attention to keep students safe on their own and "sometimes they forget that they're mandated reporters." On the company's website, Patterson says he launched the company in 1999 to provide teachers with "an easy way to watch over their gaggle of students." Legally, teachers are mandated to report suspected abuse and neglect, but Patterson broadens their sphere of responsibility and his company's role in meeting it. As technology becomes a key facet of American education, Patterson said, schools "have a moral obligation to protect the kids on their digital playground."

But Elizabeth Laird, the director of equity in civic technology at the Center for Democracy and Technology, argued the federal law was never intended to mandate student "tracking" through artificial intelligence. In fact, the statute includes a disclaimer stating it shouldn't be "construed to require the tracking of internet use by any identifiable minor or adult user." Her group has urged the government to clarify the Children's Internet Protection Act's requirements and distinguish monitoring from tracking individual student behaviors.

Sen. Elizabeth Warren, a Democrat from Massachusetts, agrees. In recent letters to Gaggle and other education technology companies, Warren and other Democratic lawmakers said they're concerned the tools "may extend beyond" the law's intent "to surveil student activity or reinforce biases." Around-the-clock surveillance, they wrote, demonstrates "a clear invasion of student privacy, particularly when students and families are unable to opt out."

"Escalations and mischaracterizations of crises may have long-lasting and harmful effects on students' mental health due to stigmatization and differential treatment following even a false report," the senators wrote. "Flagging students as 'high-risk' may put them at risk of biased treatment from physicians and educators in the future. In other extreme cases, these tools can become analogous to predictive policing, which are notoriously biased against communities of color."

A new kind of policing

Shortly after the school district piloted Gaggle for distance learning, education leaders were met with an awkward dilemma. Floyd's murder at the hands of a Minneapolis police officer prompted Minneapolis Public Schools to sever its ties with the police department for school-based officers and replace them with district security officers who lack the authority to make arrests. Gaggle flags district security when it identifies student communications the company believes could be harmful.

Some critics have compared the surveillance tool to a new form of policing that, beyond broad efficacy concerns, could have a disparate impact on students of color, similar to traditional policing. Algorithmic tools have repeatedly been shown to suffer biases.

Matt Shaver, who taught at a Minneapolis elementary school during the pandemic but no longer works for the district, said he was concerned that bias could be baked into Gaggle's algorithm. Absent adequate context or nuance, he worried, the tool could lead to misunderstandings.

Data obtained by The 74 offer a limited window into Gaggle's potential effects on different student populations. Though the district withheld many details in the nearly 1,300 incident reports, just over 100 identified the campuses where the involved students attended school. An analysis of those reports failed to identify racial discrepancies. Specifically, Gaggle was about as likely to issue incident reports in schools where children of color were the majority as it was at campuses where most children were white. It remains possible that students of color in predominantly white schools were disproportionately flagged by Gaggle or faced disproportionate punishment once identified. Broadly speaking, Black students are far more likely to be suspended or arrested at school than their white classmates, according to federal education data.

Gaggle and Minneapolis district leaders acknowledged that students' digital communications are forwarded to police in rare circumstances. The Minneapolis district's internet use policy explains that educators could contact the police if students use technology to break the law, and a document given to teachers about the district's Gaggle contract further highlights the possibility of law enforcement involvement.

Jason Matlock, the Minneapolis district's director of emergency management, safety and security, said that law enforcement is not a "regular partner" when responding to incidents flagged by Gaggle. The district doesn't deploy Gaggle to get kids into trouble, he said, but to get them help. He said the district has interacted with law enforcement about student materials flagged by Gaggle on several occasions, but only in cases related to child pornography. Such cases, he said, often involve students sharing explicit photographs of themselves. During a six-month period from March to September 2020, Gaggle flagged Minneapolis students more than 120 times for incidents related to child pornography, according to records obtained by The 74.

Jason Matlock, the director of emergency management, safety and security at the Minneapolis school district, discusses the decision to partner with Gaggle as students moved to remote learning during the pandemic. (Screenshot)

"Even if a kid has put out an image of themselves, no one is trying to track them down to charge them or to do anything negative to them," Matlock said, though it's unclear if any students have faced legal consequences. "It's the question as to why they're doing it," and to raise the issue with their parents.

Gaggle's keywords could also have a disproportionate impact on LGBTQ children. In three dozen incident reports, Gaggle flagged keywords related to sexual orientation, including "gay" and "lesbian." On at least one occasion, school officials outed an LGBTQ student to their parents.

Logsdon-Wallace, the 13-year-old student, called the incident "disgusting and horribly messed up."

"They have gay flagged to stop people from looking at porn, but one, that is going to be mostly targeting people who are looking for gay porn and two, it's going to be false-positive because they are acting as if the word gay is inherently sexual," he said. "When people are just talking about being gay, anything they're writing would be flagged."

The service could also have a heavier presence in the lives of low-income families, he added, who may end up being more surveilled than their affluent peers. Logsdon-Wallace said he knows students who rely on school devices for personal use because they lack technology of their own. Among the 1,300 Minneapolis incidents contained in The 74's data, only about a quarter were reported to district officials on school days between 8 a.m. and 4 p.m.

"That's definitely really messed up, especially when the school is like 'Oh no, no, no, please keep these Chromebooks over the summer,'" an invitation that gave students "the go-ahead to use them" for personal reasons, he said.

"Especially when it's during a pandemic when you can't really go anywhere and the only way to talk to your friends is through the internet."

Dems Warn School Surveillance Tools Could Compound 'Risk of Harm for Students'
/article/democratic-lawmakers-demand-student-surveillance-companies-outline-business-practices-warn-the-security-tools-may-compound-risk-of-harm-for-students/ Mon, 04 Oct 2021 20:41:00 +0000

Updated, Oct. 5

A group of Democratic lawmakers has demanded that several education technology companies that monitor children online explain their business practices, arguing that around-the-clock digital surveillance demonstrates "a clear invasion of student privacy, particularly when students and families are unable to opt out."

In letters sent last week, Democratic Sens. Elizabeth Warren, Ed Markey and Richard Blumenthal asked the companies to explain the steps they're taking to ensure the tools aren't "unfairly targeting students and perpetuating discriminatory biases" and comply with federal laws. The letters went to executives at Gaggle, Securly, GoGuardian and Bark Technologies, each of which uses artificial intelligence to analyze students' online activities and identify behaviors they believe could be harmful.




"Education technology companies have developed software that are advertised to protect student safety, but may instead be surveilling students inappropriately, compounding racial disparities in school discipline and draining resources from more effective student supports," the lawmakers wrote in the letters. Though the tools are marketed as student safety solutions — and grew rapidly as schools shifted to remote learning during the pandemic — there's little evidence to back up their claims. Some critics, including the lawmakers, argue they may do more harm than good. "The use of these tools may break down trust within schools, prevent students from accessing critical health information and discourage students from reaching out to adults for help, potentially increasing the risk of harm for students," the senators wrote.

The letters cited a recent investigation by The 74, which outlined how Gaggle's AI-driven surveillance tool and human content moderators subject children to relentless digital surveillance long after classes end for the day, including on weekends, holidays, late at night and over the summer. In Minneapolis, the company notified school security when it identified students who made references to suicide, self-harm and violence. But it also analyzed students' classroom assignments, journal entries, chats with friends and fictional stories.

Each of the companies offers differing levels of remote student surveillance. Gaggle, for example, analyzes emails, chat messages and digital files on students' school-issued Google and Microsoft accounts. Other services monitor students' social media accounts and web browsing history, among other activities.

The letters were particularly critical of the tools' capacity to track student behaviors 24/7 — including when students are at home — and their ability, in some cases, to monitor students on their personal devices.

Schools' use of digital monitoring tools has become commonplace in recent years. More than 80 percent of teachers reported using the tools, according to a recent survey by the Center for Democracy and Technology. Among those who participated in the survey, nearly a third reported that they monitor student activity at all hours of the day, and just a quarter said it was limited to school hours.

"Because of the lack of transparency, many students and families are unaware that nearly all of their children's online behavior is being tracked," according to the letters. "When students and families are aware, they are often unable to opt out because school-issued devices are given to students with the software already installed, and many students rely on these devices for remote or at-home learning."

A Securly spokesperson said in an email that the company is "reviewing the correspondence received" from the lawmakers and is in the process of responding to their requests for information. He said the company is "deeply committed to continuously evolving our technology" to help schools protect students online. A Gaggle spokesperson said the company appreciates the lawmakers' interest in learning how the tool "serves as an early warning system to help school districts prevent tragedies such as suicide, acts of violence, child pornography and other dangerous situations." A GoGuardian spokesman said the company cares "deeply about keeping students safe and protecting their privacy."

Bark officials didn't respond to requests for comment.

The Clinton-era Children's Internet Protection Act, passed in 2000, requires schools to filter and monitor students' internet use to ensure they aren't accessing material that is "harmful to minors," such as pornography. Student privacy advocates have long argued that a newer generation of AI-driven tools goes beyond the law's scope and have urged federal officials to clarify its requirements. The law includes a disclaimer noting that it does not "require the tracking of internet use by any identifiable minor or adult user." It "remains an open question" whether schools' use of digital tools to monitor students at home violates Fourth Amendment protections against unreasonable searches and seizures, according to an analysis by the Future of Privacy Forum.

In their letters, senators highlighted how digital surveillance tools could perpetuate several educational inequities. For example, the tools could have a disproportionate impact on students of color and further uphold longstanding racial disparities in student discipline.

"School disciplinary measures have a long history of disproportionately targeting students of color, who face substantially more punitive discipline than their white peers for equivalent offenses," according to the letters. "These disciplinary records, even when students are cleared, may have life-long harmful consequences for students."

Meanwhile, the tools may have a larger impact on low-income students who rely on school technology to access the internet than on those who can afford personal computers. Elizabeth Laird, the director of equity in civic technology at the Center for Democracy and Technology, said their research "revealed a worrisome lack of transparency" around how these educational technology companies track students online and how schools rely on their tools.

"Responses to this letter will help shine a light on these tools and strategies to mitigate the risks to students, especially those who are most reliant on school-issued devices," she said in an email.

]]>
Report: Most Parents, Teachers Support Student Surveillance Tech

Tue, 21 Sep 2021

Tools that monitor students' online behavior have become ubiquitous in U.S. schools – and grew rapidly as the pandemic closed campuses nationwide – and a majority of parents and teachers believe the benefits of such digital surveillance outweigh the risks, according to a new report.

Similarly, half of students said they are comfortable with schools鈥 use of monitoring software while a quarter reported feeling queasy about the idea, according to the new research by the Center for Democracy and Technology, a nonprofit group based in Washington, D.C. Despite their overall comfort with digital software, teachers, parents and students each worried about how the tools could have detrimental side effects. Specifically, many parents and teachers were concerned that digital surveillance could be used to discipline students and young people reported becoming more reserved when they knew they were being watched.


"In response to the pandemic, the focus on technology and its use has never been greater," said report co-author Elizabeth Laird, the center's director of equity in civic technology. As technology gains a greater hold on education, she said, it's important for school leaders and policymakers to remain focused on protecting students' individual rights. She worried that student surveillance technology could have a damaging impact on students, especially youth of color and those from low-income households.

"I don't think it's a slam dunk," Laird said.

Though the report didn't name specific tools, schools deploy a range of digital monitoring software to track student activity, including programs that block online material deemed inappropriate, track when students log into school applications, and allow teachers to view students' screens in real time and even take control of their computers.

Last week, an investigative report by The 74 exposed how the Minneapolis school district's use of the digital surveillance tool Gaggle had subjected children to relentless online surveillance long after classes ended for the day – including inside students' homes. Through artificial intelligence and a team of content moderators, Gaggle tracks the online behaviors of millions of students across the U.S. every day by sifting through data stored on their school-issued Google and Microsoft accounts. In Minneapolis, the company alerted school security when moderators believed students could harm themselves or others, but it also picked up students' classroom assignments, journal entries, chats with friends and fictional stories.

Among teachers surveyed by the Center for Democracy and Technology, 81 percent said their schools use software that tracks students' computer activity, including to block obscene material, monitor students' screens in real time and prohibit students from using websites unrelated to school, like YouTube. A majority of both parents and students reported such tools were used in their schools, but they were also more likely than teachers to be unsure about whether youth were being actively monitored by educators. In interviews with administrators, researchers found that many school leaders weren't sure how best to be transparent with families about their monitoring practices.

"Certainly there is an imbalance in information and transparency around what is happening," Laird said. "School districts have been clear [that] students shouldn't have an expectation of privacy, but they haven't been as clear about what they are tracking, how they are tracking it and how long they keep that information. They really should be doing that."

Four-fifths of surveyed teachers said their schools used digital tools to track students online. Both parents and students were more likely than teachers to be unsure whether such tools were in use in their schools. (Photo courtesy Center for Democracy and Technology)

Among teachers, 66 percent said the benefits of activity monitoring outweigh student privacy concerns and 62 percent of parents reached a similar conclusion. Meanwhile, 78 percent of teachers reported that digital surveillance helps keep students safe by identifying problematic online behaviors and 72 percent said it helps keep students on task. But their answers also revealed equity concerns: 71 percent of teachers reported that monitoring software is applied to all students equally, 51 percent worried that it could come with unintended consequences like "outing" LGBTQ students and 49 percent said it violates students' privacy.

Many teachers reported that such monitoring tools are used on students long after classes end for the day. In total, 30 percent of educators said the tools are active "all of the time," and 16 percent said the software tracks kids on their personal devices.

Nearly a third of teachers who reported their schools use digital services like Gaggle to track students online said the tools monitor youth behaviors 24 hours a day. (Photo by Center for Democracy and Technology)

Among parents, 75 percent said digital surveillance helps keep students safe and 73 percent said it ensures children remain focused on schoolwork. Yet many parents also reported potential downsides: 61 percent worried about long-term harm if the tools were used to discipline students, 51 percent were concerned about unintended consequences and 49 percent said it violates students' privacy rights.

Perhaps unsurprisingly, students were less at ease with educators watching their online behaviors. Half said they were comfortable with monitoring tools, a quarter said they were uncomfortable with them and another quarter were unsure.

The data also suggest that students alter their behaviors as a result of being watched: 58 percent said they don't share their true thoughts or ideas online because they are monitored at school and 80 percent said they were more careful about what they search online. While just 39 percent of students said it was unfair that educators monitored their school-issued services, 74 percent opposed the surveillance of their own devices, like their cell phones. Some monitoring tools are among those that could track students' behaviors on their own technology.

The data raise significant equity concerns. For many students, school-issued devices are their only means of getting online.

"The privacy and security of personal devices is a luxury not all can afford," Alexandra Givens, the center's president and CEO, said in a press release. "Constant online monitoring – especially of students who cannot afford or don't have access to personal devices – risks creating disparities in the ways student privacy is protected nationwide."

To reach its findings, researchers conducted online surveys in June that were completed by 1,001 teachers, 1,663 parents and 420 high school students. Researchers also conducted interviews with school administrators to understand their motives in deploying digital surveillance. Among the justifications is a federal law that requires schools to monitor students online. But the law also includes a disclaimer noting that the statute does not "require the tracking of internet use by any identifiable minor or adult user."

Understanding context is critical, Laird said, adding that the law's authors hadn't fully envisioned a world where students could be surveilled by artificial intelligence long after classes end for the day.

"What was happening at the time was students were in a school computer lab for part of the day, and monitoring meant having an adult walking around a computer lab and physically looking at what was on students' computer monitors," she said. But today, she said, the statute is being interpreted very differently.

In response, the center, along with the American Civil Liberties Union and the Center for Learner Equity, sent a letter to federal officials Tuesday urging them to clarify the law's stipulations and to inform educators it "does not require broad, invasive and constant surveillance of students' lives online."

"Systemic monitoring of online activity can reveal sensitive information about students' personal lives, such as their sexual orientation, or cause a chilling effect on their free expression, political organizing or discussion of sensitive issues such as mental health," the letter continued. "These harms likely fall disproportionately on already vulnerable, over-policed and over-disciplined communities."

]]>