'Spy High:' Amazon Documentary Probes Dangers of Online Student Surveillance

Mon, 21 Apr 2025

It all began with a pixelated image of a Mike and Ike: the colorful, fruity candy that, with a digital blur and authorities' preconceived notions, could perhaps be mistaken for a pill.

That's what happened to 15-year-old Blake Robbins, who was accused by officials in Pennsylvania's affluent Lower Merion School District of dealing drugs in 2009 after they surreptitiously snapped a photo of him at home with the chewy candy in his hand. The moment was captured by the webcam on his school-issued laptop, one of some 66,000 covert student images collected by the district, including one of Robbins asleep in his bed.

Robbins sued, and the subsequent case, dubbed "WebcamGate," is at the center of "Spy High," a documentary now streaming on Amazon Prime that examines the high-profile student surveillance scandal and the explosion of student privacy threats that followed it.




The Lower Merion School District, which settled the class-action lawsuit, was an early adopter of one-to-one computer education technology programs that provide school-issued laptops to students. Such programs have since proliferated, particularly since the pandemic. So, too, have digital surveillance tools like Gaggle and GoGuardian, which alert educators when students express thoughts of self-harm or discuss topics deemed taboo, like sex, violence or drugs.

Directed by Jody McVeigh-Schultz and executive produced by Mark Wahlberg, the documentary offers a cautionary tale about what happens when student monitoring initiatives, often intended to promote young people's safety and well-being, go awry. It also explores how covert student surveillance intersects with far-reaching school equity issues involving race, disability, privilege and discipline.

After years of reporting on digital student surveillance myself, I caught up last week with McVeigh-Schultz, whose other documentaries include a series about reality TV's seemingly wholesome Duggar family and an Emmy-nominated documentary that delves into the brutal 1960 killing of three women in an Illinois state park. We talked about what he wants viewers to take away from the Robbins scandal 15 years after it unfolded and the lessons it holds for contemporary student privacy debates and schools' growing reliance on ed tech.

The interview was edited for length and clarity. 

What motivated you to take a deep dive into the Robbins case, and why is it important right now?

I grew up just outside of Philly in a suburb called Cheltenham and I had heard about this story. I knew Lower Merion as the high school that Kobe [Bryant] went to. That's what it was famous for, but I knew about the Robbins story and I was like, "That's crazy," when I heard about it back in 2010 and then I kind of never heard anything more about it. It was a really big story and then just kind of went away.

When we talked to folks from wealthy suburbs outside of Philadelphia, it became very clear to me that one of the key indicators of status is education. It's more important than anything else to people.

The public schools in Lower Merion are really highly rated and people care a ton about the quality of the education and the image of the institution.

What are the real world implications of that?

In this case, the way it played out, some of the things that happened were counterintuitive. Many folks from that community didn't want to see a lawsuit come to bear against their school. It was like, "Oh well, you know, this actually is perhaps going to affect our home values," if you're selling your home and the biggest selling point is the quality of the education.

Blake Robbins, then a high school student in Pennsylvania’s affluent Lower Merion School District, speaks to the press about his 2010 lawsuit alleging covert digital surveillance by educators. (Unrealistic Ideas)

That’s something that you wouldn’t expect to be one of the first reactions to finding out that the schools may be surveilling your kids. But it was, and the fact that the Robbins family had lived in the community for a long time but just weren鈥檛 considered part of the in-group just because of who they were was very interesting and, I think, led to people being skeptical of them.

The documentary leaves it up to you to decide whether that skepticism is deserved or not.

Absolutely. The documentary certainly highlights how people are complex and have complicated stories.

What did you learn about debates over personal privacy, especially when it comes to information about children?

People's expectations of how much privacy you should be afforded, and how much you should expect without having to ask, vary a lot.

Somebody who was interviewed in a news piece that ran in 2010 said, "You know, this is the school district's laptop, they could tap in at any time and rightfully so." I'm a parent, I have a 2-year-old and a 7-year-old who's in first grade. To me, that seems a bit absurd, but the truth is, I think there are certain contexts where a school-issued laptop is going to be surveilled. We know it's going to be surveilled, but we don't expect that it will be able to take pictures in our kids' bedrooms.

To me it's a matter of where are [the] spaces where we should reasonably expect privacy? Transparency is the most important aspect of all of this. Not only were there no conversations going on like, "Hey look, these laptops are going to be surveilled in a number of ways. You should not be leaving them open in your bedroom. You should not be going on any website you wouldn't want your principal to also see," the IT department specifically thought it would be a bad idea if parents and students were alerted to the existence of the software that could take images. They felt like, "Well then we won't be able to recover the stolen laptop because people will just put tape over it."

Well, that is their decision not to have images taken of them in their bedroom, right? One of the journalists we interviewed said it was like trying to kill a fly with a bazooka. This level of surveillance was not required to track inventory. It just wasn't.

Hindsight is 20/20, but it's obvious from what transpired that they spent a lot more money on legal fees and settling these lawsuits than they ever saved by making sure a handful of laptops were not stolen or lost.

What did you learn about the motives of the school district officials, the lawyers and the families involved?

When I'm making a documentary I'm never thinking in terms of quote-unquote good guys and bad guys. Everyone in this story thought they were doing what was best for the students involved. But in the end, I think there was this balance of protecting students' privacy and protecting the image of the school district. When a mistake is made, there is a reluctance to admit it, take responsibility and accept blame. Once you do that, you are admitting to what happened and then there's all these legal ramifications.

Multiple people are like, you know, these kids need therapists, they need somebody to check on them and to be like, "Hey, your privacy was violated, are you doing OK?" and that did not happen.

I can't say why that didn't happen, but to me it seems likely that part of not offering people help is that the minute you say this person needs a therapist because of what we did, you're admitting to a pretty major violation.

The documentary doesn't focus just on the Robbins case. It offers a deep dive into education policy debates around racial inequities, school integration, gender equality and LGBTQ+ rights. What did you find were the implications of surveillance for these populations?

We talked to Elizabeth Laird at the Center for Democracy and Technology and one of the things she said she sees all the time is that when surveillance is ubiquitous and regularly used in education, vulnerable populations end up feeling the brunt of the negative repercussions. 

In this case, back in 2010, people discovered that a disproportionate share of the students who were surveilled were African American. There was a sense that if this technology was being misused to discipline students or to check up on students, then chances are it was going to be misused against a student of color.

When we started talking to students of color who had their images taken, we started to understand, "Oh, there is this whole context to what they're experiencing." Somebody said you can't understand the laptop issue without understanding all these other battles that were happening at the time. There was a history of an achievement gap there, and African-American parents felt like if you wanted to get an equal education for your kids, you had to fight for it. In this context, there was a real lack of trust of the school district by African-American parents.

Keron Williams and his mother really wanted to tell his story. It was a story of somebody suspecting him of stealing a bracelet and him being brought into the principal's office. He says that after they searched his pockets and found nothing but a Boy Scouts handkerchief, his laptop webcam was activated a couple of days later.

There's racial profiling but also this idea of the misuse of technology meant to keep laptops from being stolen. If something like this is misused, vulnerable populations are going to feel the brunt of it more.

That brings me to one of the other stories we talked about, which was more recent. 

In 2020, with the pandemic, school-issued devices and remote learning became the norm. We talked to two students who started high school online, went to classes on Zoom and used their school-issued laptops for everything.

The way they communicated instead of seeing their friends at lunch was through a Google Hangouts chat. What they didn't realize was their school was using monitoring software that essentially scooped up everything they wrote while logged into their school account, including private chats. They were brought to the principal's office and were confronted with what they wrote.

The context of it is that the school decided it was bullying. What we reveal is that they were using the word "gay" because they were. The term they used was "we're a pretty gay friend group. Gay was a descriptor to us."

One of these kids had to come out in the principal's office with his father there. Luckily his parents were pretty great about it, but that's a really awful position to put a kid in and, you know, again, a vulnerable population bearing the brunt of overzealous surveillance.

The goal of this surveillance is to protect kids; it's to make sure kids aren't hurting themselves or hurting other students. There's obviously a mental health crisis going on among high school-aged kids, but there really has to be a discussion about whether these tactics are making the mental health crisis better or worse.

You're talking about the tools that schools nationally have increasingly used to collect and analyze reams of information about students in the name of keeping them safe. This includes tools like Gaggle and GoGuardian. Given the growth in these tools, do any guardrails need to be put in place?

First of all, it’s so important that students know what is being used to surveil any device they’re using. The fact that kids hadn’t heard of Gaggle is really a problem. 

But if they know about it, that doesn't solve all the problems, because what you're asking high schoolers especially to do is to find their own voice, understand how to freely express themselves, to be vulnerable. In some of my best creative writing courses my teachers were saying, "Look, if it scares you to write this, you're probably going in the right direction."

The minute a kid realizes, "Well, everything that I'm writing in a creative writing class, a poem, a personal essay, is going through this software, maybe going to my principal, maybe going to law enforcement," they're going to express themselves differently. That's just a really dangerous road to go down.

Students and parents have to be aware, but I also just think it should be less powerful. Schools shouldn't be able to say there is no way to use their technology, which is all but unavoidable if you're a high school student, without being constantly surveilled.

In Minnesota, in the story we cover, they passed limits on this kind of surveillance. That's a pretty huge step, and I think that'll happen more and more as people become more aware of this stuff.

 There are just places where we should not be allowing this.

Ohio School Districts Use Surveillance Software to Monitor Student Devices

Wed, 28 Aug 2024

This story mentions suicide. If you or someone you know needs support now, call, text or chat the 988 Suicide & Crisis Lifeline.

Ohio's largest school district recently started using surveillance software on students' devices.

Columbus City Schools partnered with Gaggle, a Texas-based student safety technology company that provides constant surveillance, at the end of last school year, district spokesperson Jacqueline Bryant said in an email.

"This is an added layer of security to ensure students are not visiting unapproved sites," she said. "Gaggle employs advanced technology and human insight to review students' use of online tools 24/7/365 days a week and provides real-time analysis, swiftly flagging any potentially concerning behavior or content; this includes signs of self-harm, depression, substance abuse, cyberbullying, or other harmful situations."




Gaggle currently partners with about 1,500 school districts across the country but would not say how many of those districts are in Ohio, Gaggle spokesperson Shelby Goldman said.

"We have a practice to not answer questions about specific school districts," she said in an email.

Ohio's three largest school districts all use Gaggle: Columbus, Cleveland and Cincinnati. Cleveland did not answer questions the Ohio Capital Journal sent about Gaggle.

Cincinnati Public Schools started using Gaggle in 2013 and it is active for all grades, according to the district. It costs the district $323,780 to use Gaggle.

"Cincinnati Public Schools prioritizes the safety and well-being of its students and staff, and utilizes Gaggle to monitor threats for individual student safety and the safety of each school community," according to the district. "The District monitors content on District-provided devices and applications based on specific language and phrases, generating trigger alerts for review, rather than continuous monitoring."

Gaggle, which started in the 1990s, monitors school platforms such as Google Workspace and Microsoft Office 365, but does not look at students' personal email addresses or private social media accounts.

"Gaggle is an early warning system that identifies children in crisis so that schools can intervene before a tragedy happens," Goldman said in an email. "Gaggle partners with school districts to help them monitor student activity on the technology (devices and accounts) provided by the school district."

The company estimates it has helped save students' lives, according to its report from last fall.

"We believe finding the right balance between monitoring for safety purposes and protecting student privacy and confidentiality is important, and we're committed to continuing to support districts in achieving both," Goldman said in an email.

Gaggle uses artificial intelligence to spot things that could be an issue, and a human safety team reviews them before contacting the school.

"Our reviewers are looking at the context to determine if an item is related to an actual concern or maybe a simple reference to something that is harmless when in context," Goldman said in an email.

Gaggle can flag things as early warning signs or as an imminent threat, which is treated with a higher level of urgency. It alerted Ohio school districts to 1,275 student incidents that required immediate intervention in 2021, according to one report.

Columbus City Schools, which has about 47,000 students, is implementing Gaggle in middle and high schools. Students can't opt out of it.

The district signed two contracts with Gaggle: the first for $58,492.40 in January and the second for $99,180 in June, according to school board documents.

During the district's Gaggle pilot from April 2022 to December 2023, Gaggle's safety team reviewed 3,942 pieces of content, which led to 226 "actionable student safety concerns that were sent to Emergency Contacts," according to a school board document.

Even though Sharon Kim's two children are in elementary school and won't yet be affected by the district's Gaggle implementation, she is concerned about the district using surveillance technology.

"School should be a safe place for our kids," Kim said. "They spend so much time in their lives at school, it should be a place where they feel safe, not where they feel like they're being monitored and surveilled every single minute of the day. I really feel that this kind of surveillance is a huge hindrance to that."

Ohio Capital Journal is part of States Newsroom, a nonprofit news network supported by grants and a coalition of donors as a 501(c)(3) public charity. Ohio Capital Journal maintains editorial independence. Contact Editor David Dewitt with questions: info@ohiocapitaljournal.com.

Room Scans & Eye Detectors: Robocops are Watching Your Kids Take Online Exams

Thu, 18 Apr 2024

Remote proctoring tools like Proctorio have faced widespread pushback at colleges. Their use in K-12 schools has drawn far less scrutiny and awareness.

Updated, correction appended April 18

In the middle of the night, students at Utah's Kings Peak High School are wide awake, taking mandatory exams.

At this online-only school, which opened during the pandemic and has operated ever since, students take tests from their homes at times that work best with their schedules. Principal Ammon Wiemers says it's this flexibility that attracts students, including athletes and teens with part-time jobs, from across the state.

"Students have 24/7 access but that doesn't mean the teachers are going to be there 24/7," Wiemers told The 74 with a chuckle. "Sometimes [students] expect that but no, our teachers work a traditional 8 to 4 schedule."

Any student who feels compelled to cheat while their teacher is sound asleep, however, should know they're still being watched.

For students, the cost of round-the-clock convenience is their privacy. During exams, their every movement is captured on their computer's webcam and scrutinized by Proctorio, a remote proctoring service. Proctorio software conducts "desk scans" in a bid to catch test-takers who turn to "unauthorized resources," "face detection" technology to ensure there isn't anybody else in the room to help and "gaze detection" to spot anybody "looking away from the screen for an extended period of time."

Proctorio then provides visual and audio records to Kings Peak teachers, with the algorithm calling particular attention to pupils whose behaviors during the test flagged them as possibly engaging in academic dishonesty.

Such remote proctoring tools grew exponentially during the pandemic, particularly at U.S. colleges and universities, where administrators seeking to ensure exam integrity during remote learning met with sharp resistance from students. Students petitioned online to end the surveillance regime, accusing the tools of bias after they failed to detect Black students' faces.

A video uploaded to TikTok offers advice on how to cheat during exams that are monitored by Proctorio. (Screenshot)

At the same time, social media platforms like TikTok were flooded with videos purportedly highlighting service vulnerabilities that taught others how to cheat.

K-12 schools' use of remote proctoring tools, however, has largely gone under the radar. Nearly a year since the federal public health emergency expired, and several years since the vast majority of students returned to in-person learning, an analysis by The 74 has revealed that K-12 schools nationwide, and online-only programs in particular, continue to use tools from digital proctoring companies on students, including those as young as kindergarten.

Previously unreleased survey results from the nonprofit Center for Democracy and Technology found that remote proctoring in K-12 schools has become widespread. In its August 2023 survey, 36% of teachers reported that their school uses the surveillance software.

Civil rights activists, who contend AI proctoring tools fail to work as intended, harbor biases and run afoul of students' constitutional protections, said the privacy and security concerns are particularly salient for young children and teens, who may not be fully aware of the monitoring or its implications.

"It's the same theme we always come back to with student surveillance: It's not an effective tool for what it's being claimed to be effective for," said Chad Marlow, senior policy counsel at the American Civil Liberties Union. "But it actually produces real harms for students."

It's always strange in a virtual setting; it's like you're watching yourself take the test in the mirror.

Ammon Wiemers, principal, Kings Peak High School

Wiemers is aware that the school, where about 280 students are enrolled full time and another 1,500 take courses part time, must make a delicate "compromise between a valid testing environment and students' privacy." When students are first subjected to the software, he said, "it's kind of weird to see that a camera is watching," but unlike the uproar at colleges, he said the monitoring has become "normalized" among his students, and anybody with privacy concerns is allowed to take their tests in person.

"It's always strange in a virtual setting; it's like you're watching yourself take the test in the mirror," he said. "But when students use it more, they get used to it."

Children 'don't take tests'

Late last year, Proctorio founder and CEO Mike Olsen published a blog post in response to research critical of the company's efficacy. A tech-savvy Ohio college student had conducted an analysis and concluded Proctorio's face detection relied on an open-source software library with a high failure rate, including a failure to recognize Black faces more than half of the time.

The student tested the company's face-detection capabilities against the FairFace dataset of nearly 11,000 images, which depicted people of multiple races and ethnicities, with results showing a failure to distinguish Black faces 57% of the time, Middle Eastern faces 41% of the time and white faces 40% of the time. Such a high failure rate was problematic for Proctorio, which relies on its ability to flag cheaters by zeroing in on people's facial features and movements.

Olsen's post sought to discredit the research, arguing that while the FairFace dataset had been used to identify biases in other facial-detection algorithms, the images weren't representative of "a live test-taker's remote exam experience."

"For example," he wrote, "children and cartoons don't take tests so including those images as part of the data set is unrealistic and unrepresentative."

Proctorio founder and CEO Mike Olsen published a blog post that countered research claiming the remote proctoring tool had a high failure rate, especially for Black students. (Screenshot)

To Ian Linkletter, a librarian from Canada embroiled in a long-running battle with Proctorio over whether its products were harmful, Olsen's response was baffling. Sure, cartoon characters don't take tests. But children, he said, certainly do. What he wasn't sure about, however, was whether those younger test-takers were being monitored by Proctorio, so he set out to find out.

He found two instances, both in Texas, where Proctorio was being used in the K-12 setting, including at a remote school tied to the University of Texas at Austin. Linkletter shared his findings with The 74, which used the government procurement tool GovSpend to identify other districts that have contracts with Proctorio and its competitors.

More than 100 K-12 school districts have relied on Proctorio and its competitors, according to the GovSpend data, with a majority of expenditures made during the height of the pandemic. And while remote learning has become a more integral part of K-12 schooling nationwide, seven districts have paid for remote proctoring services in the last year. While extensive, the GovSpend database doesn't provide a complete snapshot of U.S. school districts or their expenditures.

"It was just obvious that Proctorio had K-12 clients and were being misleading about children under 18 using their product," Linkletter said, adding that young people could be more susceptible to the potential harms of persistent surveillance. "It's almost like a human rights issue when you're imposing it on students, especially on K-12 students." Young children, he argued, are unable to truly consent to being monitored by the software and may not fully understand its potential ramifications.

Proctorio did not respond to multiple requests for comment from The 74. Founded in 2013, the company claims it provided remote proctoring services during the height of the pandemic to education institutions globally.

In 2020, Proctorio sued Linkletter over a series of tweets in which the then-University of British Columbia learning technology specialist linked to Proctorio-produced YouTube videos, which the company had made available to instructors. Citing the video on the tool's "Abnormal Eye Movement" function, Linkletter wrote that it showed "the emotional harm you are doing to students by using this technology."

Proctorio's lawsuit alleged that Linkletter's use of the company's videos, which were unlisted and could be viewed only by those with the link, amounted to copyright infringement and distribution of confidential material. In January, Canada's Supreme Court rejected Linkletter's claim that the litigation was specifically designed to silence him.

While there is little independent research on the efficacy of remote proctoring tools in preventing cheating, one 2021 study found that the software failed to flag students who had been instructed to cheat. Researchers concluded the software is "best compared to taking a placebo: It has some positive influence, not because it works but because people believe that it works, or that it might work."

Remote proctoring costs K-12 schools millions

A rubric posted by UT High School, the online K-12 school operated by the University of Texas, indicates that Proctorio is used for Credit by Exam tests, which award course credit to students who can demonstrate mastery in a particular subject. For students in kindergarten, first and second grade, the district pairs district proctoring with a "Proctorio Secure Browser," which prohibits test-takers from leaving the online exam to use other websites or programs. Beginning in third grade, according to the rubric uploaded to the school's website, test-takers are required to use Proctorio's remote online proctoring.

A UT High School rubric explains how it uses Proctorio software. (Screenshot)

Proctorio isn't the only remote proctoring tool in use in K-12 schools. GovSpend data indicate the school district in Las Vegas, Nevada, has spent more than $1.4 million since 2018 on contracts with Proctorio competitor Honorlock. Spending on Honorlock by the Clark County School District surged during the pandemic, but as recently as October, the district made a $286,000 purchase from the company. GovSpend records indicate the tool is used at the district's online-only program, which claims more than 4,500 elementary, middle and high school students. Clark County school officials didn't respond to questions about how Honorlock is being utilized.

Meanwhile, dozens of K-12 school districts relied on the remote proctoring service ProctorU, now known as Meazure Learning, during the pandemic, records indicate, with several maintaining contracts after school closures subsided. Among them is the rural Watertown School District in South Dakota, which spent $18,000 on the service last fall.

Aside from Wiemers, representatives for schools mentioned in this story didn't respond to interview requests or declined to comment. Meazure Learning and Honorlock didn't respond to media inquiries.

At TTU K-12, an online education program offered by Texas Tech University, the institution relies on Proctorio for "all online courses and Credit by Examinations," flagging suspicious activity to teachers for review. In an apparent nod to privacy concerns, TTU instructs students to select private spaces for exams and, if they are testing in a private home, to get the permission of anyone else residing there before the test is recorded.

Documents indicate that K-12 institutions continue to subject remote learners to room scans even after a federal judge ruled a university's scans unconstitutional. In 2022, a federal judge sided with a Cleveland State University student who alleged that a room scan taken before an online exam at the Ohio institution violated his Fourth Amendment rights against unreasonable searches and seizures. The judge ruled that the scan was "unreasonable," adding that "room scans go where people otherwise would not, at least not without a warrant or an invitation."


Marlow of the ACLU says he finds room scans particularly troubling, especially in the K-12 context. From an equity perspective, he said, such scans could have disproportionately negative effects on undocumented students, those living with undocumented family members and students living in poverty. He expressed concerns that information collected during room scans could be used as evidence for immigration enforcement.

"There are two fairly important groups of vulnerable students, undocumented families and poor students, who may not feel that they can participate in these classes because they either think it's legally dangerous or they're embarrassed to use the software," he said.

The TTU web page notes that students "may be randomly asked to perform a room scan," in which they're instructed to give their webcam a 360-degree view of the exam environment, with a warning that failure to perform proper scans could result in a violation of exam procedures.

"If you're using a desktop computer with a built-in webcam, it might be difficult to lift and rotate the entire computer," the web page notes while offering a solution. "You can either rotate a mirror in front of the webcam or ask your instructor for further instruction."

'A legitimate concern'

Wiemers, the principal in Utah, said that Proctorio serves as a deterrent against cheating but is far from foolproof.

"There's ways to cheat any software," he said, adding that educators should avoid the urge to respond to Proctorio alerts with swift discipline. In the instances where Proctorio has caught students cheating, he said, instead of being given a failing grade, they're simply asked to retake the test.

"There are limitations to the software, we have to admit that; it's not perfect, not even close," he said. "But if we expect it to be, and the stakes are high and we're overly punitive, I would say [students] have a legitimate concern."

During a TTU K-12 advisory board meeting in July 2021, administrators outlined the extent to which Proctorio is used during exams. Justin Louder, who at the time served as the TTU K-12 interim superintendent, noted that teachers and a “handful of administrators within my office” had access to view the recordings. Ensuring that third parties didn’t have access to the video feeds was “a big deal for us,” he said, because they’re “dealing with minors.”

While college students “really kind of pushed back” on remote proctoring, he noted, his office received only a few complaints from K-12 parents, who recognized the service offered scheduling benefits. Like Wiemers, he framed the issue as one of 24-hour convenience.

“It lets students go at their own pace,” he said. “If they’re ready at 2 o’clock in the morning, they can test at 2 o’clock in the morning.”

Correction: A copyright infringement case brought by Proctorio against longtime company critic Ian Linkletter is still being argued in court. An earlier version of this story mischaracterized the litigation as being ruled in Proctorio’s favor.

Exclusive: Dems Urge Federal Action on Student Surveillance Citing Bias Fears (Oct. 19, 2023)

A coalition of Democratic lawmakers on Thursday called on the U.S. Education Department to investigate school districts that use digital surveillance and other artificial intelligence tools in ways that trample students’ civil rights.

In a letter to the agency, the coalition expressed concerns that AI-enabled student monitoring tools could foster discrimination against marginalized groups, including LGBTQ+ youth and students with disabilities. The Education Department’s Office for Civil Rights should issue guidance on the appropriate uses of emerging classroom technologies, the lawmakers wrote, and crack down on practices that run afoul of existing federal anti-discrimination laws.

“While the expansion of educational technology helped facilitate remote learning that was critical to students, parents and teachers during the pandemic,” the lawmakers wrote, “these technologies have also amplified student harms.”


Lawmakers asked the Education Department’s civil rights office whether it has received complaints alleging discrimination facilitated by education technology software and whether it has taken any enforcement action related to potential civil rights violations.

The letter comes in response to a recent national survey of educators, parents and students, the findings of which suggest that schools’ use of digital tools to monitor children online has had disproportionate negative effects on students based on their race, disability, sexual orientation and gender identity. The survey, conducted by the nonprofit Center for Democracy and Technology, found that while activity monitoring has become ubiquitous in schools and is intended to keep students safe, it’s used regularly as a discipline tool and routinely brings youth into contact with the police.

Findings from the CDT survey, lawmakers wrote, “raise serious concerns about the application of civil rights laws to schools’ use of these technologies.” Letter signatories include Democratic Reps. Lori Trahan of Massachusetts, Sara Jacobs of California, Hank Johnson of Georgia, Bonnie Watson Coleman of New Jersey and Adam Schiff of California. Trahan, who serves on the House Energy and Commerce Committee’s Innovation, Data and Commerce Subcommittee, has previously called for tighter student data privacy protections in the ed tech sector.

The monitoring tools, such as those offered by for-profit companies GoGuardian and Gaggle, rely on artificial intelligence to sift through students’ online activities and alert school administrators — and sometimes the police — when they discover materials related to sex, drugs, violence or self-harm.

Two-thirds of teachers reported that a student at their school was disciplined as a result of activity monitoring and a third said they know a student who was contacted by the police because of an alert generated by the software. 

Children with disabilities were more likely than their peers to report being watched, and special education teachers reported heightened rates of discipline as a result of activity monitoring. The findings, researchers argue, raise concerns under federal laws that entitle children with disabilities to equal access to an education. Even beyond the technologies, students with disabilities are subjected to disproportionate levels of school discipline, including restraint and seclusion, when compared to their general education peers.

Half of all students said their schools responded fairly to alerts generated by monitoring software, a sentiment shared by just 36% of LGBTQ+ youth. In fact, LGBTQ+ youth were more likely than their straight and cisgender peers to report that they or someone they know was disciplined as a result of monitoring. And nearly a third of LGBTQ+ youth reported that they or someone they know was outed because of the technology. 

More than a third of teachers said their school monitors students’ online behaviors outside of school hours — and sometimes on their personal devices.

In a similar student survey, released this month by the American Civil Liberties Union, a majority of respondents expressed worries that the monitoring tools — despite being designed to keep them safe — could actually cause harm and a third said they “always feel” like they’re being watched.

社区黑料 has reported extensively on schools’ use of digital surveillance tools to monitor students’ online behaviors, and the tools’ implications for youth civil rights. The company Gaggle previously flagged to administrators student communications that referenced LGBTQ+ keywords like “gay” and “lesbian.” The company says it halted the practice last year in the wake of pushback from civil rights activists.

Given the survey findings, the lawmakers urged the Education Department to clarify “how educators can fulfill their civil rights obligations” as they develop policies related to artificial intelligence, whose rapidly evolving role in education more broadly — including students’ use of tools like ChatGPT — has become a topic of debate.

“This research is particularly concerning due to linkages between school disciplinary policies and incarceration rates of our nation’s youth,” the coalition wrote, adding concerns that the tools can create hostile learning environments.

White House Cautions Schools Against ‘Continuous Surveillance’ of Students (Oct. 4, 2022)

Updated, Oct. 5

The Biden administration on Tuesday urged school districts nationwide to refrain from subjecting students to “continuous surveillance” if the use of digital monitoring tools — already accused of targeting at-risk youth — is likely to trample students’ rights.

The White House recommendation was included in an in-depth but non-binding white paper, dubbed the Blueprint for an AI Bill of Rights, that seeks to rein in the potential harms of rapidly advancing artificial intelligence technologies, from smart speakers featuring voice assistants to campus surveillance cameras with facial recognition capabilities.

The blueprint, which was released by the White House Office of Science and Technology Policy and extends far beyond the education sector, lays out five principles: Tools that rely on artificial intelligence should be safe and effective, avoid discrimination, ensure reasonable privacy protections, be transparent about their practices and offer the ability to opt out “in favor of a human alternative.”


Though the blueprint lacks enforcement, schools and education technology companies should expect greater federal scrutiny soon: the White House announced that the Education Department would release, by early 2023, recommendations on schools’ use of artificial intelligence that “define specifications for the safety, fairness and efficacy of AI models used within education” and introduce “guardrails that build on existing education data privacy regulations.”

Education Secretary Miguel Cardona said officials at the department “embrace utilizing Ed Tech to enhance learning” but recognize “the need for us to change how we do business.” The future guidance, he said, will focus on student data protections, ensuring that digital tools are free of biases and incorporate transparency so parents know how their children’s information is being used.

“This has to be baked into how we do business in education, starting with the systems that we have in our districts but also teacher preparation and teacher training as well,” he said.

Amelia Vance, president and founder of Public Interest Privacy Consulting, said the document amounts to a “massive step forward for the advocacy community, the scholars who have been working on AI and have been pressuring the government and companies to do better.”

The blueprint, which offers a harsh critique of systems that predict student success based on factors like poverty, follows in-depth reporting by 社区黑料 on schools’ growing use of digital surveillance and the tech’s impact on student privacy and civil rights.

But local school leaders should ultimately decide whether to use digital student monitoring tools, said Noelle Ellerson Ng, associate executive director of advocacy and governance at AASA, The School Superintendents Association. Ellerson Ng opposes “unilateral federal action to prohibit” the software.

“That’s not the appropriate role of the federal government to come and say this cannot happen,” she said. “But smart guardrails that allow for good practices, that protect students’ safety and privacy, that’s a more appropriate role.”

The nonprofit Center for Democracy and Technology praised the report. The group recently released a survey highlighting the potential harms of student activity monitoring on at-risk youth, who are already disproportionately disciplined and referred to the police as a result. In a statement Tuesday, it said the blueprint makes clear “the ways in which algorithmic systems can deepen inequality.”

“We commend the White House for considering the diverse ways in which discrimination can occur, for challenging inappropriate and irrelevant data uses and for lifting up examples of practical steps that companies and agencies can take to reduce harm,” CEO Alexandra Reeve Givens said in a media release.

The document also highlights several areas where artificial intelligence has been beneficial, including improved agricultural efficiency and algorithms that have been used to identify diseases. But the technologies, which have grown rapidly with few regulations, have introduced significant harm, it notes, including tools that screen job applicants and facial recognition technology.

After the pandemic shuttered schools nationwide in early 2020 and pushed students into makeshift remote learning, companies that sell digital activity monitoring software to schools saw an increase in business. But the tools have faced significant backlash for subjecting students to relentless digital surveillance. 

In April, Massachusetts Sens. Elizabeth Warren and Ed Markey warned in a report that the technology could carry significant risks — particularly for students of color and LGBTQ youth — and promoted a “need for federal action to protect students’ civil rights, safety and privacy.” Such concerns have become particularly acute as states implement new anti-LGBTQ laws and abortion bans and advocates warn that digital surveillance tools could expose youth to legal peril.

Vance said that she and others focused on education and privacy “had no idea this was coming,” or that it would focus so heavily on schools. Over the last year, the department sought input from civil rights groups and technology companies, but Vance said that education groups had lacked a meaningful seat at the table.

The lack of engagement was apparent, she said, in the document’s failure to highlight areas where artificial intelligence has been beneficial to students and schools. For example, the document discusses a tool used by universities to predict which students were likely to drop out. It considered students’ race as a predictive factor, leading to discrimination fears. But she noted that if implemented equitably, such tools can be used to improve student outcomes.

“Of course there are a lot of privacy and equity and ethical landmines in this area,” Vance said. “But we also have schools who have done this right, who have done a great job in using some of these systems to assist humans in counseling students and helping more students graduate.”

Ellerson Ng, of the superintendents association, said her group is still analyzing the blueprint’s on-the-ground implications, but that student data privacy efforts present schools with “a balancing act.”

“You want to absolutely secure the privacy rights of the child while understanding that the data that can be generated, or is generated, has a role to play, too, in helping us understand where kids are, what kids are doing, how a program is or isn’t working,” she said. “Sometimes that’s broader than just a pure academic indicator.”

Others have dismissed the blueprint as little more than a restatement of recommendations from civil rights groups and tech companies. Some of the most outspoken privacy proponents and digital surveillance critics, such as Albert Fox Cahn, founder and executive director of the Surveillance Technology Oversight Project, argued it falls short of a critical policy move: outright bans.

As Cahn and other activists mount campaigns against student surveillance tools, they’ve highlighted how student data can wind up in the hands of the police.

“When police and companies are rolling out new and destructive forms of AI every day, we need to push pause across the board on the most invasive technologies,” he said in a media release. “While the White House does take aim at some of the worst offenders, they do far too little to address the everyday threats of AI, particularly in police hands.”

Trevor Project Severs Ties with Surveillance Company Accused of LGBTQ Youth Bias (Sept. 30, 2022)

Updated 3:15 p.m. ET

Hours after the publication of this article Friday, The Trevor Project announced in a tweet it would return a $25,000 donation from the student surveillance company Gaggle, acknowledging widespread concerns about the monitoring tool’s “role in negatively impacting LGBTQ students.”

“Our philosophy is that having a seat at the table enables us to positively influence how companies engage with LGBTQ young people, and we initially agreed to work with Gaggle because we saw an opportunity to have a meaningful impact to better protect LGBTQ students,” the nonprofit said in the statement. “We hear and understand the concerns, and we hope to work alongside schools and institutions to ensure they are appropriately supporting LGBTQ youth and their mental health.”

The move came after widespread condemnation on social media, with multiple supporters threatening to pull their donations to The Trevor Project moving forward. 

In a Friday statement, Gaggle spokesperson Paget Hetherington said the company wanted The Trevor Project’s “guidance on how to do what we do better.” The company also removed language from its website where it previously touted the partnership.

“We’re disappointed that The Trevor Project has decided to pause our collaboration,” she said. “However, we are grateful for the opportunity we have had to learn and work with them and will continue with our mission of protecting all students regardless of how they identify.”

Original report below:

Amid warnings from lawmakers and civil rights groups that digital surveillance tools could discriminate against at-risk students, a leading nonprofit devoted to the mental well-being of LGBTQ youth has formed a financial partnership with a tech company that subjects them to persistent online monitoring. 

The Trevor Project, a high-profile nonprofit focused on suicide prevention among LGBTQ youth, began to list Gaggle as a corporate partner on its website, disclosing that the controversial surveillance company had given it between $25,000 and $50,000 in support. Meanwhile Gaggle, which uses artificial intelligence and human content moderators to sift through billions of student chat messages and homework assignments each year in search of students who may harm themselves or others, said on its website that the two were collaborating to “improve mental health outcomes for LGBTQ young people.”

Though the precise contours of the partnership remain unclear, a Trevor Project spokesperson said it aims to have a positive influence on the way Gaggle navigates privacy concerns involving LGBTQ youth while a Gaggle representative said the company sees the relationship as a learning opportunity.

Both groups maintain that the partnership was forged in the interests of LGBTQ students, but student privacy advocates argue the relationship could undermine The Trevor Project’s work while allowing Gaggle to use the donation to counter criticism about its potential harms to LGBTQ students. The collaboration comes at a particularly perilous time for many students as a rash of states implement new anti-LGBTQ laws that could erode their privacy and expose them to legal jeopardy.

Teeth Logsdon-Wallace, a 14-year-old student from Minneapolis with first-hand experience of Gaggle’s surveillance dragnet, said the deal could eliminate any motivation for Gaggle to change its business practices.

“It really does feel like a ‘We paid you, now say we’re fine,’ kind of thing,” said Logsdon-Wallace, who is transgender. Without any real incentives to implement reforms, he said that Gaggle’s “seal of approval” from The Trevor Project could offer the privately held company reputational cover amid growing concerns that such surveillance tech is disproportionately harmful to LGBTQ youth.

“People who want to defend Gaggle can just point to their little Trevor Project thing and say, ‘See, they have the support of “The Gays” so it’s fine actually,’ and all it does is make it easier to deflect and defend actual issues with Gaggle.”

A screenshot showing that Gaggle is a corporate partner of The Trevor Project
Student surveillance company Gaggle is listed among “Corporate Partners” on The Trevor Project’s website (screenshot)

社区黑料 previously investigated Gaggle’s monitoring practices. The company’s algorithm relies on keyword matching to compare students’ online communications against a dictionary of thousands of words the company believes could indicate potential trouble, including references to violence, drugs and sex. Among the keywords are “gay” and “lesbian,” verbiage the company maintains is necessary because LGBTQ youth are more likely than their straight and cisgender peers to consider suicide.

But privacy and civil rights advocates have accused the company of discrimination by subjecting LGBTQ youth to heightened surveillance — a concern that has taken on new meaning this year as states like Florida adopt laws that ban classroom discussions about sexuality and could out LGBTQ youth to their parents.

A survey by the nonprofit Center for Democracy and Technology found that while Gaggle and similar student monitoring tools are designed to keep students safe, teachers reported that they were more often used to discipline students. LGBTQ youth were disproportionately affected.

In a statement, a Trevor Project spokesperson said it’s important that digital monitoring tools keep students safe without invading their privacy and that the collaboration was built on Gaggle’s “desire to identify and address privacy and safety concerns that their product could cause for LGBTQ students.”

“It’s true that LGBTQ youth are among the most vulnerable to the misuse of this kind of safety monitoring — many worry that these tools could out them to teachers or parents against their will,” the statement continued. “It is because of that very real concern that we have worked in a limited capacity with digital safety companies — to play an educational role and have a seat at the table so they can consider these potential risks while they design their products and develop policies.”

But it remains unclear what policy changes have occurred at Gaggle as a result of the deal. Without offering any specifics, Gaggle spokesperson Paget Hetherington said in a statement the company is “honored to be able to align with The Trevor Project to better serve LGBTQ youth,” and that the company is “always looking for ways to learn and to improve upon what we do to better support students and keep them safe.”

‘Faceless bureaucracy’

At its core, the partnership between Gaggle and The Trevor Project makes sense because both work to prevent youth suicides, said Amelia Vance, the founder and president of Public Interest Privacy Consulting. But their approaches to solving the problem, she said, are fundamentally different.

By combing through digital materials on students’ school-issued Microsoft and Google accounts, Gaggle seeks to alert educators — and in some cases the police — of students’ online behaviors that suggest they might harm themselves or others.

“It really is about collecting details that kids may not be voluntarily sharing — information that they may be looking up to learn, to explore their identities, to otherwise help them in their day-to-day lives,” Vance said. At The Trevor Project, “you have proactive outreach from youth who know that they need help or they need a community.”

Katy Perry smiles in front of a Trevor Project background, holding a poster that says "Be proud of who you are."
Katy Perry poses for a photograph during a fundraising event for The Trevor Project in 2012. (Mark Davis/Getty Images for Trevor Project)

The West Hollywood-based Trevor Project, which receives funding from corporate donors including Macy’s and AT&T, was founded in 1998. Gaggle, founded in 1999, does not publicly report its finances. The Dallas-based company says it monitors the digital communications of more than 5 million students across more than 1,500 school districts nationally.

The Trevor Project uses technology to train volunteer crisis counselors and assess the risk levels of people who reach out to it for help. If counselors with The Trevor Project believe a student is at imminent suicide risk, they may decide to call the police. But it’s ultimately up to youth to decide which information they share with adults.

It’s important for LGBTQ students to have trusting adults with whom they can confide their experiences, Vance said, rather than a system where “some faceless bureaucracy is finding out and informing your parents” about information they intended to keep private.

A recent survey by The Trevor Project offers troubling data about the realities of the youth suicide crisis. Nearly half of LGBTQ youth said they seriously considered attempting suicide in the past year and 14% said they made a suicide attempt.

This isn’t the first time The Trevor Project has faced scrutiny in recent months for its ties to companies that could have detrimental effects on LGBTQ youth. In July, a HuffPost investigation revealed that CEO and Executive Director Amit Paley previously worked as a consultant and helped create a strategic plan to boost opioid sales amid an addiction epidemic — one that has been implicated in suicide attempts among LGBTQ youth.

The group knows firsthand how data can be weaponized. Just last month, online trolls that target the transgender community launched a campaign to clog up The Trevor Project’s suicide prevention hotline.

Persistent student surveillance could exacerbate the challenges that LGBTQ youth face by subjecting them to disproportionate discipline and erroneously flagging their online communications as threats, Democratic Sens. Elizabeth Warren and Ed Markey warned in an April report.

Nearly a third of LGBTQ students say they or someone they know has experienced the nonconsensual disclosure of their sexual orientation or gender identity — typically called “outing” — due to student activity monitoring, according to a survey by the nonprofit Center for Democracy and Technology. They were also more likely than their straight and cisgender peers to report getting into trouble at school and being contacted by the police about having committed a crime.

A bar chart showing LGBTQ+ students are more likely to get in trouble for visiting a website or saying something inappropriate online; were more likely to be contacted by counselors or other adults at school about their mental health; and were more likely to be contacted by a police officer or other adult due to concerns about them committing a crime.
A recent survey by the nonprofit Center for Democracy and Technology found that student monitoring tools have disproportionate negative effects on LGBTQ youth. (Center for Democracy and Technology) 

In response to the survey results, a coalition of civil rights groups called on the U.S. Education Department to condemn the use of activity monitoring tools that violate students’ civil liberties and to state its intent “to take enforcement action against violations that result in discrimination.” The letter argues that using the tools to out LGBTQ students or to subject them to disproportionate discipline and criminal investigations could violate Title IX, the federal law prohibiting sex-based discrimination in schools.

Among the letter signatories is the nonprofit LGBT Tech, which has raised alarms about the harms of digital surveillance on LGBTQ people. Christopher Wood, the group’s co-founder and executive director, said The Trevor Project’s partnership with Gaggle could be positive if it’s used to ensure that LGBTQ youth who are struggling have access to help. But once Gaggle gives student information to school administrators, the company can no longer control how those records are used, he said.

A screenshot from Gaggle's website. Gray box with text that says Gaggle is a Proud Sponsor of The Trevor Project.
Gaggle says on its website that the student surveillance company “is proud to collaborate with The Trevor Project and improve mental health outcomes for LGBTQ young people.” (Screenshot)

“If that information is provided to someone who is not accepting, who has very different views and who willfully brings their political, personal or religious views into the school system, and they are not supportive of LGBTQ youth, then what they’ve done is harm the student,” Wood said.

Yet as schools increasingly turned to student activity monitoring software during the pandemic, The Trevor Project portrayed the tools’ growth as an inevitable result of districts seeking “to avoid liability issues.”

“It is our stance that since these tools are not going anywhere, we think it’s important to do our part to offer our expertise around LGBTQ experiences,” the spokesperson said.

A student holds up a peace sign with one hand and has the other wrapped around his dog
Minneapolis student Teeth Logsdon-Wallace poses with his dog Gilly. (Photo courtesy Alexis Logsdon)

The power of trust

In interviews, students flagged by Gaggle said their trust in adults suffered as a result. Among them is Logsdon-Wallace, the 14-year-old transgender student. Before the Minneapolis school district stopped using Gaggle this summer and state lawmakers put strict limits on digital surveillance in schools, the tool alerted district security when he used a classroom assignment to reflect on a previous suicide attempt and how music therapy helped him cope. That same assignment, which included references to his gender identity, was flagged to his parents. 

And while his parents are affirming, he has friends who live in less supportive environments.                                                                                                       

“I have friends who are queer and/or trans who are out at school but not to their parents,” he said. “If they want to be open with teachers, Gaggle can create a bad or even dangerous situation for these kids if their parents were contacted about what they were saying.”

In The Trevor Project’s recent survey, nearly three-quarters of LGBTQ youth reported that they have endured discrimination based on their sexual orientation or gender identity; just 37% said their homes are affirming and 55% said the same about their schools.

Given that reality, relatively few reported sharing information about their sexual orientation with teachers or guidance counselors.

While Gaggle has maintained that keywords like “gay” and “lesbian” can also prevent bullying, Logsdon-Wallace said the company’s approach is out of touch with how students generally interact. At school, he said he’s been called just about every “slur for a queer or a trans person that isn’t from like 80 years ago.” While slurs are common, terms like “lesbian” are not.

“As an actual teenager going to an actual public school, those words are not being used to bully people,” he said. “They’re just not.”


Survey Reveals Extent that Cops Surveil Students Online — in School and at Home (Aug. 3, 2022)

When Baltimore students sign into their school-issued laptops, the police log on, too.

Since the pandemic began, Baltimore City Public Schools officials have contracted with GoGuardian, a digital surveillance tool that promises to identify youth at risk of harming themselves or others. When GoGuardian flags students, their online activities are shared automatically with school police, giving cops a conduit into kids’ private lives — including on nights and weekends.


Such partnerships between schools and police appear startlingly widespread across the country, with significant implications for youth, according to a new survey by the nonprofit Center for Democracy and Technology. Nearly all teachers — 89% — reported that digital student monitoring tools like GoGuardian are used in their schools. And nearly half — 44% — said students have been contacted by the police as a result of student monitoring.

The pandemic has led to major growth in the number of schools that rely on activity monitoring software to uncover student references to depression and violent impulses. The tools, offered by a handful of tech companies, can sift through students’ social media posts, follow their digital movements in real time and scan files on school-issued laptops — from classroom assignments to journal entries — in search of warning signs.

Educators say the tools help them identify youth who are struggling and get them the mental health care they need at a time when youth depression and anxiety are spiraling. But the survey suggests an alternate reality: Instead of getting help, many students are being punished for breaking school rules. And in some cases, survey results suggest, students are being subjected to discrimination. 

The report raises serious questions about whether digital surveillance tools are the best way to identify youth in need of mental health care and whether police officers should be on the front lines in responding to such emergencies. 

“If we’re saying this is to keep students safe, but instead we’re using it punitively and we’re using it to invite law enforcement literally into kids’ homes, is this actually achieving its intended goal?” asked Elizabeth Laird, a survey author and the center’s director of equity in civic technology. “Or are we, in the name of keeping students safe, actually endangering them?”

Among teachers who use monitoring tools at their schools, 78% said the software has been used to flag students for discipline and 59% said kids wound up getting punished as a result. Yet just 45% of teachers said the software is used to identify violent threats and 47% said it is used to identify students at risk of harming themselves. 

(Chart: Center for Democracy and Technology)

The findings are a direct contradiction of the stated goal of student activity monitoring, Laird said. School leaders and company executives have long maintained that the tools are not a disciplinary measure but are designed to identify at-risk students before someone gets hurt.

The Supreme Court's recent decision overturning Roe v. Wade, she said, further muddles police officers' role in student activity monitoring. As states implement anti-abortion laws, data from student activity monitoring tools could help the police identify youth seeking reproductive health care.

"We know that law enforcement gets these alerts," she said. "If you are in a state where they are looking to investigate these kinds of incidents, you've invited them into a student's house to be able to do that."

A tale of discrimination

In Baltimore, counselors, principals and school-based police officers receive all alerts generated by GoGuardian during school hours, according to reporting by The Real News Network, a nonprofit media outlet. Outside of school hours, including on weekends and holidays, the responsibility to monitor alerts falls on the police, the outlet reported, and on numerous occasions officers have shown up at students' homes to conduct wellness checks. On some occasions, students have been transported to the hospital for emergency mental health care.

In a statement to The 74, district spokesperson Andre Riley said that GoGuardian helps officials "identify potential risks to the safety of individual students, groups or schools," and that "proper accountability measures are taken" if students violate the code of conduct or break laws.

"The use of GoGuardian is not simply a prompt for a law enforcement response," Riley added.

Leading student surveillance companies, including GoGuardian, have maintained that their interactions with police are limited. In April, Democratic Sens. Elizabeth Warren and Ed Markey warned in a report that schools' reliance on the tools could violate students' civil rights and exacerbate "the school-to-prison pipeline by increasing law enforcement interactions with students." Warren and Markey focused their report on four companies: GoGuardian, Gaggle, Securly and Bark.

In its response, Gaggle executives said the company contacts law enforcement for wellness checks if they are unable to reach school-based emergency contacts and a child appears to be "in immediate danger." In materials on the company's website, school officials in Wichita Falls, Texas; Cincinnati, Ohio; and Miami, Florida, acknowledged contacting police in response to Gaggle alerts.

In some cases, school leaders ask Securly to contact the police directly and request they conduct welfare checks on students, the company told lawmakers. Executives at Bark said "there are limited options" beyond police intervention if they identify a student in crisis but cannot reach a school administrator.

"While we have witnessed many lives saved by police in these situations, unfortunately many officers have not received training in how to handle such crises," the company wrote in its letter. "Irrespective of training, there is always a risk that a visit from law enforcement can create other negative outcomes for a student and their family."

In its privacy policy, GoGuardian states the company may disclose student information "if we believe in good faith that doing so is necessary or appropriate to comply with any law enforcement, legal or regulatory process."

(Chart: Center for Democracy and Technology)

Meanwhile, survey results suggest that student surveillance tools have a negative disparate impact on Black and Hispanic students, LGBTQ youth and those from low-income households. In a letter sent Wednesday to coincide with the survey's release, a coalition of education and civil rights groups called on the U.S. Department of Education to issue guidance warning schools that their digital surveillance practices could violate federal civil rights laws. Signatories include the American Library Association, the Data Quality Campaign and the American Civil Liberties Union.

"This is becoming a conversation not just about privacy, but about discrimination," Laird said. "Without a doubt, we see certain groups of students having outsized experiences in being directly targeted."

In a youth survey, researchers found that student discipline as a result of activity monitoring fell disproportionately along racial lines, with 48% of Black students and 55% of Hispanic students reporting that they or someone they knew got into trouble for something that was flagged by an activity monitoring tool. Just 41% of white students reported having similar experiences. 

Nearly a third of LGBTQ students said they or someone they know experienced nonconsensual disclosure of their sexual orientation or gender identity – often called outing – as a result of activity monitoring. LGBTQ youth were also more likely than straight and cisgender students to report getting into trouble at school and being contacted by the police about having committed a crime.

Some student surveillance companies, like Gaggle, monitor references to words including "gay" and "lesbian," a practice that founder and CEO Jeff Patterson has said was created to protect LGBTQ youth, who face a greater risk of dying by suicide. But survey results suggest the heightened surveillance comes with significant harm to youth, and Laird said that if monitoring tools are designed with certain students in mind, such as LGBTQ youth, that in itself is a form of discrimination.

(Chart: Center for Democracy and Technology)

In its letter to the Education Department's Office for Civil Rights Wednesday, advocates said the disparities outlined in the survey run counter to federal laws prohibiting race-, sex- and disability-based discrimination.

"Student activity monitoring is subjecting protected classes of students to increased discipline and interactions with law enforcement, invading their privacy, and creating hostile environments for students to express their true thoughts and authentic identities," the letter states.

The Education Department's civil rights division, they said, should condemn surveillance practices that violate students' civil rights and launch "enforcement action against violations that result in discrimination."

Lawmakers consider youth privacy

The report comes at a moment of increasing alarm about student privacy online. In May, the Federal Trade Commission announced plans to crack down on tech companies that sell student data for targeted advertising and that "illegally surveil children when they go online to learn."

It also comes at a time of intense concern over students' emotional and physical well-being. While the pandemic has led to a greater focus on youth mental health, the May mass school shooting in Uvalde, Texas, has sparked renewed school safety efforts. In June, President Joe Biden signed a law with modest new gun-control provisions and an influx of federal funding for student mental health care and campus security. The funds could lead to more digital student surveillance.

The results of the online survey, which was conducted in May and June, were likely colored by the Uvalde tragedy, researchers acknowledged. A majority of parents and students have a favorable view of student activity monitoring during school hours to protect kids from harming themselves or others, researchers found. But just 48% of parents and 30% of students support around-the-clock surveillance. 

"Schools are under a lot of pressure to find ways to keep students safe and, like in many aspects of our lives, they are considering the role of technology," Laird said.

Last week, the Senate advanced legislation designed to improve children's safety online, including new restrictions on youth-focused targeted advertising. The effort comes a year after revelations showing that the social media app Instagram had a harmful effect on youth mental well-being, especially teenage girls. One bill, the Kids Online Safety Act, would require tech companies to identify and mitigate any potential harms their products may pose to children, including exposure to content that promotes self-harm, eating disorders and substance abuse.

Yet the legislation has faced criticism from privacy advocates, who argue it would mandate digital monitoring similar to that offered by student surveillance companies. Among critics is the Electronic Frontier Foundation, a nonprofit focused on digital privacy and free speech. 

"The answer to our lack of privacy isn't more tracking," the group wrote. The legislation "is a heavy-handed plan to force technology companies to spy on young people and stop them from accessing content that is 'not in their best interest,' as defined by the government, and interpreted by tech platforms."

Attorney Amelia Vance, the founder and president of Public Interest Privacy Consulting, said she worries the provisions will have a negative impact on at-risk kids, including LGBTQ students. Students from marginalized groups, she said, "will now be more heavily surveilled by basically every site on the internet, and that information will be available to parents" who could discipline teens for researching LGBTQ content. She said the legislation could force tech companies to censor content to avoid potential liability, essentially making them arbiters of community standards.

"When you have conflicting values in the different jurisdictions that the companies operate in, oftentimes you end up with the most conservative interpretations, which right now is anti-LGBT," she said.

]]>
Minneapolis Schools to Halt Controversial Student Surveillance Initiative /article/minneapolis-schools-to-halt-controversial-student-surveillance-initiative/ Mon, 27 Jun 2022 19:56:23 +0000 /?post_type=article&p=692269 The Minneapolis school district has announced plans to end its relationship with Gaggle, a controversial digital surveillance tool that monitored students' online behaviors during pandemic-induced remote learning.

The announcement, which follows extensive reporting by The 74 about how the tool subjected the city's youth to pervasive round-the-clock digital surveillance, was outlined last week at the bottom of a newsletter alerting families to changes at the district. Gaggle, which uses artificial intelligence and human content moderators to track students' online activities and notify district officials of "inappropriate behaviors or potential threats to self or others," will no longer be used beginning July 1, the district announced.

A week after schools went remote in Minneapolis and nationally in March 2020, the district sidestepped typical procurement rules and used federal pandemic relief money to contract with Gaggle, a for-profit company that reported significant business growth when classes went online. The district has spent more than $355,000 on the tool, which monitors student behaviors on school-issued Google and Microsoft accounts, and has a contract with the company through September 2023. 

District officials said the tool saved lives but civil rights advocates and students targeted by the program have questioned its efficacy and accused the company of violating students鈥 privacy rights. 

In an email, district spokesperson Julie Schultz Brown said the change was "made in order to honor the terms of our new contract" with educators. Gaggle founder and CEO Jeff Patterson said the Minneapolis district will stop using the tool at a moment when "students across the United States are suffering." In June, the company alerted Minneapolis officials to 15 "critical incidents" related to suicide, death threats, violence and drug use, Patterson wrote in a statement. Nationally, the pandemic has led to a surge in youth mental health issues.

A recent report by Democratic Sens. Elizabeth Warren and Ed Markey warned that Gaggle and similar services could surveil students inappropriately, compound racial disparities in school discipline and waste tax dollars. Gaggle claims it saved lives during the 2020-21 school year, yet independent research on the tool's effectiveness doesn't exist.

Minneapolis student Teeth Logsdon-Wallace poses with his dog Gilly. (Photo courtesy Alexis Logsdon)

Teeth Logsdon-Wallace, a rising freshman in Minneapolis, saw the district's decision to cut ties with Gaggle as a major victory. He became an outspoken Gaggle critic after a homework assignment, which discussed a previous suicide attempt and how he learned important coping skills, got flagged by the tool's surveillance dragnet. Officials at Gaggle and the district said the tool helps identify students who are struggling emotionally and need adult intervention. But 14-year-old Logsdon-Wallace and other critics argue that digital surveillance is an inappropriate way to pinpoint students who need mental health care. Rather than helping, he said, the experience "felt violating and gross."

"When you're spying on kids and their stuff, especially about mental health stuff, they're just going to be more secretive about it," he said. "That can just cause more danger."

While Gaggle relies on technology to ferret out students with issues like depression, Logsdon-Wallace said that he and other students are more likely to share their mental health struggles with adults at school if there's a culture of trust. Monitoring communications through an algorithm and a team of low-paid remote workers whom the students don't even know, he said, had the opposite effect and left students more apprehensive about district computers, "which could be positive and negative."

While his peers learned how to better protect their own privacy online "even when it's inherently being violated," he said, he worried that some may have been "bottling up mental health issues because of it."

The district will no longer use Gaggle鈥檚 student activity monitoring tool or the company鈥檚 anonymous tip line, SpeakUp for Safety, which allows students to report potential safety threats confidentially. Instead of turning to SpeakUp, concerned parents and students should report issues to police officials with the state Bureau of Criminal Apprehension, the district wrote in its newsletter. 

District officials have said the anonymous tip line was central to their decision to contract with Gaggle, yet previous reporting by The 74 found that the service was rarely used. Meanwhile, the digital surveillance tool routinely flagged students who made references to sex, drugs and violence on district technology. An analysis of nearly 1,300 alerts found the service flagged Minneapolis students for discussing violent impulses, eating disorders, abuse at home and suicidal plans.

But Gaggle regularly flagged benign student chatter and personal files, including classroom assignments, casual conversations between teens and sensitive journal entries. Gaggle flags students who use keywords related to sexual orientation, including "gay" and "lesbian," and on at least one occasion school officials in Minneapolis outed an LGBT student to their parents. The sheer volume of student communications that got flagged by Gaggle was at times overwhelming, the Minneapolis school district's head of security acknowledged, but he also felt he was able to save students from dying by suicide.

In interviews with The 74, former content moderators at Gaggle – hundreds of whom are paid just $10 an hour on month-to-month contracts – raised serious questions about the company's efficacy, its employment practices and its effect on students' civil rights.

Moderators said they received little training before they were given access to students' sensitive materials and were pressured to prioritize speed over quality. They also reported insufficient safeguards to protect students' sensitive files, including nude selfies. Patterson acknowledged that moderators, who work remotely with little supervision or oversight, could easily save copies of students' nude photographs and share them on the dark web.

As a transgender teenager who believes the school district has done too little to address bullying, Logsdon-Wallace said he already had little trust in district leaders. While Gaggle didn't address the abuse from peers, having his sensitive experiences caught in the company's algorithm made the situation worse.

"The very little trust I had in the administration is just destroyed," he said. "You can't expect students to trust you if you've done nothing to earn that trust."

]]>
Gaggle Surveils Millions of Kids in the Name of Safety. Targeted Families Argue it's 'Not That Smart' /article/gaggle-surveillance-minnesapolis-families-not-smart-ai-monitoring/ Tue, 12 Oct 2021 11:15:00 +0000 /?post_type=article&p=578988 In the midst of a pandemic and a national uprising, Teeth Logsdon-Wallace was kept awake at night last summer by the constant sounds of helicopters and sirens.

For the 13-year-old from Minneapolis, who lives close to where George Floyd was murdered in May 2020, the pandemic-induced isolation and social unrest amplified his gender dysphoria, the emotional distress that occurs when someone's gender identity differs from their sex assigned at birth. His deepening depression landed him in the hospital after an attempt to die by suicide. During that dark stretch, he spent his days in an outpatient psychiatric facility, where therapists embraced music therapy. There, he listened to a punk song on loop that promised things would get better.

Eventually they did. 




Logsdon-Wallace, a transgender eighth-grader who chose the name Teeth, has since "graduated" from weekly therapy sessions and has found a better headspace, but that didn't stop school officials from springing into action after he wrote about his mental health. In a school assignment last month, he reflected on his suicide attempt and how the punk rock anthem by the band Ramshackle Glory helped him cope – intimate details that wound up in the hands of district security.

In a classroom assignment last month, Minneapolis student Teeth Logsdon-Wallace explained how the Ramshackle Glory song "Your Heart is a Muscle the Size of Your Fist" helped him cope after an attempt to die by suicide. In the assignment, which was flagged by the student surveillance company Gaggle, Logsdon-Wallace wrote that the song was "a reminder to keep on loving, keep on fighting and hold on for your life." (Photo courtesy Teeth Logsdon-Wallace)

The classroom assignment was one of thousands of Minneapolis student communications that got flagged by Gaggle, a digital surveillance company that saw rapid growth after the pandemic forced schools into remote learning. In an earlier investigation, The 74 analyzed nearly 1,300 public records from Minneapolis Public Schools to expose how Gaggle subjects students to relentless digital surveillance 24 hours a day, seven days a week, raising significant privacy concerns for more than 5 million young people across the country who are monitored by the company's digital algorithm and human content moderators.

But technology experts and families with first-hand experience with Gaggle's surveillance dragnet have raised a separate issue: The service is not only invasive, it may also be ineffective.

While the system flagged Logsdon-Wallace for referencing the word "suicide," context was never part of the equation, he said. Two days later, in mid-September, a school counselor called his mom to let her know what officials had learned. The meaning of the classroom assignment – that his mental health had improved – was seemingly lost in the transaction between Gaggle and the school district. He felt betrayed.

"I was trying to be vulnerable with this teacher and be like, 'Hey, here's a thing that's important to me because you asked,'" Logsdon-Wallace said. "Now, when I've made it clear that I'm a lot better, the school is contacting my counselor and is freaking out."

Jeff Patterson, Gaggle's founder and CEO, said in a statement his company does not "make a judgement on that level of the context," and while some districts have requested to be notified about references to previous suicide attempts, it's ultimately up to administrators to "decide the proper response, if any."

'A crisis on our hands'

Minneapolis Public Schools first contracted with Gaggle in the spring of 2020 as the pandemic forced students nationwide into remote learning. Through AI and the content moderator team, Gaggle tracks students' online behavior every day by analyzing materials on their school-issued Google and Microsoft accounts. The tool scans students' emails, chat messages and other documents, including class assignments and personal files, in search of keywords, images or videos that could indicate self-harm, violence or sexual behavior. The remote moderators evaluate flagged materials and notify school officials about content they find troubling.

In Minneapolis, Gaggle flagged students for keywords related to pornography, suicide and violence, according to six months of incident reports obtained by The 74 through a public records request. The private company also captured their journal entries, fictional stories and classroom assignments.

Gaggle executives maintain that the system saves lives, including those of students flagged during the 2020-21 school year. Those figures have not been independently verified. Minneapolis school officials make similar assertions. Though the pandemic's effects on suicide rates remain unclear, suicide has been a leading cause of death among teenagers for years. Patterson, who has watched his business grow during COVID-19, said Gaggle could be part of the solution. Though not part of its contract with Minneapolis schools, the company recently launched a service that connects students flagged by the monitoring tool with teletherapists.

"Before the pandemic, we had a crisis on our hands," he said. "I believe there's a tsunami of youth suicide headed our way that we are not prepared for."

Schools nationwide have increasingly relied on technological tools that purport to keep kids safe, yet there's little independent evidence to back up their claims.

Minneapolis student Teeth Logsdon-Wallace poses with his dog Gilly. (Photo courtesy Alexis Logsdon)

Like many parents, Logsdon-Wallace's mother, Alexis Logsdon, didn't know Gaggle existed until she got the call from his school counselor. Luckily, the counselor recognized that Logsdon-Wallace was discussing events from the past and offered a measured response. His mother was still left baffled.

"That was an example of somebody describing really good coping mechanisms, you know, 'I have music that is one of my soothing activities that helps me through a really hard mental health time,'" she said. "But that doesn't matter because, obviously, this software is not that smart – it's just like 'Woop, we saw the word.'"

'Random and capricious'

Many students have accepted digital surveillance as an inevitable reality at school, according to a new survey by the Center for Democracy and Technology in Washington, D.C. But some youth are fighting back, including Lucy Dockter, a 16-year-old junior from Westport, Connecticut. On multiple occasions over the last several years, Gaggle has flagged her communications – an experience she described as "really scary."

"If it works, it could be extremely beneficial. But if it's random, it's completely useless."
Lucy Dockter, 16, Westport, Connecticut student mistakenly flagged by Gaggle

On one occasion, Gaggle sent her an email notification of "Inappropriate Use" while she was walking to her first high school biology midterm; her heart began to race as she worried about what she had done wrong. Dockter is an editor of her high school's literary journal and, according to her, Gaggle had ultimately flagged profanity in students' fictional article submissions.

"The link at the bottom of this email is for something that was identified as inappropriate," Gaggle warned in its email while pointing to one of the fictional articles. "Please refrain from storing or sharing inappropriate content in your files."

Gaggle emailed a warning to Connecticut student Lucy Dockter for profanity in a literary journal article. (Photo courtesy Lucy Dockter)

But Gaggle doesn't catch everything. Even as she got flagged when students shared documents with her, the articles' authors weren't receiving similar alerts, she said. Nor did Gaggle's AI pick it up when she wrote about the discrepancy in an article where she included a four-letter swear word to make a point. In the article, which Dockter wrote with Google Docs, she argued that Gaggle's monitoring system is "random and capricious," and could be dangerous if school officials rely on its findings to protect students.

Her experiences left the Connecticut teen questioning whether such tracking is even helpful. 

"With such a seemingly random service, that doesn't seem to – in the end – have an impact on improving student health or actually taking action to prevent suicide and threats," she said in an interview. "If it works, it could be extremely beneficial. But if it's random, it's completely useless."


Some schools have asked Gaggle to email students about the use of profanity, but Patterson said the system has an error that he blamed on the tech giant Google, which at times "does not properly indicate the author of a document and assigns a random collaborator."

"We are hoping Google will improve this functionality so we can better protect students," Patterson said.

Back in Minneapolis, attorney Cate Long said she became upset when she learned that Gaggle was monitoring her daughter on her personal laptop, which 10-year-old Emmeleia used for remote learning. She grew angrier when she learned the district didn't notify her that Gaggle had identified a threat.

This spring, a classmate used Google Hangouts, the chat feature, to send Emmeleia a death threat, warning she'd shoot her "puny little brain with my grandpa's rifle."

Minneapolis mother Cate Long said a student used Google Hangouts to send a death threat to her 10-year-old daughter Emmeleia. Officials never informed her about whether Gaggle had flagged the threat. (Photo courtesy Cate Long)

When Long learned about the chat, she notified her daughter's teacher but was never informed about whether Gaggle had picked up on the disturbing message as well. Missing warning signs could be detrimental to both students and school leaders; districts can face legal liability if they fail to act on credible threats.

"I didn't hear a word from Gaggle about it," she said. "If I hadn't brought it to the teacher's attention, I don't think that anything would have been done."

The incident, which occurred in April, fell outside the six-month period for which The 74 obtained records. A Gaggle spokesperson said the company picked up on the threat and notified district officials an hour and a half later but it "does not have any insight into the steps the district took to address this particular matter."

Julie Schultz Brown, the Minneapolis district spokeswoman, said that officials "would never discuss with a community member any communication flagged by Gaggle."

"That unrelated but concerned parent would not have been provided that information nor should she have been," she wrote in an email. "That is private."

Cate Long poses with her 10-year-old daughter Emmeleia. (Photo courtesy Cate Long)

'The big scary algorithm'

When identifying potential trouble, Gaggle's algorithm relies on keyword matching that compares student communications against a dictionary of thousands of words the company believes could indicate potential issues. The company scans student emails before they're delivered to their intended recipients, said Patterson, the CEO. Files within Google Drive, including Docs and Sheets, are scanned as students write in them, he said. In one instance, the technology led to the arrest of a 35-year-old Michigan man who tried to send pornography to an 11-year-old girl in New York. Gaggle prevented the file from ever reaching its intended recipient.

Though the company allows school districts to alter the keyword dictionary to reflect local contexts, less than 5 percent of districts customize the filter, Patterson said. 
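The kind of dictionary-based keyword matching described here can be illustrated with a short, purely hypothetical sketch. The word list, categories, helper name and sample message below are invented for illustration; they are not Gaggle's actual dictionary, code or API.

```python
# Hypothetical sketch of context-free keyword flagging. The dictionary
# and categories are invented; real systems use thousands of terms.
KEYWORDS = {
    "suicide": "self-harm",
    "kill": "violence",
}

def flag(text: str) -> list[tuple[str, str]]:
    """Return (keyword, category) pairs found in the text.

    Matching is purely lexical: a student describing recovery from a
    suicide attempt is flagged exactly like one describing a plan.
    """
    words = text.lower().split()
    return [(w, KEYWORDS[w]) for w in words if w in KEYWORDS]

# A benign classroom assignment about coping still trips the filter:
essay = "this song helped me cope after my suicide attempt last year"
print(flag(essay))  # [('suicide', 'self-harm')]
```

Because the match is word-for-word with no model of context, an essay about recovery produces the same alert as a message describing a plan, which is the false-positive pattern families quoted in these stories describe.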

That's where potential problems could begin, said Sara Jordan, an expert on artificial intelligence and senior researcher at the Future of Privacy Forum in Washington. For example, language that students use to express suicidal ideation could vary between Manhattan and rural Appalachia, she said.

"We're using the big scary algorithm term here when I don't think it applies. This is not Netflix's recommendation engine. This is not Spotify."
Sara Jordan, AI expert and senior researcher, Future of Privacy Forum


On the other hand, she noted that false positives are highly likely, especially when the system flags common swear words and fails to understand context.

"You're going to get 25,000 emails saying that a student dropped an F-bomb in a chat," she said. "What's the utility of that? That seems pretty low."

She said that Gaggle's utility could be impaired because it doesn't adjust to students' behaviors over time, comparing it to Netflix, which recommends television shows based on users' ever-evolving viewing patterns. "Something that doesn't learn isn't going to be accurate," she said. For example, she said the program could be more useful if it learned to ignore the profane but harmless literary journal entries submitted to Dockter, the Connecticut student. Gaggle's marketing materials appear to overhype the tool's sophistication to schools, she said.
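The feedback loop Jordan says is missing can be sketched in a few lines. This is a hypothetical design, not anything Gaggle has implemented: the threshold, function name and source labels are invented. The idea is that once moderators mark several flags from the same source (say, a literary journal inbox) as benign, later flags from that source stop being surfaced to school officials.

```python
from collections import defaultdict

# Hypothetical moderator-feedback loop; the threshold and structure
# are invented for illustration, not drawn from any real product.
BENIGN_THRESHOLD = 3  # benign verdicts before a source is muted

benign_counts: dict[str, int] = defaultdict(int)

def surface_next_flag(source: str, moderator_said_benign: bool) -> bool:
    """Record a moderator verdict for this source and report whether
    future flags from it should still reach school officials."""
    if moderator_said_benign:
        benign_counts[source] += 1
    else:
        benign_counts[source] = 0  # a real concern resets the counter
    return benign_counts[source] < BENIGN_THRESHOLD

# Three benign verdicts on literary-journal submissions mute the source:
for _ in range(3):
    keep = surface_next_flag("literary-journal", moderator_said_benign=True)
print(keep)  # False
```

Even a trivial counter like this "learns" in the narrow sense Jordan describes: repeated benign outcomes change future behavior, whereas a static keyword dictionary produces the same alert forever.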

"We're using the big scary algorithm term here when I don't think it applies," she said. "This is not Netflix's recommendation engine. This is not Spotify. This is not American Airlines serving you specific forms of flights based on your previous searches and your location."

"Artificial intelligence without human intelligence ain't that smart."
Jeff Patterson, Gaggle founder and CEO

Patterson said Gaggle's proprietary algorithm is updated regularly "to adjust to student behaviors over time and improve accuracy and speed." The tool monitors "thousands of keywords, including misspellings, slang words, evolving trends and terminologies, all informed by insights gleaned over two decades of doing this work."

Ultimately, the algorithm to identify keywords is used to "narrow down the haystack as much as possible," Patterson said, and Gaggle content moderators review materials to gauge their risk levels.

"Artificial intelligence without human intelligence ain't that smart," he said.

In Minneapolis, officials denied that Gaggle infringes on students' privacy and noted that the tool only operates within school-issued accounts. The district's internet use policy states that students should "expect only limited privacy," and that the misuse of school equipment could result in discipline and "civil or criminal liability." District leaders have also cited compliance with the Clinton-era Children's Internet Protection Act, which became law in 2000 and requires schools to monitor "the online activities of minors."

Patterson suggested that teachers aren't paying close enough attention to keep students safe on their own and "sometimes they forget that they're mandated reporters." On the company's website, Patterson says he launched the company in 1999 to provide teachers with "an easy way to watch over their gaggle of students." Legally, teachers are mandated to report suspected abuse and neglect, but Patterson broadens their sphere of responsibility and his company's role in meeting it. As technology becomes a key facet of American education, Patterson said that schools "have a moral obligation to protect the kids on their digital playground."

But Elizabeth Laird, the director of equity in civic technology at the Center for Democracy and Technology, argued the federal law was never intended to mandate student "tracking" through artificial intelligence. In fact, the statute includes a disclaimer stating it shouldn't be "construed to require the tracking of internet use by any identifiable minor or adult user." Her group has urged the government to clarify the Children's Internet Protection Act's requirements and distinguish monitoring from tracking individual student behaviors.

Sen. Elizabeth Warren, a Democrat from Massachusetts, agrees. In recent letters to Gaggle and other education technology companies, Warren and other Democratic lawmakers said they're concerned the tools "may extend beyond" the law's intent "to surveil student activity or reinforce biases." Around-the-clock surveillance, they wrote, demonstrates "a clear invasion of student privacy, particularly when students and families are unable to opt out."

"Escalations and mischaracterizations of crises may have long-lasting and harmful effects on students' mental health due to stigmatization and differential treatment following even a false report," the senators wrote. "Flagging students as 'high-risk' may put them at risk of biased treatment from physicians and educators in the future. In other extreme cases, these tools can become analogous to predictive policing, which are notoriously biased against communities of color."

A new kind of policing

Shortly after Minneapolis Public Schools piloted Gaggle for distance learning, education leaders were met with an awkward dilemma. George Floyd's murder at the hands of a Minneapolis police officer prompted the district to sever its ties with the police department for school-based officers and replace them with district security officers who lack the authority to make arrests. Gaggle flags district security when it identifies student communications the company believes could be harmful.

Some critics have compared the surveillance tool to a new form of policing that, beyond broad efficacy concerns, could have a disparate impact on students of color, similar to traditional policing. Algorithms, too, have been shown to suffer biases.

Matt Shaver, who taught at a Minneapolis elementary school during the pandemic but no longer works for the district, said he was concerned that racial bias could be baked into Gaggle's algorithm. Absent adequate context or nuance, he worried, the tool could lead to misunderstandings.

Data obtained by 社区黑料 offer a limited window into Gaggle's potential effects on different student populations. Though the district withheld many details in the nearly 1,300 incident reports, just over 100 identified the campuses where the involved students attended school. An analysis of those reports failed to identify racial discrepancies. Specifically, Gaggle was about as likely to issue incident reports in schools where children of color were the majority as it was at campuses where most children were white. It remains possible that students of color in predominantly white schools may have been disproportionately flagged by Gaggle or faced disproportionate punishment once identified. Broadly speaking, Black students are far more likely to be suspended or arrested at school than their white classmates, according to federal education data.

Gaggle and Minneapolis district leaders acknowledged that students' digital communications are forwarded to police in rare circumstances. The Minneapolis district's internet use policy explains that educators could contact the police if students use technology to break the law, and a document given to teachers about the district's Gaggle contract further highlights the possibility of law enforcement involvement.

Jason Matlock, the Minneapolis district's director of emergency management, safety and security, said that law enforcement is not a "regular partner" in responding to incidents flagged by Gaggle. The district doesn't deploy Gaggle to get kids into trouble, he said, but to get them help. He said the district has interacted with law enforcement about student materials flagged by Gaggle on several occasions, but only in cases related to child pornography. Such cases, he said, often involve students sharing explicit photographs of themselves. During a six-month period from March to September 2020, Gaggle flagged Minneapolis students more than 120 times for incidents related to child pornography, according to records obtained by 社区黑料.

Jason Matlock, the director of emergency management, safety and security at the Minneapolis school district, discusses the decision to partner with Gaggle as students moved to remote learning during the pandemic. (Screenshot)

"Even if a kid has put out an image of themselves, no one is trying to track them down to charge them or to do anything negative to them," Matlock said, though it's unclear if any students have faced legal consequences. "It's the question as to why they're doing it," and to raise the issue with their parents.

Gaggle's keywords could also have a disproportionate impact on LGBTQ children. In three dozen incident reports, Gaggle flagged keywords related to sexual orientation, including "gay" and "lesbian." On at least one occasion, school officials outed an LGBTQ student to their parents.

Logsdon-Wallace, the 13-year-old student, called the incident "disgusting and horribly messed up."

"They have gay flagged to stop people from looking at porn, but one, that is going to be mostly targeting people who are looking for gay porn and two, it's going to be false-positive because they are acting as if the word gay is inherently sexual," he said. "When people are just talking about being gay, anything they're writing would be flagged."

The service could also have a heavier presence in the lives of low-income families, he added, who may end up being more surveilled than their affluent peers. Logsdon-Wallace said he knows students who rely on school devices for personal uses because they lack technology of their own. Among the 1,300 Minneapolis incidents contained in 社区黑料's data, only about a quarter were reported to district officials on school days between 8 a.m. and 4 p.m.

"That's definitely really messed up, especially when the school is like 'Oh no, no, no, please keep these Chromebooks over the summer,'" an invitation that gave students "the go-ahead to use them" for personal reasons, he said.

"Especially when it's during a pandemic when you can't really go anywhere and the only way to talk to your friends is through the internet."

Dems Warn School Surveillance Tools Could Compound 'Risk of Harm for Students'

Mon, 04 Oct 2021 (Updated, Oct. 5)

A group of Democratic lawmakers has demanded that several education technology companies that monitor children online explain their business practices, arguing that around-the-clock digital surveillance demonstrates "a clear invasion of student privacy, particularly when students and families are unable to opt out."

In letters sent to the companies last week, Democratic Sens. Elizabeth Warren, Ed Markey and Richard Blumenthal asked them to explain steps they're taking to ensure the tools aren't "unfairly targeting students and perpetuating discriminatory biases," and comply with federal laws. The letters went to executives at Gaggle, Securly, GoGuardian and Bark Technologies, each of which uses artificial intelligence to analyze students' online activities and identify behaviors the companies believe could be harmful.




"Education technology companies have developed software that are advertised to protect student safety, but may instead be surveilling students inappropriately, compounding racial disparities in school discipline and draining resources from more effective student supports," the lawmakers wrote in the letters. Though the tools are marketed as student safety solutions, and grew rapidly as schools shifted to remote learning during the pandemic, there is scant research on their effectiveness. Some critics, including the lawmakers, argue they may do more harm than good. "The use of these tools may break down trust within schools, prevent students from accessing critical health information and discourage students from reaching out to adults for help, potentially increasing the risk of harm for students," the senators wrote.

The letters cited a recent investigation by 社区黑料, which outlined how Gaggle's AI-driven surveillance tool and human content moderators subject children to relentless digital surveillance long after classes end for the day, including on weekends, holidays, late at night and over the summer. In Minneapolis, the company notified school security when it identified students who made references to suicide, self-harm and violence. But it also analyzed students' classroom assignments, journal entries, chats with friends and fictional stories.

Each of the companies offers differing levels of remote student surveillance. Gaggle, for example, analyzes emails, chat messages and digital files on students' school-issued Google and Microsoft accounts. Other services monitor students' social media accounts and web browsing history, among other activities.

The letters were particularly critical of the tools' capacity to track student behaviors 24/7, including when students are at home, and their ability to monitor students on their personal devices in some cases.

Schools' use of digital monitoring tools has become commonplace in recent years. More than 80 percent of teachers reported using the tools, according to a recent survey by the Center for Democracy and Technology. Among those who participated in the survey, nearly a third reported that they monitor student activity at all hours of the day and just a quarter said it was limited to school hours.

"Because of the lack of transparency, many students and families are unaware that nearly all of their children's online behavior is being tracked," according to the letters. "When students and families are aware, they are often unable to opt out because school-issued devices are given to students with the software already installed, and many students rely on these devices for remote or at-home learning."

A Securly spokesperson said in an email the company is "reviewing the correspondence received" from the lawmakers and is in the process of responding to their requests for information. He said the company is "deeply committed to continuously evolving our technology" to help schools protect students online. A Gaggle spokesperson said the company appreciates the lawmakers' interest in learning how the tool "serves as an early warning system to help school districts prevent tragedies such as suicide, acts of violence, child pornography and other dangerous situations." A GoGuardian spokesman said the company cares "deeply about keeping students safe and protecting their privacy."

Bark officials didn't respond to requests for comment.

The Clinton-era Children's Internet Protection Act, passed in 2000, requires schools to filter and monitor students' internet use to ensure they aren't accessing material that is "harmful to minors," such as pornography. Student privacy advocates have long argued that a newer generation of AI-driven tools goes beyond the law's scope and have urged federal officials to clarify its requirements. The law includes a disclaimer noting that it does not "require the tracking of internet use by any identifiable minor or adult user." It "remains an open question" whether schools' use of digital tools to monitor students at home violates Fourth Amendment protections against unreasonable searches and seizures, according to an analysis by the Future of Privacy Forum.

In their letters, senators highlighted how digital surveillance tools could perpetuate several educational inequities. For example, the tools could have a disproportionate impact on students of color and further uphold longstanding racial disparities in student discipline.

"School disciplinary measures have a long history of disproportionately targeting students of color, who face substantially more punitive discipline than their white peers for equivalent offenses," according to the letters. "These disciplinary records, even when students are cleared, may have life-long harmful consequences for students."

Meanwhile, the tools may have a larger impact on low-income students who rely on school technology to access the internet than those who can afford personal computers. Elizabeth Laird, the director of equity in civic technology at the Center for Democracy and Technology, said their research "revealed a worrisome lack of transparency" around how these educational technology companies track students online and how schools rely on their tools.

"Responses to this letter will help shine a light on these tools and strategies to mitigate the risks to students, especially those who are most reliant on school-issued devices," she said in an email.

Report: Most Parents, Teachers Support Student Surveillance Tech

Tue, 21 Sep 2021

Tools that monitor students' online behavior have become ubiquitous in U.S. schools, and grew rapidly as the pandemic closed campuses nationwide, but a majority of parents and teachers believe the benefits of such digital surveillance outweigh the risks, a new report finds.

Similarly, half of students said they are comfortable with schools鈥 use of monitoring software while a quarter reported feeling queasy about the idea, according to the new research by the Center for Democracy and Technology, a nonprofit group based in Washington, D.C. Despite their overall comfort with digital software, teachers, parents and students each worried about how the tools could have detrimental side effects. Specifically, many parents and teachers were concerned that digital surveillance could be used to discipline students and young people reported becoming more reserved when they knew they were being watched.




"In response to the pandemic, the focus on technology and its use has never been greater," said report co-author Elizabeth Laird, the center's director of equity in civic technology. As tech gains a greater grasp on education, she said, it's important for school leaders and policymakers to remain focused on protecting students' individual rights. She worried that student surveillance technology could have a damaging impact on students, especially youth of color and those from low-income households.

"I don't think it's a slam dunk," Laird said.

Though the report didn't highlight specific tools used, schools deploy a range of digital monitoring software to track student activity, including programs that block online material deemed inappropriate, track when students log into school applications, and allow teachers to view students' screens in real time and even take control of their computers.

Last week, an investigative report by 社区黑料 exposed how the Minneapolis school district's use of the digital surveillance tool Gaggle had subjected children to relentless online surveillance long after classes ended for the day, including inside students' homes. Through artificial intelligence and a team of content moderators, Gaggle tracks the online behaviors of millions of students across the U.S. every day by sifting through data stored on their school-issued Google and Microsoft accounts. In Minneapolis, the company alerted school security when moderators believed students could harm themselves or others, but it also picked up students' classroom assignments, journal entries, chats with friends and fictional stories.

Among teachers surveyed by the Center for Democracy and Technology, 81 percent said their schools use software that tracks students' computer activity, including to block obscene material, monitor students' screens in real time and prohibit students from using websites unrelated to school, like YouTube. A majority of both parents and students reported such tools were used in their schools, but they were also more likely than teachers to be unsure about whether youth were being actively monitored by educators. In interviews with administrators, researchers found that many school leaders weren't sure how best to be transparent with families about their monitoring practices.

"Certainly there is an imbalance in information and transparency around what is happening," Laird said. "School districts have been clear [that] students shouldn't have an expectation of privacy, but they haven't been as clear about what they are tracking, how they are tracking it, how long they keep that information. They really should be doing that."

Four-fifths of surveyed teachers said their schools used digital tools to track students online. Both parents and students were more likely than teachers to be unsure whether such tools were in use in their schools. (Photo courtesy Center for Democracy and Technology)

Among teachers, 66 percent said the benefits of activity monitoring outweigh student privacy concerns and 62 percent of parents reached a similar conclusion. Meanwhile, 78 percent of teachers reported that digital surveillance helps keep students safe by identifying problematic online behaviors and 72 percent said it helps keep students on task. But their answers also revealed equity concerns: 71 percent of teachers reported that monitoring software is applied to all students equally, 51 percent worried that it could come with unintended consequences like "outing" LGBTQ students and 49 percent said it violates students' privacy.

Many teachers reported that such monitoring tools are used on students long after classes end for the day. In total, 30 percent of educators said the tools are active "all of the time," and 16 percent said the software tracks kids on their personal devices.

Nearly a third of teachers who reported their schools use digital services like Gaggle to track students online said the tools monitor youth behaviors 24 hours a day. (Photo by Center for Democracy and Technology)

Among parents, 75 percent said digital surveillance helps keep students safe and 73 percent said it ensures children remain focused on schoolwork. Yet many parents also reported potential downsides: 61 percent worried about long-term harm if the tools were used to discipline students, 51 percent were concerned about unintended consequences and 49 percent said the tools violate students' privacy rights.

Perhaps unsurprisingly, students were less at ease with educators watching their online behaviors. Half said they were comfortable with monitoring tools, a quarter said they were uncomfortable with them and another quarter were unsure.

The data also suggest that students alter their behaviors as a result of being watched: 58 percent said they don't share their true thoughts or ideas online as a result of being monitored at school and 80 percent said they were more careful about what they search online. While just 39 percent of students said it was unfair that educators monitored their school-issued services, 74 percent opposed the surveillance of their own devices, like their cell phones. Some monitoring services are among those that could track students' behaviors on their own technology.

The data raise significant equity concerns. For many students, school-issued devices are their only method of connectivity.

"The privacy and security of personal devices is a luxury not all can afford," Alexandra Givens, the center's president and CEO, said in a press release. "Constant online monitoring, especially of students who cannot afford or don't have access to personal devices, risks creating disparities in the ways student privacy is protected nationwide."

To reach its findings, researchers conducted online surveys in June that were completed by 1,001 teachers, 1,663 parents and 420 high school students. Researchers also conducted interviews with school administrators to understand their motives in deploying digital surveillance. Among the justifications is a federal law that requires schools to monitor students online. But the law also includes a disclaimer noting that the statute does not "require the tracking of internet use by any identifiable minor or adult user."

Understanding context is critical, Laird said, adding that the law's authors hadn't fully envisioned a world where students could be surveilled by artificial intelligence long after classes end for the day.

"What was happening at the time was students were in a school computer lab for part of the day and monitoring meant having an adult walking around a computer lab and physically looking at what was on students' computer monitors," she said. But today, she said, the statute is being interpreted very differently.

In response, the center, along with the American Civil Liberties Union and the Center for Learner Equity, sent a letter to federal officials Tuesday urging them to clarify the law's stipulations and inform educators that it "does not require broad, invasive and constant surveillance of students' lives online."

"Systemic monitoring of online activity can reveal sensitive information about students' personal lives, such as their sexual orientation, or cause a chilling effect on their free expression, political organizing, or discussion of sensitive issues such as mental health," the letter continued. "These harms likely fall disproportionately on already vulnerable, over-policed and over-disciplined communities."
