This is what high school teachers see when they open GoGuardian, a popular software application used to monitor student activity: an interface that feels familiar, like the gallery view of a large Zoom call. But instead of teenage faces in each frame, the teacher sees thumbnail images of each student’s laptop screen. They watch as students’ cursors skim across the lines of a sonnet, or as the word “chlorofluorocarbon” is painstakingly typed into a search bar.
If a student is enticed by a distraction—an online game, a stunt video—the teacher can see that too and can remind the student to stay on task via a private message sent through GoGuardian. If this student has veered away from the assignment a few too many times, the teacher can take remote control of the device and zap the tab themselves.

Student-monitoring software has come under renewed scrutiny over the course of the Covid-19 pandemic.
When students in the US were forced to continue their schooling virtually, many brought home school-issued devices. Baked into these machines was software that can allow teachers to view and control students’ screens, use AI to scan text from student emails and cloud-based documents, and, in severe cases, send alerts about potential violent threats or mental health harms to educators and local law enforcement after school hours. Now that the majority of American students are finally going back to school in person, the surveillance software that proliferated during the pandemic will stay on their school-issued devices, where it will continue to watch them.
According to a report published today by the Center for Democracy and Technology (CDT), 89 percent of teachers say that their schools will continue using student-monitoring software, up 5 percentage points from last year. At the same time, the overturning of Roe v. Wade has led to new concerns about digital surveillance in states that have made abortion care illegal.
Proposals targeting LGBTQ youth, such as the Texas governor’s calls to investigate the families of kids seeking gender-affirming care, raise additional worries about how data collected through school-issued devices might be weaponized when students return in September.

The CDT report also reveals how monitoring software can shrink the distance between classrooms and carceral systems. Forty-four percent of teachers reported that at least one student at their school has been contacted by law enforcement as a result of behaviors flagged by the monitoring software.
And 37 percent of teachers who say their school uses activity monitoring outside of regular hours report that such alerts are directed to “a third party focused on public safety” (e.g., a local police department or immigration enforcement).
“Schools have institutionalized and routinized law enforcement’s access to students’ information,” says Elizabeth Laird, the director of equity in civic technology at the CDT. US senators Elizabeth Warren and Ed Markey have recently raised concerns about the software’s facilitation of contact with law enforcement, suggesting that the products may also be used to criminalize students who seek reproductive health resources on school-issued devices. The senators have sought responses from four major monitoring companies: GoGuardian, Gaggle, Securly, and Bark for Schools, which together reach thousands of school districts and millions of American students.
Widespread concerns about teen mental health and school violence lend a grim backdrop to the back-to-school season. After the mass shooting at an elementary school in Uvalde, Texas, Congress passed a law directing $300 million to help schools strengthen their security infrastructure. Monitoring companies speak to educators’ fears, often touting their products’ ability to zero in on would-be student attackers.
Securly’s website offers educators “AI-powered insight into student activity for email, Google Drive, and Microsoft OneDrive files.” It invites them to “approach student safety from every angle, across every platform, and identify students who may be at risk of harming themselves or others.”

Before the Roe decision brought more attention to the risks of digital surveillance, lawmakers and privacy advocates were already concerned about student-monitoring software.
In March 2022, an investigation led by senators Warren and Markey found that the four aforementioned companies—which sell digital student-monitoring services to K-12 schools—raised “significant privacy and equity concerns.” The investigation pointed out that low-income students (who tend to be disproportionately Black and Hispanic) rely more heavily on school devices and are exposed to more surveillance than affluent students; it also uncovered that schools and companies were often not required to disclose the use and extent of their monitoring to students and parents. In some cases, districts can opt to have a company send alerts directly to law enforcement instead of a school contact.
Students are often unaware that their AI hall monitors are imperfect and can be misused. An investigation by The 74 found that Gaggle would send students warning emails over harmless content, like profanity in a fiction submission to the school literary magazine. One high school newspaper reported that the district used monitoring software to reveal a student’s sexuality and out the student to their parents.
(Today’s CDT report revealed that 13 percent of students knew someone who had been outed as a result of student-monitoring software.) A Texas student newspaper’s editorial board argued that their school’s use of the software might prevent students from seeking mental health support.

Also disquieting are the accounts of monitoring software breaching students’ after-school lives. One associate principal I spoke to for this story says his district would receive “Questionable Content” email alerts from Gaggle about pornographic photos and profanities from students’ text messages. But the students weren’t texting on their school-issued Chromebooks.
When administrators investigated, they learned that while teens were home, they would charge their phones by connecting them to their laptops via USB cables. The teens would then proceed to have what they believed to be private conversations via text, in some cases exchanging nude photos with significant others—all of which the Gaggle software running on the Chromebook could detect. Now the school advises students not to plug their personal devices into their school-issued laptops.
This pervasive surveillance has always been disconcerting to privacy advocates, but the criminalization of reproductive health care in some states makes those problems more acute. It’s not difficult to envision a student who lives in a state where ending a pregnancy is illegal using a search engine to find out-of-state abortion clinics, or chatting online with a friend about an unplanned pregnancy. If monitoring software flagged that activity, teachers and administrators could take it upon themselves to inform the student’s parents or local law enforcement.
So could the monitoring algorithm scan directly for students who type “abortion clinic near me” or “gender-affirming care” and trigger an alert to educators or the police? Paget Hetherington, the vice president of marketing at Gaggle, says that Gaggle’s dictionary of keywords does not scan for words and phrases related to abortion, reproductive health care, or gender-affirming health care. Districts can, to an extent, ask Gaggle to customize and localize which keywords are flagged by the algorithm. When WIRED asked whether a district could request that Gaggle specifically track words related to reproductive or gender-affirming health care, Hetherington replied, “It’s possible that a school district in one of these states could potentially ask us to track some of these words and phrases, and we will say no.”
When reached for comment, GoGuardian directed us to the following statement: “As a company committed to creating safer learning environments for all students, GoGuardian continually evaluates our product frameworks and their implications for student data privacy. We are currently reviewing the letter we received from Senators Warren and Markey and will be providing a response.” Bark for Schools initially agreed to speak with us, then went silent.
WIRED sent additional requests for comment and received no reply. Securly did not respond to requests for comment at all.

Even if student-monitoring algorithms don’t actively scan for content related to abortion or gender-affirming care, the sensitive student information they’re privy to can still get kids in trouble with police.
“It is hardly a stretch to believe that school districts would be compelled to use the information that they collect to ensure enforcement of state law,” says Doug Levin, national director of the K12 Security Information Exchange, a nonprofit focused on protecting schools from emerging cybersecurity risks. Schools can and do share student data with law enforcement. In 2020, The Boston Globe reported that information about Boston Public School students was shared on over 100 occasions with an intelligence group based in the city’s police department, exposing the records of the district’s undocumented students and putting those students at greater risk of deportation.
When it comes to safeguarding the privacy of students’ web searches and communications, Levin says current federal protections are insufficient. The primary federal law governing the type and amount of student data that companies can slurp up is the Family Educational Rights and Privacy Act (FERPA). While FERPA has been updated a handful of times since it passed in 1974, Levin says it hasn’t kept pace with the technology that shapes reality for schools and students in 2022.
The current national privacy bill in Congress (which might, in other respects, actually be good) won’t do anything for most students either, as it excludes public institutions such as public schools and vendors that handle student data.

For teachers, the value of remote monitoring can be significant. Stacy Recker, a high school social studies teacher in the Cincinnati area, says GoGuardian was “invaluable” during the pandemic.
She used the software to provide remote support for students who struggled with the technical demands of remote learning. Now that her students have returned to the classroom, she continues to use GoGuardian to help her kids stay off YouTube and focus on a lesson on W. E. B. Du Bois. At the time of WIRED’s interview, she was not aware of the alerting system that claims to detect a student’s risk of self-harm or harm to others, a service GoGuardian offers as a separate product.
Educators are shouldering the unprecedented responsibility of helping students recover from two extremely disruptive years while providing mental health support in the wake of campus tragedies. The monitoring companies’ websites share stories of their products flagging students’ expressions of suicidal ideation, with testimonials from teachers who credit the software with helping them intervene in the nick of time. Especially after school shootings, educators are understandably fearful.
But the evidence that monitoring software actually helps prevent violence is scant. Privacy advocates would also argue that forcing schools to weigh surveillance against safety perpetuates a false choice. “Surveillance always comes with inherent forms of abuse,” says Evan Greer, the director of the nonprofit Fight for the Future.
“There are other ways to support and protect kids that don’t.” Some educators would agree.

When the Columbine shooting shook American schools in 1999, Lee Ann Wentzel was an assistant principal at Ridley Public Schools in Pennsylvania.
She remembers how her school scrambled to come up with new safety protocols, like issuing ID badges. When she became superintendent in 2010, Wentzel helped design a rigorous student privacy rubric against which her district could measure any software it would use with students. The rubric included items like whether students’ data would be properly disposed of and whether it would be shared with other parties.
Her district does not use GoGuardian, Gaggle, Securly, or Bark for Schools. She’s wary of the promises student-monitoring companies make. “Those systems provide A) a false sense of security, and B) it kills the curiosity that you want to inspire in learning,” she says.
“If you’re going to rely on a technology system to tell you a kid’s unhappy, that’s concerning to me because you’re not developing relationships with kids who are in front of you.” As to the companies’ claims about bolstering safety and anticipating school violence, she says, “There’s no single answer to these issues. Anyone that promises, ‘We’re gonna be able to predict that sort of thing’—no. You’re not.”