As U.S. children study online, apps watch their every move
For New York teacher Michael Flanagan, the pandemic brought a flood of new technology: a rush of laptops for stay-at-home students and school life conducted over the Internet.
Students have long since returned to school, but the technology has lived on, and with it a new generation of apps that monitor students online, sometimes around the clock, even on days off spent at home with family and friends.
The programs scan students’ online activity, social media posts and more — with the goal of keeping them focused, detecting mental health issues and flagging potential for violence.
“You can’t unring the bell,” said Flanagan, who teaches social studies and economics. “Everyone has a device.”
But the new tracking technology has raised fears that some apps may single out minority students, that others have outed LGBT+ students without their consent, and that many are used more for discipline than for student safety.
That’s why Flanagan has parted ways with many of his colleagues and won’t use such apps to monitor his students online.
He recalled seeing a demonstration of one such program, GoGuardian, in which a teacher showed what a student was doing on his computer in real time. The child was at home on a day off.
Such scrutiny raised a major red flag for Flanagan.
“I have a school-issued device and I know there’s no expectation of privacy. But I’m a grown man – these kids don’t know that,” he said.
A spokesperson for the New York City Department of Education said GoGuardian Teacher is used “only to allow teachers to see what is currently on the student’s screen, provide refocusing instructions, and limit access to inappropriate content.”
GoGuardian, a company valued at more than $1 billion and one of a handful of high-profile apps on the market, currently monitors more than 22 million students in public school systems including New York City, Chicago and Los Angeles.
Globally, the education technology sector is expected to grow by $133 billion from 2021 to 2026, market researcher Technavio said last year.
GoGuardian said in a statement that parents expect schools to keep children safe in the classroom or on field trips, and that schools “have a responsibility to keep students safe in digital spaces and on school-issued devices.”
The company says it “gives educators the ability to protect students from harmful or explicit content.”
Today, online surveillance is “just part of the school environment,” said Jamie Gorosh, a policy adviser at the Future of Privacy Forum, a watchdog group.
And even as schools move past the pandemic, “it doesn’t look like we’re going back,” she said.
Guns and depression
A key aim of the monitoring is to keep students engaged in their schoolwork, but it also responds to fast-growing concerns over school violence and children’s mental health, which health groups declared a national emergency in 2021.
According to federal data released this month, 82% of schools train staff to recognize mental health issues, up from 60% in 2018; 65% have a confidential threat reporting system, an increase of 15% over the same period.
In a survey last year by the nonprofit Center for Democracy and Technology (CDT), 89% of teachers reported that their schools monitor students’ online activities.
Yet it’s not clear that the software will create safer schools.
Gorosh cited the May shooting in Uvalde, Texas, in which 21 people died at a school that had invested heavily in surveillance technology.
Some worry that tracking apps can actively cause harm.
The CDT report found, for example, that while administrators overwhelmingly say the purpose of monitoring software is student safety, “it’s far more often used for disciplinary purposes … and we see a disparity based on race,” said Elizabeth Laird, director of CDT’s Equity in Civic Technology program.
The programs’ use of artificial intelligence to scan for keywords has also outed LGBT+ students without their consent, she said, noting that 29% of LGBT+ students surveyed said they or someone they knew had experienced this.
And more than a third of teachers said their schools automatically alert law enforcement outside of school hours.
“The stated goal is to keep students safe, and we’ve created a system that streamlines law enforcement’s access to that information and gives them reasons to go into students’ homes,” Laird said.
A report by federal lawmakers last year on four student monitoring software companies found that none made an effort to find out whether the programs disproportionately target marginalized students.
“Students should not be surveilled on the same platforms they use for their schooling,” Massachusetts Sen. Ed Markey, one of the report’s co-authors, said in a statement to the Thomson Reuters Foundation.
“As school districts work to integrate technology into the classroom, we must ensure that children and teenagers are not preyed upon by a web of targeted advertising or intrusive surveillance of any kind.”
Earlier this year, the Department of Education committed to issuing guidelines on the use of AI.
A spokesman said the agency is “committed to protecting the civil rights of all students.”
Beyond the ethical issues surrounding the surveillance of children, many parents are frustrated by the lack of transparency.
“We need more clarity about whether data is being collected, especially sensitive data. At the very least, you should be notified and probably consent,” said Cassie Creswell, head of the Illinois Families for Public Schools advocacy group.
Creswell, who has a daughter in a Chicago public school, said several parents had received alerts about their children’s online searches, even though they had not first been asked about, or even told of, the monitoring.
Another child was repeatedly warned off playing a certain game — even though the student was playing at home on the family’s own computer, she said.
Creswell and others acknowledge that the problems monitoring aims to address — bullying, depression, violence — are real and urgent, but they question whether surveillance technology is the answer.
“When we’re talking about monitoring self-harm, is that the best way to approach the problem?” Gorosh said.
Pointing to evidence that AI is imperfect at picking up warning signs, she said increased funding for school counselors could be more narrowly tailored to the problem.
“There are huge concerns,” she said. “But technology may not be the first step to answering some questions.”