Dec 06, 2024
It was early in the school day when a 17-year-old gunman began firing into a classroom in the art complex of Santa Fe High School, roughly 30 miles southeast of Houston, in May 2018.
For about 30 minutes, he terrorized fellow students and teachers, killing 10 people and injuring 13 others, before surrendering to police and leaving the town of 13,000 to mourn.
Amid calls from parents to ensure students’ safety after the shooting, the Santa Fe Independent School District school board approved $2.1 million for security and building upgrades. That included the use of facial recognition technology capable of alerting officials if school cameras detected anyone who had been banned from district property. The school district in neighboring Texas City hired a former Secret Service agent to consult on security and likewise adopted facial recognition.
It’s the same technology that New York banned for use in schools in 2023 at the behest of student privacy advocates and parents.
While security companies — and some school districts — frame facial recognition as a powerful tool in preventing school shootings and saving lives, they are opposite a movement of students, technologists and civil rights advocates who see it as a dystopia-tinged addition to already heavily surveilled schools.
This past summer, a coalition of organizations held demonstrations against school-based facial recognition in four states and Washington, D.C. Fight for the Future, which advocates for online privacy protections, is among the groups that have united to pressure the U.S. Department of Education to officially recommend against the use of facial recognition in K-12 schools.
Caitlin Seeley George, campaigns and managing director at Fight for the Future, says that facial recognition technology companies began increasingly marketing their services to school districts during the COVID-19 pandemic as a means to monitor whether students were wearing face masks or to take attendance.
The expansion of facial recognition in schools is part of a “technosolutionism” belief that technology is the answer to any problem, she says, despite it being “clearly unnecessary.”
“The cost of expanding the use of this technology far outweighs the alleged benefits,” Seeley George says. “The impact on students in terms of erosion of privacy, the chilling effect that it can have, the potential to misidentify students and the way it gives a clear pathway from student behavior to discipline and punishment in the school-to-prison pipeline is too far a risk to take. That’s why we think students, teachers and staff should not be subjected to this surveillance technology, and it shouldn't be used at all.”
Clarence Okoh, senior associate at the Georgetown Law Center on Privacy and Technology, says that school surveillance companies tend to make marketing pushes after school shootings.
The school surveillance industry does an estimated $3.1 billion worth of business annually, he adds, and one poll found that more than 40 percent of teachers reported students being contacted by law enforcement at least once as a result of surveillance programs.
Okoh says that the practice of surveilling students — most commonly through programs that monitor what they type on school computers — in tandem with an increased law enforcement presence doesn’t lead to students being safer. Rather, its largest impact is sending more students through the juvenile justice system.
“Any conversation about safety that begins with surveillance or policing is beginning in the wrong place,” Okoh says. “I came out of law school suing police departments that were engaged in systematic rights violations. And one thing about the police is that they never want resources taken away, even if the resources aren't helpful, even if the resources are violating people's rights. So there is also a self-interest at play with surveillance technology.”
Technology made to detect e-cigarette or vape smoke in school bathrooms, for instance, could end with a student being cited by school police officers and referred to specialized teen vaping courts on charges of nicotine possession.
Why, then, is surveillance relied on so heavily as a school safety measure?
“I think the short answer is police are, in most communities, the most well-funded public service that's available,” Okoh says, “so in the absence of mental [and] behavioral health care, robust after-school programming, other things [that] keep young people safe, arts programming, actual social infrastructures for care — we turn to law enforcement because they're the only thing that's available.”
The campaign against facial recognition in schools gained steam last year, Seeley George says, when the Biden administration directed government agencies to develop policies on how artificial intelligence can or should be used within each department. It created an opportunity for the Department of Education to come out against facial recognition in schools, she says.
After the presidential election and the announcement of President-elect Donald Trump’s education secretary nominee, Seeley George wrote to EdSurge via email that “we still see a lot of work that state boards of education can do, including following the steps that New York has already taken, to protect students from surveillance technology like facial recognition.”
One voice that has too often been left out of the conversation around facial recognition’s use in schools is that of the students who are being monitored, says 17-year-old Jia, a high school senior in New York. (Jia requested to be identified by her first name only due to her parents’ concerns about her privacy.)
Jia joined protests this summer against facial recognition technology organized by Encode Justice, a youth-led nonprofit that advocates for privacy-centered policy on artificial intelligence.
While school districts are adopting facial recognition technology as a safety measure against school shootings, Jia says she feels its use creates fear among students.
“I know a lot of people who go to public schools who already have intensified surveillance technologies. In New York public schools, especially in certain districts, there are a lot of metal detectors, a lot of security around, and I think it creates a chilling effect,” Jia says, “where people feel like they aren't able to completely express themselves. It more feels like — I wouldn't say [like] prison — but very intense monitoring of people. I think also if you go to a school in a certain state where there are risks to your rights, like LGBTQ+ rights or freedom of speech, that is very scary as well.”
Jia says she has met students through Encode Justice who say they have been misidentified by facial recognition technology in their schools and were sent to the principal’s office for discipline.
As a Black and Asian girl, she says stories of Black people being misidentified through facial recognition cameras — like when facial recognition software mistakenly led to the arrest of a pregnant Detroit woman in a carjacking case — make the technology’s use feel unsafe.
Seeley George, of Fight for the Future, likewise says students she’s talked to are skeptical that facial recognition technology improves their safety.
“Especially for kids who are in school now, and who have grown up using technology, they understand that there are negative impacts to a lot of technology in our day-to-day life,” Seeley George says. “It wasn't so long ago that people were posting on social media without thinking that future potential employers will be reading what you post, and now that's a fairly common practice. Now students are thinking, ‘Is it possible that a future employer will have access to video footage of me walking through high school or me in one of my classrooms looking bored out the window?’”
After the shooting at Santa Fe High School, parents packed school board meetings urging the district to increase safety measures. Some had lost children in the shooting, and others had received goodbye text messages from those among the school’s roughly 1,400 students. (Parents of the now-23-year-old suspect, who is being held at a state mental health facility, were recently found not liable in the shooting.)
Santa Fe Independent School District purchased facial recognition technology as part of a security overhaul the following year. It employed the technology for four years, until costs led the district to end the service.
Ruben Espinoza, chief of police for Santa Fe ISD, says he would have continued the use of facial recognition technology if the budget had allowed and would recommend it to every school district.
The system worked by first allowing the police department to create a “photo bank” with images of people who were not allowed on school district property. The facial recognition software then compared the faces of everyone seen on its cameras against that photo bank and could alert personnel like Espinoza when a banned person was detected.
Espinoza says facial recognition technology practices at school districts should ensure that data isn’t stored beyond the time it takes for the system to determine if a person is in the “photo bank” or not.
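To make the workflow Espinoza describes more concrete, here is a minimal, hypothetical sketch in Python of a watchlist match-and-alert loop. It does not reflect the district’s actual vendor or software: the embed_face stub, the names, the threshold and the similarity measure are all invented for illustration, and the no-retention comment simply mirrors the practice Espinoza recommends.

```python
import math
from dataclasses import dataclass

# Hypothetical stand-in for a face-recognition model that maps a detected face
# to a numeric vector. A real deployment would use a trained model; this stub
# only exists so the sketch runs end to end.
def embed_face(image_id: str) -> list[float]:
    # Deterministic fake "embedding" derived from the string, illustration only.
    return [(hash(f"{image_id}:{i}") % 2001 - 1000) / 1000.0 for i in range(32)]

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

@dataclass
class WatchlistEntry:
    name: str
    embedding: list[float]

# The "photo bank": embeddings of people banned from district property.
watchlist = [
    WatchlistEntry("banned_person_1", embed_face("banned_person_1")),
    WatchlistEntry("banned_person_2", embed_face("banned_person_2")),
]

MATCH_THRESHOLD = 0.95  # purely illustrative; tuned per deployment in practice

def check_frame(camera_face_id: str) -> None:
    """Compare one detected face against the photo bank and alert on a match.

    Nothing is stored afterward, mirroring the no-retention practice described.
    """
    probe = embed_face(camera_face_id)
    for entry in watchlist:
        if cosine_similarity(probe, entry.embedding) >= MATCH_THRESHOLD:
            # In practice the alert goes to a human reviewer for confirmation.
            print(f"ALERT: possible match with {entry.name}; human review required")
            break
    # The probe embedding goes out of scope here; no image or vector is kept.

check_frame("banned_person_1")   # identical fake embedding, so this should alert
check_frame("unknown_visitor")   # unrelated string, very likely below threshold
```

The human-review step in the sketch corresponds to Espinoza’s point below that an alert is only a prompt for a person to validate the match before anyone acts on it.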
To give a sense of the technology’s capabilities, Espinoza says a photo of him as a 21-year-old newly minted officer was one of the images used to test the system when it was first installed.
“It used a photograph that was 30 years old, and it still recognized me, so that's how confident I am in the system,” he says. “Am I saying that it's perfect? No, but if it does alert, you still need that human element to look at it to confirm the alert. We have to get someone to look at that alert, validate whether that's the same person, and then act accordingly.”
The facial recognition system pinged a few times but wasn’t involved in responding to any major incidents on school property during the four years it was used by the district, Espinoza says. He feels it was nonetheless an important tool, one that is “mischaracterized by opponents.”
“Were there major incidents involving weapons or anything like that? No, but these are all preventative methods,” he says. “The best way to stop an active shooter event is to be proactive, to prevent it to begin with. I can sit here and tell you how many incidents where we captured somebody, but we can't measure how many crimes we actually prevented.”
Espinoza hopes the federal government will eventually help remove the financial burden of facial recognition by making grant funding available to pay for it.
The district couldn’t afford to replace all its security cameras with those capable of facial recognition but chose strategic locations for those that were installed, Espinoza says. Even so, the $1,800-per-camera annual licensing cost eventually put the technology out of the district’s reach.
Corey Click, interim technology director at Santa Fe ISD, says he wishes facial recognition were more affordable for school districts: “This is merely a high-powered tool that could be used on any level — in a drug deal or a vandalism or anything — to help identify something quickly to resolve an incident or an investigation.”