When school began in Lockport, New York, this past fall, the halls were lined not just with posters and lockers, but with cameras. Over the summer, the school district had installed a brand-new $4 million facial recognition system in the town’s eight schools, from elementary to high school. The system scans students’ faces as they roam the halls, looking for faces that have been uploaded and flagged as dangerous. "Any way that we can improve safety and security in schools is always money well spent," David Lowry, president of the Lockport Education Association, told the Lockport Union-Sun & Journal.
Two months before those students filed into their schools in Lockport, Taylor Swift performed a concert at the Rose Bowl, in California. Swift, too, turned to facial recognition for protection, reportedly relying on cameras in kiosks at the venue to scan for faces her security team had identified as potentially dangerous.
These two settings might seem pretty different—a vast concert venue in California and a small school district in upstate New York don’t have a ton in common (aside from perhaps a handful of Taylor Swift fans). But they’re emblematic of something happening more and more in the United States right now: people turning to surveillance, and specifically biometric surveillance like facial recognition technology, as a result of the country’s inability to take action to stop gun violence.
Both Swift and those students have reason to worry for their safety. The pop star has had to deal with multiple fans and stalkers who have threatened violence against her. Concert venues have been the site of gun violence before, most notably the Mandalay Bay shooting in Las Vegas. In a piece for Elle, Swift wrote, “After the Manchester Arena bombing and the Vegas concert shooting, I was completely terrified to go on tour this time because I didn’t know how we were going to keep 3 million fans safe over seven months. There was a tremendous amount of planning, expense, and effort put into keeping my fans safe.” (And there is a crucial link here between gun violence and gender. Fifty women in the United States are shot each month, on average, by an intimate partner, and American women are 16 times more likely to be shot than women in other high-income countries.)
Students are even more commonly the victims of gun violence in the United States, with so many examples I’m not even sure which ones to cite here—Sandy Hook Elementary, Marjory Stoneman Douglas High School, Columbine; the list is incredibly long. Both Swift and the students in Lockport know that walking into a concert or school is dangerous. And both are trying to solve the problem as best they can.
I don’t know how Taylor Swift feels about facial recognition, but the people of Lockport don’t love this solution. Jim Schultz, a parent in Lockport, told me that the town’s Facebook pages boiled with controversy as the camera system was being installed. Parents invoked Big Brother, spying, and privacy concerns. But advocates and even skeptics always wound up coming back to the same conclusion: We have to do something. The subtext: especially in Lockport, which sits firmly in Trump country, that something will never be gun control. (The Lockport system remains offline, after parents and the NYCLU convinced the New York State Department of Education to demand a “privacy assessment” of the setup.)
Despite the vast majority of Americans being in favor of gun control measures, the possibility of actual regulation can seem impossible to the point of bleak parody. In 2014, following the Isla Vista shooting, the Onion published an article with the headline “‘No Way To Prevent This,’ Says Only Nation Where This Regularly Happens,” which has been re-shared after nearly every mass shooting since—about 40 of them. America’s inability to make any progress on gun control, thanks largely to the efforts of the NRA, which spent $9.6 million on lobbying over the past two years, has forced everybody from Taylor Swift to parents in a small town in upstate New York to turn elsewhere for their safety. Very few people like facial recognition software, but it’s “better than nothing.” And nothing is what seems to be possible when it comes to limiting access to guns.
“It's really strange how many people put gun rights forward as the reason they stand against gun control legislation, but will in the same breath advocate for increased presence of facial recognition cameras and body scanners and gait recognition tools, as if that somehow weren't an imposition on our protection against unreasonable search and seizure,” says Damien Williams, a technology ethicist at Virginia Tech. “It seems to indicate that folks are perfectly willing to let go of some rights in the name of safety, just not the rights that would actually help curtail the danger at its source.”
The tradeoffs between surveillance and safety aren’t new. Americans have worried about phone tapping and video surveillance in the name of protection by the state for decades. After 9/11 there was plenty of ink spilled about these tradeoffs in the context of airport security and the merits of what some scholars call “security theater.” But most of these conversations center on surveillance by the state—the US government tapping your phones, or watching you on camera, or scanning your face in an airport. What Lockport and Taylor Swift are doing is almost the opposite—choosing to surveil their fans and students in the name of safety, precisely because the state won’t step in to help. Perhaps because they have no faith that the state ever will. "I feel like I can't go anywhere and feel like I'm safe. I walk into a library, I don't feel safe. I walk into a yoga studio, I don't feel safe. Walk into a bar, don't feel safe," FSU student Ellie Gensch told WCTV after the shooting at a Tallahassee yoga studio.
Polling suggests that in the past few years, Americans have become more worried about safety and less worried about privacy. Researchers are hesitant to ascribe any one cause to shifts like this, but I don’t think it’s absurd to suggest that this one is at least in part due to the continued string of mass shootings in the United States. Amid a backdrop of constant violence, what’s a little light surveillance? The threat of ubiquitous surveillance seems less real to many people than the threat of being gunned down at work, at a bar, at school, in a hospital, at a concert, or, really, anywhere.
And companies recognize this feeling. The facial recognition systems being sold to schools across the country are capitalizing on it, as are Nest cameras, Ring doorbells, and companies like Flock Safety that sell outdoor security cameras to homeowners associations. “We’ve seen that there are tech firms that are preying on the fears of school districts to get them to purchase technology that will not work and they do not need,” says Stefanie Coyle, education counsel at the New York Civil Liberties Union. This is a booming area of business because, even though these cameras don’t necessarily make anybody actually safer, they make people feel like something is at least being done in the face of all this danger. And the “something is better than nothing” line used to justify these surveillance systems is a dangerous line of reasoning—not just because it erodes privacy, and not simply because it offers a false sense of security, but also because it disproportionately targets nonwhite bodies.
Surveillance systems like the ones set up in Lockport and at the Rose Bowl rely on having “known actors” to identify—they look for faces in a database. Those databases have to come from somewhere, and are usually populated by mugshots from law enforcement. In the case of Lockport, the system will supposedly be “used to alert school officials if anyone from the local Sex Offenders Registry enters a school or if any suspended students, fired employees, known gang members or an affiliate enters a school.” When I asked KC Flynn, head of SN Technologies, which makes the Lockport system, where those “known gang members” might come from, he told me that it was up to individual schools to load those in.
Whatever databases these systems use, they almost certainly come with bias baked in. “As long as facial recognition, or gait analysis, or any other surveillance tech is trained, deployed, and operationalized in and through a society which is more likely to designate and view black and brown bodies and cultural markers as violent or suspicious, and white bodies and white Western cultural markers as ‘normal,’ then the people likely to be targeted by these systems are those who ‘fit the description,’” Williams says. In other words, the systems people are turning to are far more likely to tag a black man as “dangerous” than a white one, even though white men have committed more mass shootings than any other group. If the point of these technologies is to protect us from such shootings, they are certainly not the tool for the job. In fact, I would argue that they are probably worse than nothing.