Surveillance Is a Poor Substitute for Doing the Real Work of Crime Prevention

Biometric data captured in surveillance footage, especially of innocent bystanders, may cause far greater harm to all of us in the future.
Artificial Intelligence security cameras using facial recognition technology are displayed at the 14th China International Exhibition on Public Safety and Security at the China International Exhibition Center in Beijing on Oct. 24, 2018. Nicolas Asfouri/AFP via Getty Images
Shannon Edwards
Commentary
Over the past few weeks, cities with sky-high crime rates and embarrassing levels of recidivism have trotted out shiny new surveillance tools and technology intended to create the impression that elected officials are doing the work of fighting crime. But what these announcements—and the subsequent coverage—overlook is how the biometric data captured in surveillance footage, especially of innocent bystanders, may cause far greater harm to all of us in the future.
For California Gov. Gavin Newsom, the announcement was of 480 “high-tech” cameras to be added to city streets and freeways in Oakland and the East Bay. In a city plagued by a 38 percent increase in robbery from 2022 to 2023 (with 50 percent of those robberies conducted with a firearm) and nearly as many cars stolen in the city of 430,000 as in all of New York City in 2023 (14,826 versus 15,795), it may seem a reasonable effort. But further down in the statement, the truth begins to rear its head: the governor is also sending prosecutors to Alameda County to “assist” in prosecutions—a seemingly tacit admission that watching video of criminals racing away from a crime scene is far less effective than putting them in jail.
And in New York, Mayor Eric Adams has touted in recent weeks the start of a pilot facial-recognition program for bodegas (first announced last year), as well as the addition of metal detectors to subway stations around a city blighted by transit crime. But again, the truth is hard to avoid, with the mayor also highlighting the city’s recidivism problem: he noted in a recent interview that 38 people who assaulted transit workers this year had a combined total of 1,126 previous arrests, and that 542 people arrested for shoplifting had been nabbed 7,600 times previously for the same offense.
On the surface, of course, it may seem clever to deploy new technology in the fight to support small-business owners exhausted by theft, abuse, and economic ruin, or to appease residents anxious about unpredictable violence in their neighborhoods. But when you consider the recidivism numbers alone, capturing biometric data, including that of bystanders, seems a reckless gamble.
Biometric data is incredibly valuable and is, of course, used to open our phones, verify our identity when banking, prove immigration status, and bypass long security lines at airports. The sensitivity of these uses alone should set alarm bells ringing whenever we hear of casual biometric collection by any public or private entity.
Facial recognition innovation is big business and is being used for everything from securing account access to identifying employees, confirming the age of children (as part of social media legislation popping up across the United States), and even offering store owners the opportunity to sweep in and care for VIPs who walk through their doors unannounced (a feature of Amazon Web Services’ “Rekognition” product).

This is not to say that facial recognition technology cannot yield applications that are useful and meaningful to all of us. But the value of our data means that we should not be handing it over so easily, and we should be appalled that it is being taken without debate or our full consent.

We need only look at China to see how surveillance can not only limit freedom but also fuel a lucrative facial recognition industry. Our government has warned for years of the risk that video and image data collection poses, especially as it relates to the creation of “deep fakes.” A report that Homeland Security released in 2021 is chilling in its detail of the various potential uses.
There is some good news here, as surveillance—and the dangers of facial recognition technology—is an issue that unites Americans across the political divide. Some of the most ardent objectors to surveillance technology come from groups on the far left as well as the far right. And while organizations such as the American Civil Liberties Union often rightly focus on the inequitable use of facial recognition technology as it relates to crime and unfair racial targeting, the issue is far more expansive, and it would be useful for all organizations in opposition to make this clear.
Even our national intelligence agencies have warned that U.S. adversaries see the data of Americans as a “strategic resource” that can be weaponized. So it stands to reason that we should be protecting our information and taking the implications of its use far more seriously. And right now, surveillance deployed as unproven theater rather than as sound public safety policy doesn’t pass muster.
Views expressed in this article are opinions of the author and do not necessarily reflect the views of The Epoch Times.
Shannon Edwards
Author
Shannon Edwards is an entrepreneur, consumer technology trends and policy expert, digital marketer, and journalist. She has led startups globally and has served for years as a media go-to source on global tips, trends, and consumer advocacy.