Pulling Back the Veil: Exposing Pernicious Uses of Facial Recognition Technology


Facial recognition is nothing new. Technology giants have been developing and implementing facial recognition for years; our iPhone lock mechanisms are proof of that. The potential uses for facial recognition are limitless, and many companies already wield the capability to create powerful tools. Several countries, enticed by the promise of such tools, have jumped at the opportunity to use facial recognition to bolster policing and security forces. The Chinese government serves as a primary example. Even in the United States, however, private companies have sold novel artificial intelligence ("AI") technologies to law enforcement agencies, which have leveraged them to surveil a largely unsuspecting American public.

The primary obstacle these companies face in deploying their technology is privacy. Several facial recognition-based software companies are currently under legal fire for privacy violations, and in many of these cases courts have legitimized privacy concerns by refusing to grant motions to dismiss. This year, Apple announced that it will begin scanning personal photo libraries on Apple devices to crack down on child abuse and pornography. Both domestic and international digital rights groups have raised concerns about shrinking privacy rights and the general public's lack of control over the matter. For a company like Apple, which has distinguished itself with a platform tailored to keep personal information private, the decision to roll out an invasive security feature without an opt-out option is bold. This move comes with a tradeoff: customers must now choose between the peace of mind that their data is for their eyes only (and whoever they choose to share it with) and the very different peace of mind that minors and other vulnerable technology users are protected against bad actors.

This note attempts to answer some of the questions that executives at Apple likely mulled over. What is possible with facial recognition? What are the privacy concerns of the general public? What should society's privacy concerns be? Most importantly, how much autonomy are consumers willing to sacrifice in the name of safety and security, efficiency, and a heightened e-commerce experience?


Keywords: Privacy, Technology, Facial Recognition, China, Artificial Intelligence, Facial Recognition-Based Software, Data, National Surveillance



Christopher Kim (Washington University in St. Louis)



All rights reserved
