Facial recognition firm Clearview AI took steps to dramatically expand its surveillance capabilities by attempting to purchase hundreds of millions of arrest records containing sensitive personal information, including Social Security numbers and mugshots, according to documents reviewed by 404 Media.
The controversial company, already notorious for amassing over 50 billion facial images scraped from social media platforms, signed a contract in mid-2019 with Investigative Consultant, Inc. to acquire roughly 690 million arrest records and 390 million arrest photos from across all 50 U.S. states.
“The contract shows that Clearview was trying to get social security numbers, email addresses, home addresses, and other personal information along with the mugshots,” said Jeramie Scott, Senior Counsel at the Electronic Privacy Information Center, or EPIC.
The ambitious data grab ultimately fell apart, spiraling into legal battles between the two firms. Clearview shelled out $750,000 for an initial data delivery but declared it “unusable,” triggering mutual breach of contract claims. Despite an arbitrator ruling in Clearview’s favor in December 2023, the company hasn’t recouped its investment and now seeks a court order to enforce the arbitration award.
Facial Recognition: Privacy Implications and Algorithmic Bias
Privacy watchdogs warn about the troubling implications of merging facial recognition technology with criminal justice data. Scott pointed out that linking individuals to mugshots and related information can fuel bias among human reviewers using the system. “This is especially concerning given that Black and brown people are overrepresented in the criminal legal system,” Scott emphasized.
Facial recognition systems have repeatedly come under fire for their well-documented failures when identifying people with darker skin tones. The consequences have been severe: in multiple cases across the United States, innocent people have been wrongfully arrested based on faulty matches produced by facial recognition technology.
As a digital forensics expert, I have seen facial recognition technology fail firsthand with real consequences. I was retained on a criminal defense case where authorities accused the defendant of using a rental truck to commit a felony. Their entire case hinged on a single facial recognition match from surveillance footage.
In my investigation, I uncovered irrefutable evidence of innocence. Cell phone data placed the defendant miles from the crime scene during the critical timeframe. The technology that triggered his arrest had completely misidentified him.
This wasn’t merely a technical glitch but a life-altering ordeal for someone who faced serious criminal charges based on algorithms that proved unreliable. What’s particularly troubling is how quickly investigators accepted the facial recognition result without pursuing basic corroborating evidence that would have immediately cleared the defendant.
Cases like this reveal the dangerous over-reliance on surveillance technologies within our criminal justice system. When companies like Clearview pursue even larger databases of personal information, they risk amplifying these failures at a scale that could affect innocent people.
Clearview AI: Regulatory Challenges
Clearview AI faces an intensifying barrage of legal obstacles worldwide. The firm recently celebrated a victory against a £7.5 million fine from the UK’s Information Commissioner’s Office, or ICO, successfully arguing it fell outside UK jurisdiction. Yet this represents merely one skirmish in a broader regulatory battlefield.
International regulators have slapped Clearview with multi-million dollar penalties for privacy violations, and the company recently received final approval for a settlement in which it surrendered nearly a quarter of its ownership stake to resolve alleged violations of biometric privacy laws.
Facial Recognition: Industry Context
Clearview AI’s business model revolves around selling access to its facial recognition technology, primarily targeting law enforcement agencies. The company boasts that its technology has helped crack cases ranging from homicides to sophisticated financial fraud.
While competitors like NEC and Idemia have built their market presence through conventional business development channels, Clearview stands apart and draws particular scrutiny for its aggressive approach of scraping billions of images from social media platforms without obtaining consent.
The revelation about Clearview’s attempted acquisition of sensitive personal data arrives as the facial recognition industry faces mounting pressure for regulation and transparency. As this powerful technology increasingly permeates law enforcement and private security operations, fundamental questions about privacy, consent and algorithmic bias continue to dominate public discourse.
Note: The case examples described are based on real events, but names, dates, locations, and specific details have been altered to protect client confidentiality while preserving the essential legal principles and investigative techniques.
404 Media reports that “ICI and Clearview did not respond to multiple requests for comment.” I have also requested comment and will update this article if and when I receive a response.