Snapchat is a primary platform for online predators who use “sextortion” schemes to coerce minors into sending graphic images and videos of themselves, then use the explicit material to blackmail them, according to a lawsuit filed by New Mexico.

Attorney General Raúl Torrez announced Thursday that his office has taken legal action against Snapchat parent company Snap following a months-long investigation.

Torrez alleged that Snapchat — a photo-sharing app wildly popular with teens and young users that is known for messages that disappear in 24 hours — has policies and design features that facilitate the sharing and distribution of child sexual exploitation material.

“Snap has misled users into believing that photos and videos sent on their platform will disappear, but predators can permanently capture this content and they have created a virtual yearbook of child sexual images that are traded, sold, and stored indefinitely,” Torrez said in a statement.


The investigation by New Mexico’s Department of Justice used a decoy Snapchat account impersonating a 14-year-old girl named “Heather,” which exchanged messages with an account called “child.rape” and others with explicit names.

Investigators also found 10,000 records related to Snap and child sexual abuse content on dark web sites, with Snapchat described as “by far the largest source of images and videos among the dark web sites investigated.”

A spokesperson for Snap said the company “received the New Mexico Attorney General’s complaint” and was “reviewing it carefully.”

Snap “will respond to these claims in court,” the spokesperson told The Post, adding: “We share Attorney General Torrez’s and the public’s concerns about the online safety of young people and are deeply committed to Snapchat being a safe and positive place for our entire community, particularly for our younger users.”

The company rep said it has been “working diligently to find, remove and report bad actors, educate our community, and give teens, as well as parents and guardians, tools to help them be safe online.”

“We understand that online threats continue to evolve and we will continue to work diligently to address these critical issues. We have invested hundreds of millions of dollars in our trust and safety teams over the past several years, and designed our service to promote online safety by moderating content and enabling direct messaging with close friends and family,” the spokesperson said.


Last December, New Mexico filed suit against Meta Platforms alleging that its social media networks Facebook and Instagram failed to protect underage users from adult sex content and disturbing messages from alleged child predators — including “pictures and videos of genitalia” and six-figure offers to star in porn movies.

State investigators used a similar playbook in investigating Meta. They set up test accounts on the Meta-owned social media sites for four fictional children using AI-generated photos that purportedly portrayed children age 14 or younger.

“Meta has allowed Facebook and Instagram to become a marketplace for predators in search of children upon whom to prey,” according to the complaint, which blasted the tech giant founded by Mark Zuckerberg as engaging in conduct that was “unacceptable” and “unlawful.”


Meta has argued that it makes extensive efforts to protect young users from harm.

“We use sophisticated technology, hire child safety experts, report content to the National Center for Missing and Exploited Children, and share information and tools with other companies and law enforcement, including state attorneys general, to help root out predators,” Meta said in a statement.

In July, Meta said it removed about 63,000 Instagram accounts in Nigeria that were attempting to run “sextortion” scams aimed mostly at adult men in the US, with some also targeting children.

Within those 63,000 accounts, Meta said it identified a network of 2,500 accounts run by a group of 20 individuals who “targeted primarily adult men in the US and used fake accounts to mask their identities.”

Additional reporting by Thomas Barrabi and Taylor Herzlich
