Before it shut down on Thursday, Omegle — a platform that paired strangers in video chats — paired a lot of children with predators.
It did so for more than a decade, drawing scores of lawsuits over alleged child grooming on the app. In one case, still being investigated by federal authorities, a Norwegian woman claimed she’d met a man on Omegle when she was just 14 and that the relationship led to her physical abuse. In another, from 2022, the FBI investigated a tip from a user who’d been paired with a stranger who asked if she wanted to see child sexual abuse material—a stash of 20 illegal videos he said he got “from trading on Omegle.” The stranger later pleaded guilty to possessing child sexual abuse material (CSAM) and was sentenced this October to 42 months. That same year, Omegle reported more than half a million cases of CSAM to the nonprofit National Center for Missing and Exploited Children, which passes on tips to U.S. law enforcement. That was more than the number reported by other major platforms like TikTok, Snapchat and Discord.
Such rampant criminality, alongside related lawsuits and criticism from child protection organizations, led Omegle founder Leif K-Brooks to announce he was shutting down the app. He did not respond to requests for comment, but in a swan song posted to the website Thursday, he said: “Omegle punched above its weight in content moderation, and I’m proud of what we accomplished.”
It’s been 14 years since K-Brooks launched Omegle as an 18-year-old. The site long operated under the tagline “Talk To Strangers,” promising to provide a safe environment for random users to connect and chat. But his memo Thursday admitted that Omegle had hosted criminality. “There can be no honest accounting of Omegle without acknowledging that some people misused it, including to commit unspeakably heinous crimes,” he wrote. He added, though, that he had been unduly attacked over the policing of the site—to the point where the “stress and expense” were “simply too much.” (K-Brooks was named to the Forbes 30 Under 30 list for his work on a different company, Octane AI.)
It was not immediately clear whether the shutdown was related to an ongoing case against Omegle, for which a decision is due out any day. In the suit, filed in 2021, a 13-year-old named C.H. alleges she was sextorted by abusers she met on Omegle when she was just 11. She used the app for the first time during the pandemic on a computer given to her by her school, one of her lawyers, Hillary Nappi of Hach Rose Schirripa & Cheverie LLP, told Forbes on Thursday. On the girl’s second click, the app allegedly paired her with a person who pressured her to perform sexual acts, threatening her family if she refused, according to Nappi.
“It took me five minutes to tell my mom what Omegle did to me,” C.H. said this week through her lawyers at Hach Rose Schirripa & Cheverie and Marsh Law Firm. “It took you [Brooks] multiple years and countless other victims before you would even speak up about the online Frankenstein you created, and you still only care about yourself.” K-Brooks didn’t respond to the allegations. Omegle’s lawyers had argued that C.H.’s case should not be heard because the app was protected by Section 230, which shields tech companies from some legal liability for content their users post. A judge denied Omegle’s motion, however, ruling that its matching system did not fall under Section 230, and the case proceeded.
Her lawyer at Marsh Law Firm, Margaret Mabie, said that while the child abuse problems plaguing Omegle can be found on almost any other platform, the legal team had discovered “a whole community of pedophiles that make branded videos of their experiences pairing with kids on Omegle and creating sexual material.”
C.H. “was so relieved and so grateful that that website was shut down,” Nappi said of her client, but “she was not the first child, and she was not the last” exploited through this platform.
Announcing the app’s shuttering on Thursday, K-Brooks wrote that the company had invested heavily in countering child exploitation. As well as having humans review content, he said, the venture used “state-of-the-art AI” to find and eradicate CSAM, ultimately reporting it to NCMEC.
But critics said those efforts didn’t go far enough. The Canadian Centre for Child Protection noted that when Omegle introduced age verification last year, it required only the click of a box to confirm a new user was 18 years old (other sites, including pornography providers like Pornhub, do little more). The group also provided Forbes with recent conversations about Omegle discovered on dark web child exploitation sites. In one, a user wrote, “Can’t believe omegle is gone – met a young American girl around 11-years-old a few nights ago on that site.” And it said it had seen hundreds of videos of children as young as 8 who’d been coerced by adult men into performing sex acts on camera.
“Omegle was the perfect storm for online sexual violence against children and youth,” the Canadian group said in a statement. “The site regularly paired children with adult strangers, lacked any meaningful age verification or moderation and continually exposed kids to sexual content. The fact that this site existed in the first place is the real problem.”
Lina Nealon, vice president at the National Center on Sexual Exploitation, said “Omegle’s entire business model was reckless” and that “[its] demise should serve as a warning to other online platforms that companies facilitating sexual exploitation have no business existing.” The organization pointed to data from Thorn, an anti-CSAM nonprofit founded by Ashton Kutcher, showing that in 2021, one in five minor users of Omegle reported having had an online sexual interaction with an adult, compared with one in ten on other major social apps, including Facebook, Instagram, Kik, Snapchat, Tumblr and Telegram.
Mabie, the Marsh Law Firm attorney representing 13-year-old C.H., noted that these problems aren’t Omegle’s alone. “The harms that Omegle caused are still going to happen, even if Omegle isn’t there,” she told Forbes.
“These pedophiles will just find a new place to aggregate online; it’s not going to change the behavior,” she said, citing the possibility of spillover to other platforms like Kik, Instagram, Signal and WhatsApp. “What needs to happen is an industry-wide change—we need to start taking this seriously and pass legislation to help enable kids to hold these companies accountable.”