Nebraska becomes the latest to sue Roblox alleging child safety failures

Nebraska became the latest state to sue Roblox on Wednesday, alleging that the popular gaming platform exploits children and misrepresents its safety practices.

The complaint, which joins similar lawsuits filed by at least six other states, accuses the company of allowing direct contact between minors and adult predators and of permitting sexually explicit, violent and otherwise age-inappropriate content.

Since it launched in 2006, Roblox has evolved into a sprawling social gaming ecosystem with a vast catalog of virtual user-made “experiences,” most of which include real-time messaging. The platform reports having more than 151 million daily active users, and it described itself in its latest shareholder letter as “the largest dedicated gaming platform for users aged 13 and under.”

Nebraska’s lawsuit, which Attorney General Mike Hilgers filed in Adams District Court, says sexual predators have “repeatedly used Roblox to groom and eventually abduct and sexually assault children.”

“For years, Roblox has known that it has a pedophile problem,” the lawsuit says. “It built an extremely popular interactive gaming and social media platform by doing what other social media companies would not: marketing to and accepting pre-teenage users. As a result, Roblox became, in its own words, the ‘#1 gaming site for kids and teens.’”

Here’s what else to know about the growing concerns surrounding Roblox.

What Roblox has said about the allegations

Matt Kaufman, Roblox’s chief safety officer, pushed back against the claims in the latest Nebraska lawsuit, writing in an email that the platform “is built with safety at its core, and we strengthen our protections every day.”

“While we share Attorney General Hilgers’ commitment to keeping kids and teens safe online, we are disappointed that he has filed a lawsuit that fundamentally misrepresents how Roblox works,” Kaufman wrote.

Roblox prohibits image and video sharing between users, uses “rigorous filters” to prevent sharing personal information and enforces age-based settings allowing younger users to chat only with peers, he wrote.

The company also works closely with law enforcement to “support investigations and help hold bad actors accountable,” Kaufman said.

This year, Roblox began requiring all users to undergo facial age checks through the third-party identity verification company Persona before using the chat function. Users are then permitted to chat only with others in their age groups.

Roblox also began offering parental controls in late 2024, enabling parents to block specific games or users, choose whether their children can chat with others and set screen time and spending limits.

Though parents need Roblox accounts to manage the full suite of controls for their children, the company says that unless a parent enables the function, children under 13 are automatically blocked from direct messaging and children under 9 are automatically blocked from in-game chats.

Roblox has rolled out updates to make its safety protocols more robust in recent years, but Nebraska’s and other states’ lawsuits argue that current measures are not enough.

“Even with parental oversight settings enabled, parents continue to lack visibility as to who their child is messaging and what the messages say, leaving unaddressed Roblox’s fundamental deficiency that facilitates grooming and predation on children—adult access to and communication with children,” Nebraska’s complaint says.

Roblox faces additional litigation and scrutiny

Aside from Nebraska, states that have sued Roblox on issues of child safety include: Louisiana, Kentucky, Texas, Florida, Iowa and Tennessee. Los Angeles County also filed its own lawsuit last month, claiming Roblox fails to protect children from predators and inappropriate content.

Roblox is also at the center of dozens of individual lawsuits that have been consolidated into multidistrict litigation in U.S. District Court for the Northern District of California. As of early this month, more than 130 lawsuits had joined the group litigation.

The lawsuits allege that children around the country were groomed by predators on Roblox, including cases in which victims died by suicide or were sexually assaulted in person after being lured from their homes.

Dozens of people in the U.S. have been arrested on charges of abducting or sexually abusing children they had groomed on Roblox, according to Bloomberg, which compiled data on the arrests in 2024.

Despite the platform’s chat restrictions and its ban on user-to-user image and video sharing, multiple lawsuits have detailed how predators used Roblox to lure kids onto Discord, where they could message more freely. Several lawsuits name Discord as a defendant alongside Roblox.

(Discord previously told NBC News that it does not comment on legal matters but said it is “deeply committed to safety.”)

Roblox has also struggled to police inappropriate user-made games. Sexually explicit games, popularly referred to as "condo" games, occasionally surface on the platform despite being strictly prohibited.

Multiple lawsuits against Roblox have also pointed to games like “Escape to Epstein Island,” “Diddy Party” and “Public Bathroom Simulator Vibe,” which allowed users to simulate sexual activity in virtual bathrooms.

Oklahoma and South Carolina have also initiated investigations into Roblox, paving the way to potentially file their own lawsuits.

‘Families want their cases to be heard in court’

Last month, more than 800 parents from 48 states sent an open letter asking Roblox's and Discord's boards of directors to stop trying to force lawsuits into private arbitration, a confidential process overseen by a neutral third party that would take the cases out of court.

“Secret arbitration only protects corporations, not kids,” Pat Huyett, partner at Anapol Weiss, the firm coordinating the letter, said in a statement. “Families want their cases to be heard in court so issues can be addressed openly in order to prevent this from happening to more children.”

In late February, San Mateo County Superior Court in California allowed a case to “proceed in open court rather than confidential arbitration,” Anapol Weiss said in a news release. The case involves “the grooming, kidnapping, and sexual assault of a 10-year-old girl.”

The decision could have broader implications for how arbitration clauses are applied in cases involving minors. The ongoing lawsuits, none of which have gone to trial yet, have the potential to reshape the immunity that platforms have traditionally enjoyed under Section 230 of the Communications Decency Act of 1996.
