GALVESTON COUNTY, Texas – A federal lawsuit has been filed against the platforms Discord and Roblox on behalf of a Galveston County girl who was sexually assaulted by a man who groomed her using the platforms.
The lawsuit, filed by Anapol Weiss, comes after several similar lawsuits against the platforms in the cases of a 13-year-old New Jersey boy and a Florida girl.
“In each case, the minor was first groomed on Roblox and later coerced into sharing explicit content through Discord, a progression that mirrors a pattern of predatory behavior playing out across these platforms,” Anapol Weiss said.
In the case of the Galveston County girl, the law firm says the abuse escalated to an in-person meeting at her home, where the predator sexually assaulted the girl and recorded it on his cell phone.
“This young girl has suffered unimaginable harm,” said Alexandra Walsh, shareholder at Anapol Weiss. “Her life has been horrifically altered, and it never should have happened. Roblox and Discord prioritized profits over safety.”
The lawsuit alleges both Roblox and Discord knew of the dangers posed by their platforms and failed to implement the safety mechanisms necessary to prevent exploitation. Despite marketing themselves as child-friendly and safe, the companies allegedly prioritized user growth and engagement over meaningful protections.
“Tragically, what happened to her is far from an isolated event,” said Anapol Weiss shareholder Kristen Feden. The firm says it is currently investigating hundreds of similar cases involving minors who were targeted and abused through these online platforms.
“This lawsuit is a demand not only for justice, but for systemic change,” said Davis Cooper of Cooper Masterman. “With this lawsuit, our goal is to safeguard children by holding Roblox and Discord responsible for the dangers they’ve created and compelling them to implement user safety as a core priority.”
Government authorities are also taking action. The New Jersey Attorney General’s Office recently filed a consumer protection lawsuit against Discord, describing the platform as a “hunting ground for predators.” That legal action accuses Discord of knowingly allowing child sexual abuse material (CSAM) to be exchanged on its servers and failing to take appropriate steps to remove it or report it in a timely manner. Simultaneously, the State of Florida has issued a subpoena against Roblox as part of its own investigation into platform safety and corporate accountability.
“These interventions reflect a broadening consensus that the current regulatory environment is insufficient to protect children online—and that industry self-policing has failed,” said Walsh. “Swift action and oversight is needed to ensure that platforms popular among children are no longer havens for predatory behavior.”
Discord sent a statement on the lawsuit:
“We can’t comment on ongoing litigation and will respond appropriately through the legal process. Discord is deeply committed to the safety of our users and we take decisive actions when we detect violations of our policies, including removing content, banning users, shutting down servers, and engaging with law enforcement as appropriate.”
- We work hard to keep content that violates our policies off our platform and take action when we see it, which can include banning users, shutting down servers, and reporting violative accounts to NCMEC, which in turn works with local law enforcement.
- We use a mix of proactive and reactive tools to take action against content and activity that violates our Community Guidelines, from the use of advanced technology like machine learning models and PhotoDNA image hashing, to empowering and equipping community moderators to uphold our policies and providing in-platform reporting mechanisms to surface violations.
- Our multi-faceted and holistic approach to safety includes:
- Safety tools and resources that give users, parents, and educators control to customize their Discord experience for themselves or their teens.
- Our Teen Safety Assist initiative includes features specifically designed to help protect younger users, such as multiple safety alerts and sensitive content filters that are enabled by default for teen users.
- Our Family Center tool helps parents and guardians to learn more about how their teen spends their time on Discord and about the groups they are a part of. Family Center helps foster productive dialogue about safer internet habits, and to create mutually beneficial ways for guardians and teens to connect about experiences online.
- Our Guardian’s Guide empowers parents and guardians to understand Discord and how to use our features to set expectations and start conversations with teens about how they hang out online, form connections, and develop positive behaviors.
- We have settings that give users control over their experience on Discord including who they communicate with, what content they see, and what communities they join or create.
- Proactive detection and actioning of high-harm content:
- Image hashing and machine-learning powered technologies that help us identify known and unknown child sexual abuse material.
- Industry collaboration:
- Discord developed a new model for detecting novel forms of CSAM and has made this model open source, because we believe technology that improves online safety should never be one company’s competitive advantage, but should be shared for the common good.
- As part of our industry-wide efforts within the Tech Coalition, Discord supports the Lantern initiative, a first-of-its-kind signal sharing program for companies to enhance and strengthen how they detect attempts to sexually exploit and abuse children and teens online.
- We’re a founding member of ROOST, a cross-industry non-profit that addresses a critical gap in child safety and digital safety by providing free, open-source safety tools to organizations of all sizes.
- Partnerships with industry-leading organizations like the Family Online Safety Institute, Technology Coalition, National PTA, Digital Wellness Lab, INHOPE, and more.
- Safety Reporting Network that allows trusted partners to surface content and reports of violations directly to our Safety team.