
Built for Connection, Vulnerable to Exploitation: The Case of Discord

Posted: February 23, 2026

Discord began as a communication tool for gamers. Today, it has become a general-purpose messaging platform used by over 200 million people per month, including nearly 30% of teens. Although the platform’s Terms of Service require users to be at least 13 years old, its reliance on self-reported ages allows younger users to gain access. While Discord makes it easy to build communities and communicate in real time, those same features create opportunities for harm. In particular, Discord’s design choices have made it a powerful tool for grooming and sharing child sexual abuse material (CSAM), and the platform has failed to implement effective safeguards.

In the past year, several U.S. states and families have filed lawsuits alleging that Discord enabled adults to contact, groom, and sexually exploit children. In North Carolina, a family filed suit against Discord, claiming that a 20-year-old man targeted their 13-year-old daughter. The man allegedly first targeted the child on the gaming platform Roblox before moving the conversation to Discord, where he coerced her into sending explicit images of herself, sent her CSAM, and pressured her to harm herself. New Jersey Attorney General Matthew Platkin brought an action against Discord in April 2025 under the state’s Consumer Fraud Act, alleging that the platform’s default settings and lack of meaningful age verification exposed children to sexual exploitation and misled parents about Discord’s ability to keep minors safe. These cases highlight an issue that has persisted for years. In fact, the National Center on Sexual Exploitation (NCOSE) has placed Discord on its “Dirty Dozen” list for four consecutive years, citing the platform’s role in facilitating sexual exploitation and abuse.

Discord’s design makes it particularly well-suited to concealing and perpetrating exploitation. The platform’s invite-only servers are invisible to outsiders, which allows harmful conduct to go undetected unless someone reports it. Direct messages and private servers give predators even more privacy to isolate minors and accelerate grooming in spaces with no meaningful oversight. This lack of safeguards is especially problematic because predators often use youth-oriented gaming platforms like Roblox as a gateway: the majority of Roblox users are 16 and under, and age verification there is similarly easy to circumvent. Predators identify and befriend minors through online gaming sites before steering the conversation to Discord. They may use voice-altering technology to pose as the victims’ peers, or offer in-game currency or emotional validation to manipulate minor victims. Many cases follow the same pattern: initial contact is made on a gaming site, the conversation migrates to Discord, and private messaging then enables the coercion, manipulation, and sexual exploitation of minors.

While Discord claims to prohibit sexual exploitation, enforcement is largely reactive, depending on victims or their families to recognize and report harmful conduct. This approach is often ineffective at preventing child sexual exploitation because grooming typically begins with seemingly benign interactions that are not recognized as predatory until the exploitation has already advanced. Although Discord offers direct message filtering options for minors, teen accounts default to a setting that does not screen communications from users labeled as “friends,” requiring families to manually opt into stronger protections. This structure effectively shifts responsibility for safety from the platform to minors and their parents, who must recognize risks and navigate complex settings on their own. Moreover, because grooming often involves rapidly building trust and familiarity, predators may obtain “friend” status early in an interaction, limiting the effectiveness of the filtering protections.

Discord also relies on moderators appointed by server administrators to prevent harmful conduct in its servers. However, moderators and administrators are ordinary users, and Discord merely encourages moderation rather than requiring every server to have dedicated moderators. Without required moderation or training for moderators, abuse can go unchecked unless someone reports it or it is detected by Discord’s AI moderation system, which does not always assess content reliably. Lawsuits against Discord have highlighted this lack of prevention, with complaints alleging that Discord has long known how its platform is used yet continues to market itself as safe for teens without implementing meaningful safeguards.

Online platforms have a responsibility to implement safeguards that prevent exploitation, which requires more than content takedowns or reporting after harm has occurred. Meaningful reform would include default safety settings for minors, stronger age verification, limits on direct messaging between adults and minors, and professional moderators trained to detect grooming behaviors. Predators know and exploit the vulnerabilities in Discord’s structure, enabling harm at scale. Until platforms are required to address how their design can facilitate exploitation, private servers will continue to enable public harm.

This piece is part of our first-year law student blog series. Congratulations to author Schuyler Gebhardt on being chosen!

All views expressed herein are personal to the author and do not necessarily reflect the views of the Villanova University Charles Widger School of Law or of Villanova University.
