Discord has become the subject of a new lawsuit filed by the state of New Jersey, alleging that the chat platform engages in “deceptive and unconscionable business practices” that endanger younger users.
This legal action, submitted on Thursday, follows a multi-year inquiry by the New Jersey Office of Attorney General. The Attorney General’s office claims it has found evidence suggesting that, despite Discord’s stated policies for protecting children and teens, the app exposes youth to harm.
Attorney General Matthew Platkin disclosed to WIRED that New Jersey is the first state to pursue legal action against Discord. Platkin identified two main incidents that prompted the investigation. The first was personal: several years ago, Platkin learned that a family friend was shocked to find his 10-year-old son could register on Discord, even though the platform prohibits users under the age of 13.
The second catalyst was the mass shooting in Buffalo, New York. The shooter reportedly used Discord to document his plans leading up to the attack and streamed the event live on the platform, although the footage was promptly removed.
Platkin criticized companies for prioritizing profit over the welfare of children. According to the lawsuit, Discord has violated New Jersey’s Consumer Fraud Act by failing to enforce its policies intended to protect users under 13 from sexual exploitation and violent content. This lawsuit adds to a series of legal challenges from various states against major social media companies, which have so far struggled to achieve significant results.
Discord’s safety policies explicitly bar users under 13 from the platform and prohibit both sexual interaction with minors and youth self-endangerment. The company employs algorithmic filters to block unwanted sexual messages. Discord, headquartered in California, published a safety policy in 2023 highlighting its commitment to creating a fun and safe space for teens.
However, New Jersey challenges the effectiveness of Discord’s promises. The attorney general highlights that Discord offers three safety settings meant to shield youth from inappropriate and exploitative messages: “Keep me safe,” which scans all incoming messages; “My friends are nice,” which leaves messages from friends unscanned; and “Do not scan,” which scans no messages at all.
The lawsuit contends that the platform defaults teenage users to the “My friends are nice” setting, which the attorney general argues puts younger users at risk. It also criticizes Discord for not implementing age verification to keep children under 13 off the platform.
In 2023, Discord introduced additional filters to identify and block inappropriate sexual content. The Attorney General’s office contends, however, that the company should have made “Keep me safe” the default setting.