
By Movieguide® Contributor
The National Center on Sexual Exploitation (NCOSE) is warning parents about the popular chat platform Discord.
“Discord proudly touts 150 million monthly active users, but what they don’t want to brag about is their four-year residency on the Dirty Dozen List,” NCOSE reported.
Discord is a popular platform among predators, who use the messaging service to coerce minors into sending sexually explicit images, as well as to connect with other predators to trade CSAM (child sexual abuse material). The platform is also being used to create and trade AI-generated pornography.
So why have Discord’s safety measures done nothing to stop the rampant spread of CSAM and other pornography on its site?
“Even though Discord claims to have made changes to prevent exploitation, these policies are merely performative,” NCOSE reported, saying that it and other child safety experts “have proven these safety changes to be defective.”
For example, Discord’s “Teen Safety Assist” feature is supposed to automatically blur sexually explicit content. However, when NCOSE tested the feature, its researchers were still able to send unblurred sexually explicit content to an unconnected teen account.
“NCOSE reached out to Discord multiple times in late 2023 offering to share our test results of their new ‘safety’ tool for teens and send them evidence of activity that showed high indicators of potential CSAM sharing,” it reported. “We never heard back.”
The organization added that Discord’s CEO, Jason Citron, “has continuously lied to Congress and the American people,” pointing to a 2024 US Senate Judiciary Committee hearing where he testified that “safety is built into everything we do [at Discord].”
“We have a zero tolerance policy on child sexual abuse material or CSAM. We scan images uploaded to Discord to detect and block the sharing of this abhorrent material,” Citron testified.
Related: Landmark Lawsuit Could Take Down AI Porn Industry — Here’s How
He also claimed that Discord’s “Teen Safety Assist” feature works — despite NCOSE proving that it does not.
“Tech companies frequently declare that they value the safety of their users. However, when this lack of safety is what’s making them money, suddenly, safety doesn’t seem so important to them anymore,” NCOSE concluded.
Discord is also currently facing a lawsuit alleging that it and the popular gaming platform Roblox facilitated the sexual exploitation of children online.
In the lawsuit, filed in San Mateo County, the 13-year-old male plaintiff claimed he was targeted by a man who “was already facing criminal charges for sexually exploiting another child and [authorities] now believe that he exploited at least 26 other children,” per CBS News.
The suit also alleges that the plaintiff’s father allowed him to use these sites “because he trusted Defendants’ representations that their apps were safe for children to use.”
Read Next: Roblox Isn’t Safe for Your Kids. In Fact, It’s a ‘Pedophile Hellscape’