While we celebrate Discord’s steps to block minors from exposure to pornography and other graphic content, this move does nothing to seriously address the exploitation and abuse happening on the “exclusively pornographic focused” servers, which includes child sexual abuse and objectification, non-consensually shared intimate material, revenge pornography, and more alarming behavior. It remains to be seen whether or not Discord can keep up with the tens of thousands of NSFW servers on the platform, and how Discord will enact and enforce this designation. And that’s assuming users are on board with self-moderation and properly tagging their NSFW servers, a problem Discord has yet to solve.

As one Discord representative admitted on Reddit, the company sees this more as a move to remain rated 17+ on the App Store than as a proactive decision to crack down on the exploitative and abusive content proliferating in many servers: all the doors are still there, just with a few more locks.

By continuing to allow this content to thrive on its platform, Discord is condoning and potentially profiting from such use. If the rumored sales pitch from Microsoft goes through, Discord stands to make billions of dollars while the men, women, and children being exploited simply move behind an arbitrary age-gate.

If Discord is serious about protecting the millions of children using its platform and removing the child sexual abuse materials, hardcore pornography, and non-consensually shared pornography it hosts, we encourage improvements that go beyond this age-gate. Now, please work on eliminating CSAM and non-consensually shared material on your site.