Reddit is a social media platform that’s been home to many a viral moment and brilliant AMA (Ask Me Anything) conversations, but it’s also known as a place where MRAs (men’s rights activists) and other disturbed harassers like to hang out. Now, the site has taken its first steps to ban abusive subreddits that promote hate.
By rehiring cofounder Alexis Ohanian and appointing Ellen Pao, a new feminist tech heroine, as interim CEO, the company had already signaled that its days as a friendly home to hate were largely over. Reddit had previously banned sexually explicit images posted without the subject’s consent, and in May it banned harassment, which the site defined as follows:
Systematic and/or continued actions to torment or demean someone in a way that would make a reasonable person (1) conclude that reddit is not a safe platform to express their ideas or participate in the conversation, or (2) fear for their safety or the safety of those around them.
The company’s blog post defining harassment stated that the rules referred to attacks on “people, not ideas.” Today, it followed through on that promise by announcing the shuttering of five subreddits whose deeply offensive names (seriously, be warned) describe their content: neofag, shitniggerssay, transfags, hamplanethatred, and fatpeoplehate, the last of which had already attracted a good deal of ire. In today’s announcement, Reddit made it clear that other bans could follow:
“We want to be open about our involvement: We will ban subreddits that allow their communities to use the subreddit as a platform to harass individuals when moderators don’t take action. We’re banning behavior, not ideas.”
But note that, at least for now, ideas are not what’s being banned. So-called free speech warriors want the site to be an unrestricted stomping ground for all ideas, even the abhorrent ones. But John Stuart Mill’s principle holds fast on the Internet: one person’s freedom reaches its boundary where it directly impinges on another person’s freedom. And if someone cannot sign on to Reddit because hate groups flourish there, then they have no freedom of speech on the platform. Certainly, not everything offensive or mean or ignorant or bigoted can be taken down. But places that exist solely to target individuals, with no accountability, don’t need to be given a home. I’ve received my share of Twitter harassment, and even so I can think of almost no one worth banning; but the Chuck Johnsons of the world do deserve to be booted once they start using social media as a forum for their ominous threats.
There are several issues that make this step imperfect. It will be crucial for Reddit to maintain constant transparency around exactly what constitutes harassment, and to apply a clear, uniform rule so that no impression arises of favoring one cause over another. It’s also true that the unrepentant fat-shamers, racists, rape apologists, and general haters will simply find somewhere else to go now that their playground has been shut down, or at the very least given some rules. But social media spaces don’t exist to rid the world of its evil people. They exist to give the majority of their users a space safe enough that they can benefit from the service without feeling directly harassed. And banning these poorly moderated hate forums is largely a good step.
In some ways, the issue facing Reddit is one that’s affected many segments of life, online and off: free speech vs. curtailing harassment and threats. But in this case, it’s not disturbing ideas that are being banned but active harassment. With Gamergate and rampant online harassment already driving some people to change addresses or fear for their lives, the time to think seriously about these issues online was probably yesterday.