Hacker News

Reddit is a crossroads: it's an intersection between the cultures of 4chan (which is itself an intersection of Japanese and American sensibilities), the culture of Usenet, internet forums, and a dozen other cultures besides.

None of these cultures handle censorship well. They all originated in an environment where, to some extent, you could say whatever the hell you like.

Many of Reddit's early users came from these cultures, and they were responsible for the early culture of the site.

And now, Reddit is desperately trying to adapt itself and attract people from Twitter, Facebook, and Tumblr, whose cultures are radically different, and perhaps to some degree less toxic than pre-September Usenet, while also being more toxic in other ways. I don't know how.

The point is, a culture that previously only dealt with unacceptability in relative terms - this is unacceptable in this context - is now dealing with absolute unacceptability - this is not acceptable, ever. This isn't a change that people will likely adapt to well. This is prompting a migration to sites like Voat, and others.

The problem is that Reddit is introducing incredibly inconsistent censorship to a site where the very concept of censorship is anathema. Bans, yes; people get punished for breaking the rules. But having your posts quietly vanish without warning?

No wonder the userbase is pissed.

Unless I got it completely wrong, which is possible.



> Bans, yes, people get punished for breaking the rules. But having your posts quietly vanish without warning?

This comment particularly stuck out to me.

Very few other communication platforms of Reddit's ilk will go through the process of shadowbanning users - that is, the user retains full functionality of the site, but their comments are visible only to themselves, not to other users. To an unknowing user, it appears no action has been taken on their account.

It is a shockingly effective means of silencing dissenters or those who disagree with the majority, and this punishment has extended far beyond those who speak abusively or offensively. The nefarious part is that it wastes the user's time as well.
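For concreteness, here is a minimal sketch of how this kind of visibility filtering might work. All names here are hypothetical; Reddit's actual implementation is not public:

```python
# Hypothetical sketch of shadowban-style filtering (not Reddit's actual
# code): a shadowbanned user's comments appear only in their own view of
# a thread, so they can't tell any action has been taken.
from dataclasses import dataclass

@dataclass
class Comment:
    author: str
    text: str

def visible_comments(comments, viewer, shadowbanned):
    """Return the subset of comments the given viewer should see."""
    return [c for c in comments
            if c.author not in shadowbanned or c.author == viewer]

thread = [Comment("alice", "hello"),
          Comment("troll", "buy my stuff"),
          Comment("bob", "hi")]
banned = {"troll"}
```

To everyone else the banned user's comments simply don't exist; to the banned user, the thread looks perfectly normal.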


As someone who's run a (small) forum, I completely understand the appeal of shadowbanning. It's incredibly effective if your goal is to make a community a usable place without spending most of your waking hours dealing with trolls and shitbags.

A single person can eat up hours of your time if you attempt to reason with them. They'll create new accounts, they'll use accounts they created a long time ago against this eventuality, they'll make appeals to you and to other users, and they will stir up as much shit as possible.

I've had a person literally call me on the phone to complain that their account was banned.

Shadowbanning means that none of this happens. I check a box, and the collective community can breathe easily, and I can actually sleep. It's an incredible force multiplier.


If only game makers would build this kind of system into their anti-cheats: make the cheater think, for as long as possible, that they are ruining other people's day, only for them to find out that the last hour has been against AI opponents with canned outrage responses.


I think most game makers just put cheaters into a cheater-only pool of players.


I like this idea; let's see who has the best and biggest cheats. A comedian once suggested we keep the Olympics as we have them today, and add a no-holds-barred Cheaters' Olympics where competitors can take any substance they like, to really see how far the human body can be pushed.

I wonder how long until we have to have a regular and an augmented Olympics.


That sounds like Quake. It isn't quite, but Quake is the closest thing.

Quake's metagame has evolved around extreme dexterity, complicated scripting, and an obsessive desire to use every trick in the book to push yourself beyond the game's intended limits. This is why things like the bunnyhop, the rocketjump, and wallrunning are not only accepted, but expected.


I don't know of any game makers that do this.

They probably should, like how Dota 2 will put players who frequently abandon games into matches with other leavers, but most popular games instead go through "ban waves" to get rid of groups of cheaters at once (Valve has done this many times with CS:GO, for example).
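The leaver-pool idea amounts to a simple matchmaking partition. A purely illustrative sketch (not any game's real matchmaking code):

```python
# Illustrative sketch of a "cheater/leaver pool": flagged accounts are
# only ever matched against other flagged accounts, so they keep playing
# but stop affecting the regular player population.
def split_queue(queue, flagged):
    """Partition a matchmaking queue into a clean pool and a penalty pool."""
    clean = [p for p in queue if p not in flagged]
    penalty = [p for p in queue if p in flagged]
    return clean, penalty

queue = ["ann", "bob", "aimbot99", "cal", "wallhax"]
flagged = {"aimbot99", "wallhax"}
```

Matches are then formed only within each pool, which is what makes the punishment invisible to the flagged players.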


> I don't know of any game makers that do this.

Well, apparently Rockstar did this for Max Payne 3 in 2012, and apparently it also applies to GTA V:

http://www.rockstargames.com/newswire/article/35441/taking-a...

http://www.gta5tv.com/gta-v-cheaters-pool-details/

http://kotaku.com/gta-player-says-he-hired-cheater-to-rescue...


Huh, I had no idea GTA5 did that. Thanks for the info, I really would love to see more game developers implement something like this.


Some Minecraft servers used to do this. The "griefer" would see blocks being destroyed and placed, but their actions would have no effect on the server. Eventually they would get bored and leave on their own. It was hilarious, too.

I think current anti-cheat tools suck. Cheaters are usually obvious, sometimes blatantly flying around or performing inhuman actions. Simple machine learning should be able to detect cheats with high accuracy.
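As a toy illustration of that point, a plain statistical outlier check already separates blatantly inhuman stats from human ones. This is nowhere near a real anti-cheat, and the data and names are made up:

```python
# Toy outlier detector: flag players whose headshot rate sits far above
# the population mean. Real anti-cheat systems are vastly more involved;
# this only shows that inhuman stats stand out statistically.
from statistics import mean, stdev

def flag_outliers(stats, threshold=3.0):
    """Flag entries more than `threshold` standard deviations above the mean."""
    values = list(stats.values())
    mu, sigma = mean(values), stdev(values)
    return {player for player, v in stats.items() if v > mu + threshold * sigma}

# Twenty plausibly human players plus one with an inhuman headshot rate.
rates = {f"player{i}": 0.18 if i % 2 else 0.22 for i in range(20)}
rates["blatant_cheater"] = 0.99
```

Of course, exceptionally good legitimate players also live in the tail of such distributions, which is exactly the arms-race problem with this approach.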


Yes, but then we get an arms race, and that ends with all of the best non-cheating humans getting banned as well.


The admins of Reddit have said over and over again that shadowbanning is for spammers, and if non-spammers are getting shadowbanned, it's due to a bug. It was never and is not currently used to silence humans, only robots.


right... /sarcasm


Please follow HN rules when commenting.


I fail to see where you find issue with my comment in reference to the rules.

If you thought I was making a contentless post, sure, I can fix that: plenty of people who weren't spammers have been shadowbanned. Just look it up.


Anyone who is currently shadowbanned and not as a result of bot-spamming can request a removal of the shadowban from the admins and they will happily comply.

The fact that people have been shadowbanned is not evidence that it was intentional.


Okay, so you think I'm wrong. That's fine. But I didn't violate HN rules. I read them twice, just to check.

And while I freely admit that I might be wrong, in this case, it doesn't actually matter. Even if shadowbans are only given to spammers, that's not what the reddit community thinks is happening, leading to the current social climate on reddit.


> that's not what the reddit community thinks is happening, leading to the current social climate on reddit.

I... don't care? I guess you're making some kind of assumption that this conversation needs to know what you think the climate of Reddit's community is. I'd ask "Why" but, I care even less about that.

Shadow-banning is for spammers, full stop. I don't care if you think other people think otherwise.


>I guess you're making some kind of assumption that this conversation needs to know what you think the climate of Reddit's community is. I'd ask "Why" but, I care even less about that.

Why wouldn't I make that assumption? Go read the OP. One of the main points is that shadowbanning, or the belief in it, creates more distrust between reddit's admins and users.


I'm sorry, but the salient point here is that shadowbanning isn't what you're claiming. End of story. I don't know how many ways to make this clearer.


You've made it plenty clear. In turn, I made it clear that I was dubious of your claim, and that it didn't really matter to the point I was making.


It's not a claim, it's a fact. This isn't a matter of opinion, this is objectively true.

https://www.reddit.com/r/announcements/comments/3sbrro/accou...


Well, one thing you may have failed to notice is that this is relatively recent: less than a year ago, shadowbans were a common occurrence on Reddit. Note how it said that suspensions would replace shadowbanning.

And it's a claim until you give evidence, which until now you hadn't.


The problem is that reddit is trying to be both a neutral platform and a cohesive community at the same time. No one gets mad at phpBB when someone deploys a forum with an objectionable theme because it's understood that they had no involvement, they just provided the code for anyone and everyone to use in accordance with a FOSS license. reddit tried to do that with hosted subreddits, but it also tried to cross-pollinate and make one big reddit family and see everyone on reddit.com as "redditors". That results in a lot of infighting and resentment (/r/the_donald v. /r/enoughtrumpspam, /r/atheism v. /r/christianity, etc.), not to mention some doxxing, harassment, brigading, and invasions.

It's tense in the real world when people with diametrically opposed worldviews are forced to mingle, but it can sometimes go OK because the human aspect tempers it. Occasionally, with especially open-minded participants, a friendship can be kindled. That doesn't seem to happen at all when these people are not put in a room together, but on a message board, especially an anonymous message board.

reddit didn't know whether it wanted to be a warm fuzzy community, for which common ideals and values are foundational, or whether it wanted to be an agnostic, neutral, unfeeling platform provider. IMO that's responsible for a big part of the culture clash that we see on reddit, and it's left them in the awkward position discussed here.


/r/the_donald had the "sheriff star" still displayed several days after it hit the media.

It's such thinly veiled hate speech. The tone of the denials hints to those who are anti-semitic that this will be their platform from which to indulge in their hatred.

Even a decade ago, no large corporate forum would put up with this sort of liability. It's crazy.



