
Stop responding to it.

No seriously, just stop. This isn't a "complex problem that needs nuanced technical and legislative solutions".

Back in the old days, there was this saying: "Do not feed the trolls". Sadly, we've forgotten that.

Our current approach is to "create rigorous 30-minute, point-by-point takedown videos to defeat their point of view". Our urge to debate and correct people who are wrong just fuels them to make more content. A troll needs reactions to survive. Just downvote and move along.



We found a lot of lies during the UK election last month. Not differences in opinion, not beliefs about what may or will happen, but easily proven lies, posted across community forums, copied and pasted to other ones, and repeated in an increasing crescendo.

People then believe those lies, they repeat them, and even if they don't, those lies sink into their subconscious and change their behavior, not necessarily today or tomorrow, but for the next 30 years.

Ignoring them doesn't fix the problem.


> Ignoring them doesn't fix the problem.

"The problem". What problem are we talking about again here? Is the problem that people are believing wrong things, or the spread of polarization and toxicity?

I don't care if people believe "wrong things". How's that old song go? "All lies and jests, still a man hears what he wants to hear, and disregards the rest"? People will always believe whatever they want to, regardless of "evidence", regardless of history, regardless of persuasion. "Post-truth" society is such a ludicrous term. There was never a "truth society" to begin with. Never. If someone doesn't want to change their mind, you're tilting at windmills. Doubly so if this person is an anonymous account.

What I care about is stopping the spread of toxic and divisive stereotypes.

Look at the front page of Reddit. Every other post is a screenshot of an outrageous anonymous tweet, or a picture of a receipt with a nasty note on it, or a text message conversation. Look at how the public receives this: "Who cares if it's real or fake? The point is that it reinforces my stereotype of 'them'. We know 'they' exist, and we must do something about 'them'."

That's what's driving polarization and toxicity. The urge to respond with outrage to anonymous trolls. This is a problem we can fix by downvoting and moving on.


> I don't care if people believe "wrong things".

But those beliefs lead to actions, which have negative consequences, which can affect us all.


Indeed, but attacking that with anger, outrage, and witch hunts rather than tact is not only ineffective, it may even create a positive feedback loop.


Actually, it is rather effective. Your immune system doesn't rely on tact once a particular foreign body proves to be hostile or dangerous; rather, it initiates inflammation and countermeasures.


We must be careful when using microbiological metaphors to propose solutions for macro social problems; such language has often been used to justify genocide (e.g., "hygienic cleansing of parasites"). Besides, what we're talking about here is more akin to autoimmune dysfunction, and not healthy immune function.


> Besides, what we're talking about here is more akin to autoimmune dysfunction, and not healthy immune function.

No it isn't. Responding to toxicity or an external insult is precisely what the immune system does. An autoimmune disorder would be an activation of the immune system against some other bodily system that is healthy and functioning.


An autoimmune disease is a condition arising from an abnormal immune response to a normal body part. People having different opinions is normal.


People having different opinions is normal, but not all opinions are thereby valid. Promoting genocide or child abuse has a deleterious effect on people in the respective target groups, so we sanction such opinions because of the demonstrated harms that result from their implementation.


The problem is that how people define and extrapolate what constitutes promoting genocide or child abuse is often erroneous.

For instance: some people consider anti-vax sentiment to be child abuse, but don't consider cross-sex hormone injections for young children to be child abuse (and vice versa).

Another example is how many consider the swastika to be a symbol of genocide, but give a free pass to the hammer and sickle -- even though both of these symbols' ideologies have been similarly genocidal.


Honestly that's somewhat nonsense. People don't just believe whatever they want to believe, education and diffusion of information has radically changed humanity over the last few centuries and you're fooling yourself if you think that somehow improved communication and discussions didn't drive that. Schools have improved reading and mathematical literacy of the general population, that isn't just some made up statistic.

Humans are incredibly social beings; they hide in each other, as well as grow in each other. People absolutely can and do change their minds on all sorts of things every single day, and it's a "throw my hands up in the air and quit" attitude to say otherwise.


Yes... and also no.

Yes, people change their minds. But the more passionately they care about something, the less likely they are to change their minds about it. On the big questions that people care deeply about (politics, religion, vi vs. emacs), evidence doesn't seem to change peoples' ideas nearly as much as it should.


> I don't care if people believe "wrong things".

If you are an ethnic minority in a society where xenophobic nationalists believe you should be killed, do you think you'd care if people believed those "wrong things"?


You might find that the people you see posting and repeating these lies see the things that you post and repeat as lies.


They frequently do see things in that way - and they're wrong. In fact, they're worse than wrong, because many people see the dueling assertions of fact and give up on knowing the truth of a situation altogether. This is a classic disinformation technique.

We do have the ability to determine truth from fiction, it just takes more work than cynically throwing up your hands and smugly concluding all sides are lying.


They're not necessarily wrong. For example, one of the big anti-Brexit talking points beloved of both Corbyn and social media users before the election was that a trade deal with the US would mean maggots in orange juice. This also, as I recall, became part of the narrative about Brexit supporters being misled, with the line being that they were sold all those false dreams and would get maggots instead. It was a complete and utter lie: https://fullfact.org/health/maggot-orange-juice-USA/

And I really do mean a complete lie. This wasn't a dispute over creative redefinitions of terms like Boris' NHS funding claims, or some nitpick about a minor detail. Corbyn and a worryingly large chunk of the press took an example of US food safety regulations being unambiguously stricter than EU ones - of them placing all the same restrictions on food manufacturers that the EU does, plus more - and falsely claimed their regulations were weaker instead, using graphic, emotive, memorable language, and everyone believed and repeated this total inversion of the truth.

The thing that gets me is that on some level, people must've known that US food isn't really unsafe. There's not some big movement of people who refuse to eat US food - by and large, everyone trusts food there as much as they would food here. The other thing is that despite this pretty much the entire press dropped the ball on this lie. (It is not even close to the only prominent anti-Brexit or anti-Tory lie they left unchallenged or outright made themselves.) Even Full Fact did initially - they actually helped spread the false claim themselves, and only came back to it a month or so later after it had spread across the internet multiple times and been used by Corbyn repeatedly in his campaigning.


The desire to "correct" other people who are wrong is the fundamental impulse of authoritarianism.


> The desire to "correct" other people who are wrong is the fundamental impulse of authoritarianism.

Maybe, but it's also the impulse of people who rightly care about the future of our shared world.


> Maybe, but it's also the impulse of people who rightly care about the future of our shared world.

Literally no authoritarian says "I want to suppress people because I'm evil mwahaha." Every single authoritarian says "I need to suppress this evil disinformation because I care about the future of our shared world---unlike you unworthy people who presumably don't care."


Rightly according to your subjective experience.


So... would you consider that an argument against the sanctions levied against, for example, doctors who lie and fake data to stoke fears about vaccines and autism? What about fighting against conspiracy theories about the deep state and pizza parlors?

There's a tendency to wishfully think that people are rational and should be expected to sort out their own information environment with no governmental influence whatsoever, but I don't buy it at all. We're predictably manipulable, emotional creatures, and we can decide rationally to improve our information environment without falling into your implicit slippery slope.


> would you consider that an argument against the sanctions levied against, for example, doctors who lie and fake data to stoke fears about vaccines and autism?

Yes. I never said authoritarianism is bad. In fact, I probably consider myself an authoritarian.


Furthermore, it is rationalism which exists as the first non-religious justification of hierarchy. Read Plato.


"People are lying" - "They'd say that you are lying" - "BUT I AM RIGHT".

That's not rationalism, that's just saying "I'm right, they are wrong, end of discussion".


That's because this conversation is occurring in the abstract - surely you can think of examples which don't fall into your supposed pattern (flat earth, pizza parlors, etc.)


But those are super edge cases, the vast super majority of things aren't clear cut at all. "Is that policy going to increase employment? Will it depress wages, and by how much?"

Good luck with judging who's right and who's wrong.


These "super edge cases" are happening monthly. THAT's the problem. The examples given weren't a historical review, they're a sampling of what happened in 2019! If these were in any way rare or outside the norm, I'd feel safe ignoring them, but even in cases of absolute truth and fact (jade in your vagina does not cure anything) we get a netflix series.


> These "super edge cases" are happening monthly.

Yes, in a world with literally billions of people, you have a few millions who believe outlandish things. If that was the extent of the problem with polarization/toxicity on the internet (or public debate in general), I doubt we'd talk about it, and I'd be very happy with the state of the world.

Those people have no large base, no stable membership, no money, no power. Focusing on them is like decrying the fall of science because 6yo Timmy still believes in Santa Claus.


You say "millions of people believe foolish things", and then completely disregard exactly how many there are and how concentrated they become. I disagree with the assertions at the end that "a few million" do not constitute a large following, and allow me to provide a few counterexamples:

* The Flat Earth Society has a very stable membership and Patreon. Mark Sargent's YouTube channel alone has 58k subscribers. Social media influence is the source of money, and a power all on its own.

* Gwyneth Paltrow's pseudoscience has a Facebook group with 500k members. She has a Netflix series and a reliable income from her online storefront. The Facebook group came first, then the Netflix series.

* QAnon is a persistent conspiracy theory with no basis in fact. Regardless, tripcodes (a public hash of the password used for identity verification on 4chan; see the sketch below) denote a persistent online identity, so he's got a following... and the following is what causes power.
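For readers unfamiliar with the mechanism mentioned in that last point, here is a minimal sketch of the general idea behind a tripcode-style identity: a short public tag derived by hashing a private secret. This is illustrative only; it is not 4chan's actual algorithm, and the function name and hash choice are assumptions for the example.

    import hashlib

    def pseudo_tripcode(secret: str, length: int = 10) -> str:
        # Hypothetical helper: derive a short, stable public tag from a
        # private secret. Illustrative only; real 4chan tripcodes use a
        # different scheme, but the principle is the same: the same
        # secret always yields the same public identifier.
        digest = hashlib.sha256(secret.encode("utf-8")).hexdigest()
        return "!" + digest[:length]

    # The same secret produces the same tag every time, so readers can
    # link follow-up posts to one pseudonymous author without ever
    # learning the secret itself.
    print(pseudo_tripcode("example-secret"))
    print(pseudo_tripcode("example-secret"))  # identical output

That persistent-but-anonymous tag is what lets a figure like "Q" accumulate a following without ever revealing an identity.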

Power in its purest form is asking someone for something and getting it. This looks different in the modern age than it did previously, but saying that celebrities don't have power belies the entire concept. These are celebrities, either advocating obviously false things or famous because of their advocacy of obviously false things, and millions of people are taken in.

In contrast, the expected Iowa caucus turnout numbers are going to be around 60,000. Or, in other words: there are more people believing in flat earth than there are Democrats caucusing in Iowa. How in the world is this not a problem?


> Or, in other words: there are more people believing in flat earth than there are Democrats caucusing in Iowa. How in the world is this not a problem?

I mean, isn't the answer already in these sentences? The world vs Iowa.

It's not that I don't believe pseudoscience and cults are a problem, it's just that they are a small problem in the grand scheme of things. Increasing polarization of society at large is a problem on a different scale. It's something that has very tangible effects for most people; some guy believing that the earth is flat and having 60k people watch his videos really doesn't.


9-11 was a super edge case. Yes, absolutely it's difficult to make firm judgements about many issues, but we should still pay attention to edge cases, because the fact that people out at the edges of discourse seem nuts doesn't mean they're not serious or motivated or capable.


Sure, sure, but "serious, motivated and capable" still doesn't give them leverage. You will always have individuals committing terrible acts; there will always be the next school shooter or terrorist, but those are, while tragic, small events, and if you didn't turn on the TV, you usually wouldn't notice them if you lived a few hundred miles away. Change the policies of a nation and you'll have a much larger effect that can be felt everywhere within its borders.

So sure, paying attention to the edge cases is fine, but focus most of your attention on the big issues.


Changing policies is the point of terrorism. You seem to be assuming it's randomized and atomized rather than itself being networked and (loosely) coordinated.


also puritanism


No, the desire to correct other people who are wrong is the fundamental impulse of pedantry.

The desire to compel others to conform to one’s ideal of correctness is the fundamental impulse of authoritarianism.

...but also libertarianism, for a particular aspect of correctness.


Huh? Can you explain?

My understanding is that the core value / principle of libertarianism is individual liberty.

How do you derive compelling others from that?


Yeah, I agreed until his comment went off the rails there. I have noticed a movement to try to brand libertarians as 'crypto-Republicans' recently, for whatever reason. It's okay, and rational, to not think only along 2 (or 3 or 5) party lines.


> My understanding is that the core value / principle of libertarianism is individual liberty.

Yes, and libertarians tend to hold that it is right and proper (and even often define “violence” to exclude this use of force) to use any degree of force necessary to get others to observe the individual liberties libertarians see as essential.


> We do have the ability to determine truth from fiction, it just takes more work than cynically throwing up your hands and smugly concluding all sides are lying.

It also takes more work than "But I'm right!"

If you've done the research, if you know that you're correct, it's one thing. (Even then, not everyone who is incorrect is lying.) But it's a real temptation to say "I'm right, they're wrong, they're lying" when you haven't actually done the work to be sure you're right.

I am prone to that temptation. You almost certainly are, too. So be careful. You can be the one in the wrong.


I don't disagree with anything you've said, but I often find myself in situations where I have gone out and tried carefully to understand what's true. When you've done the research, it's very frustrating to be confronted with exactly what you're describing.


Sure. But even then, knowing that in other circumstances it can be you behaving that way can give a bit more empathy for the other person. That doesn't make it less frustrating, but it may help you handle it a bit more gracefully.


I definitely wasn't cynically throwing up my hands and smugly concluding all sides are lying. All I intended to do was demonstrate how cheap it is to call someone a liar on the internet. It costs nothing to the reputations of you or your "lying friend" to call one another a liar, and that's why it doesn't mean anything. The sooner people realize this the better off we'll be.


> I definitely wasn't cynically throwing up my hands and smugly concluding all sides are lying.

Apologies, I didn't mean to insinuate that you were. My point was that this is the usual (and often intended) consequence of your argument, as I understood it and have seen it in the world.

Calling someone a liar doesn't help by itself, but if we can bring clear and irrefutable evidence to a discussion, I think we can reasonably reject outright incorrect views.


My prediction is that this is what will, in the end, change; technology will advance far enough to severely de-anonymize online interaction, at which point one's reputation again comes into play and people will find there are consequences for thoughtless shitposting.

It has begun to happen, with phenomena such as people being fired for Facebook or Twitter posts. It doesn't seem impossible to tie a sufficiently advanced pattern-recognition network to the firehoses of data on YouTube, Facebook, Twitter, et al. to automatically tie together the scraps of personal info people accidentally leave behind and severely compromise online anonymity.

Be interesting if it happens.


Economically, I think this will be huge. Reputation-based economies are vastly more efficient than regulatory ones. There's no amount of regulatory compliance that can tell consumers the information they want in the way a 2-star average Uber driver review does.

On the social side I can't even imagine how destructive it could be. People (especially on HN) love to lament how restrictive and oppressing the "everyone knows everyone" small town life can be. So now that we have the anonymity online we can escape to, we've all turned around and started building the biggest "small town" imaginable! One with no escape.


Exactly. Mostly because we've discovered that the small town is oppressive, but the Wild West is full of gangsters. ;)


> technology will advance far enough to severely de-anonymize online interaction, at which point one's reputation again comes into play and people will find there are consequences for thoughtless shitposting.

This is naive. Facebook has just as many extremist bigots as any other anonymous platform.


[citation needed]


The biggest problem from the last 10 years is the spread of "everyone's entitled to an opinion". It seems that objective truth no longer exists in the eyes of the population.

This is both a cause and a consequence of the defunding of journalism and the change of the profession from news to views.


Erm... defunding journalism by who?

Private funding is making the situation worse.

Public funding is the instrument of authoritarian regimes: state-sponsored media.

I think the root cause is lack of objective, critical thinking skills, and the glorification of "how it makes you feel" being more important than "is this reasonable".

I.e. emotion driving reason, rather than reason driving emotion.


By society. We don't want to fund people to separate fact from fiction, yet we lack the time and ability as a population to do it ourselves, so we latch onto outlets (especially celebrities) that seem to match our world view, and we treat what they say as gospel.


I think the vast majority of humans are simply disinterested in objective truth, except to the extent it's actionable in our day-to-day lives; otherwise, our sense-making is driven largely by the utility of social signaling: https://meltingasphalt.com/crony-beliefs/

See also: Donald Hoffman's experiments in truth-maximizing vs. fitness-maximizing: https://www.edge.org/response-detail/25450


Isn't everyone entitled to an opinion though? What you're hitting on, I think, is that the so-called "toxicity" of the internet is just broad disintermediation. Previously, a few people controlled the public narrative. Now, everyone does. Turns out most people don't very much like the worldview that the media elite pushes.

Journalism is dying because it's become not only useless as a means of information dissemination --- new media is better at that --- but also because journalists have become contemptuous of the public. They have only themselves to blame.


That's not a novel phrase, and it's also at least partially a peace treaty between worldviews with fundamental differences.


And if it's an opinion on things like "should it be legal to have sex before marriage" or whatever, that's fine.

The problem comes when you have 5 civil engineers saying a bridge is not safe, and 50 random members of the public saying it's safe. If we treat those opinions as equal, we end up with a bridge collapsing.


And one of the sides has to be right. Sometimes the facts are difficult to ascertain, but so many of the lies spread via social media are easily disproved by consulting primary sources.


But the thing is, while facts are objectively true or false, a policy choice can't be objectively right. What you often see is, policy masquerading as facts.

"If you accept X then we need to do Y."

"I don't think we should do Y"

"Then you reject facts."

Something everyone seems to have forgotten is that intelligent, well-informed, people of good will can look at the same facts and come up with different policy prescriptions.


As I've gotten older, I've started to doubt the idea that there are any objective facts at all, or at least if there are, the human brain has a limited capacity to comprehend and communicate them.

(Edit) This doesn't mean I don't believe in truth, right/wrong etc... it means that I'm constantly balancing what's most likely to be trueish - subject to higher quality information at a later time.


> a policy choice can't be objectively right

A policy choice is the linkage of a fact to a particular goal; while few policy choices are so simple as to admit of a binary choice, you can certainly rank them on a gradient.

Of course, it helps if your goal is clearly definable and you maintain awareness of consistency. Otherwise, a goal of, say, improving life expectancy might be satisfied by a eugenics policy which made unpersons of those with medical conditions that would lower life expectancy.


But a goal cannot objectively be right or wrong. It could be agreed upon, but it can never be true in the sense that objective facts are true.

To go even further, people may agree on the 'what' of a goal, but disagree on the 'why' of a goal, which very much informs what policy choices they are amenable to.


Perhaps the following is obvious, but other possibilities may exist:

- both sides are wrong

- the participants are unwittingly talking past each other

Etc.


Even more than this: the desire to silence and destroy those who disagree, sometimes physically.

Some have called this increased polarization.

I see it as a slide towards violent authoritarianism.


> but so many of the lies spread via social media are easily disproved by consulting primary sources.

I wish this were true. I used to post snopes links and primary sources to Baby Boomer posts on Facebook, but it's hopeless. They either don't trust the fact-check, can rationalize it away, or just don't care. One of the most shocking realizations of my adult life has been learning that a very large portion of my otherwise high-functioning friends will believe anything, no matter how crazy or self-contradictory, if it reinforces their sense of self-righteousness.


And a whole bunch of people that see the minority or unpopular opinion as more valid because of it.


Why exactly should they trust Snopes in particular, as opposed to Washington Post, Fox News, RT, the North Korean news agency, etc.


Oh, no doubt. Easily disproving something is very different than convincing someone that it's disproven.


> And one of the sides has to be right

Not necessarily. In the US one of our presidents taught us a long time ago that "both may be" wrong, and "one must be" wrong.


At the root, there are people who knowingly lie. Hitler knowingly lied. Stalin knowingly lied. Many, many people lie while being fully aware they lie.

Hell, the average manager lies.


The problem with lies in politics, for me, is that every politician needs to tell them to stay competitive. Also, every party has politicians with different objectives, which makes things even harder. And while a good leader should be well compensated, people are envious of politicians who make a lot of money legally, which makes corruption necessary for politicians.

The only improvement I see is decentralization, which is a very slow but powerful process. The Gutenberg printing press showed that it is possible, and Bitcoin is doing the same thing right now, but I think superior technology for organizing people and resources is the only solution.


I don't know if ignoring lies is necessarily a problem. Because what are the realistic chances that anyone responding to non-factual information is responding with factual information? Probably pretty low on the net, if we're being honest.

That said, yes, we absolutely cannot ignore the toxicity in its entirety. Ignoring it entirely is how little old ladies at church bible studies wind up dead. There are certainly classes of toxicity that it's just military sense not to ignore. Religious and ethnic extremism, etc. Basically anything that is going to cause issues with physical violence. To my mind, violence is the line.


I have trouble seeing responding as much of a fix, although it depends a bit on the platform. On Reddit, Facebook, and the like, there is a pattern where you have highly upvoted/liked posts, where the top comment is about how the thing is obviously wrong.

Most people just glance at headlines, image, or whatever, and don't read articles or examine comments. Those comments about it being wrong only help inform the subset of people who bother to look at comments.


I think it's a fallacy at this point to assume the people pushing toxic or objectively wrong viewpoints are trolls. If it ever was 100% trolls, that time has passed, and their target audience is expanding their work in earnest.

We now have people who 100% believe objectively wrong things and have an obsession to spread their belief as fact.

Edit: If there is a solution to this issue, it will depend on the ratio of these people who are willing to change their mind. A real solution might involve a procedure to move people from the more stubborn camp to a more open minded camp.


When someone on the street corner tells you the world is ending, you ignore them. You don't need a complex technical or social solution.

When someone tells me that xyz plant oil cures cancer or the earth is flat... I ignore them. If I want to be a troll, I may play along or challenge them if it's fun.

That's the internet. There have always been nutters; they just found other nutters to talk with, and you get a loud feedback loop. You can happily mute them still.


I don't think ignoring the nutters is viable. Enough people are believing anti-vax lies to start bringing back deadly diseases which affect more than just themselves. Enough people are believing climate change denial propaganda, which is slowly harming the habitability of our planet for the human race.

What was once contained on the internet is leaking out onto the streets and is already affecting you and me in many ways.


Propaganda has always been there.

During the Golden Age of broadcast TV and radio (pre-internet), the big networks broadcast plenty of unchallenged lies. The difference now is that laymen can fact-check the content put out by the big players. But the price for this is that every troll or nutter has access to some of the same tools to spread their own lunacy.

I far prefer the current situation over the previous. While it may have felt more comfortable when people believed that news readers were telling the truth or reporting truth, it never was that way.


Trolls (either human or bots) are way more dangerous than TV propaganda because they act as legitimate actors in the public discourse.

It gives them the ability to hide among other people and manipulate the conversation.


It's interesting that you mention climate change in this, I agree that it's ground zero for demonstrating the phenomenon of expressing things as objective truth, or fact, that are much more complex than a simple binary.


> When someone tells me that xyz plant oil cures cancer... I ignore them.

What happens when you see that same person talking to a cancer patient trying to sell them plant oil?


'The world as we know it is ending, and people like crmrc114 are to blame. There s/he goes right now, messing up the world as we know it! You know what to do, people.'

I added "as we know it" because it's actually quite easy to persuade people that what they like about the world is under dire threat, as opposed to the objective existence of the planet or life thereon. People's individual worlds tend to be quite small, and it's easy and socially/politically profitable to market to them with threats rather than inducements to expansion, with all the unpredictability that entails.

Now, imagine that you've been identified to or by an angry group as the cause of their dissatisfaction. This is a very different dynamic from 'someone on a street corner' that you can usually safely ignore, and against whom you probably feel you could defend yourself if necessary. I don't think you'd be so dismissive of nutters in groups if you were negatively impacted.


What happens when your child, sibling, friend, parent, or lover is saying those things that they got from the internet?


The problem is when people with such beliefs acquire positions of power or influence, which sadly has already happened.


When you ignore them you tacitly endorse their position. Their propaganda is being broadcast uncontested, there will be people who believe it. Those people will spread it to other people. Pretty soon whooping cough is back and people are dying.

The truth can't advocate for itself.


> We now have people who 100% believe objectively wrong things and have an obsession to spread their belief as fact.

That's not remotely a new thing. Ever since bits could be sent over the wire we've had Dale Gribbles trying to spread their weirdass conspiracies using the web. Only difference is that back then it existed in the form of homemade webpages with remarkably bad color theory rather than the poorly punctuated social media posts of today.


I think the difference is that it’s not fringe obsessives but rather everyday people spreading bad info. Stuff like QAnon, white nationalism, anti-vax, GamerGate and so on might be mostly started and kept well alive by fringe obsessives but plenty of normal people are sharing their memes, lies and content in an uncritical manner in a way that’s very different to personal websites. Social networks have made it much easier for these lies to be spread and repeated whitewashing their true origins and becoming accepted.


> We now have people who 100% believe objectively wrong things and have an obsession to spread their belief as fact.

There's usually enough truth behind every lie - enough to make it compelling enough to believe in. I'm sorry but I don't think it's as obvious as you're making it out to be. We all probably believe something that's objectively wrong. The truth is really in the middle, but all I see is people going further towards fringe opinions.


I think it's worth expanding your definitions, e.g.:

trolls: people who say absurd or mean things for lulz

fools: people who truly believe absurd or mean things because they are naive or gullible

scammers: people who say absurd or mean things for profit, and may present as fools or absurdist trolls when challenged


Part of the problem is that social platforms online are built around heavily optimising for engagement. Content that gets engagement must be better, so it gets promoted more heavily, and algorithms show you personally the content it thinks you're most likely to react to. You can't leave because it's messily wrapped up with your social connections to your friends.

Until this system dies, we won't solve this problem. It's not good enough to instruct people to stop feeding the trolls: there are huge troll-feeding dopamine farms feeding and nurturing trolls by the millions. Until we shut them down the situation will continue to get worse.
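To make the "optimising for engagement" point above concrete, here is a toy sketch of the kind of ranking such systems are often described as using. The weights, field names, and example posts are assumptions made up for illustration, not any platform's actual algorithm.

    from dataclasses import dataclass

    @dataclass
    class Post:
        title: str
        likes: int
        comments: int
        shares: int

    def engagement_score(post: Post) -> float:
        # Toy scoring rule: reactions that take more effort (comments,
        # shares) are weighted more heavily than passive likes. The
        # weights here are invented for illustration.
        return post.likes + 3 * post.comments + 5 * post.shares

    posts = [
        Post("calm explainer", likes=120, comments=4, shares=2),
        Post("outrage bait", likes=80, comments=60, shares=40),
    ]

    # The feed surfaces whatever provokes the most reaction, regardless
    # of accuracy, which is the dynamic described above.
    for p in sorted(posts, key=engagement_score, reverse=True):
        print(p.title, engagement_score(p))

Under any scoring rule like this, content that provokes replies outranks content that merely informs, which is why "don't feed the trolls" fights against the platform's own incentives.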


You don't have to leave, although I think most people overestimate how much it would hurt their social connections if they did. You just have to refuse to engage with toxic content when you see it, the same way you avoid debating with an obnoxious friend. (Newer social media platforms, unsurprisingly, are designed to make this easier; it's much easier to avoid dumb arguments on Instagram or Tiktok than Facebook or Twitter.)


It's you against a whole bunch of bright, motivated, and very smart people trying to keep you commenting, clicking, and getting involved. It's not that easy.


This is like saying "Just ignore the problem until it goes away," which is cute, but very naive. This is a systemic problem with huge economic incentives pushing things to the extreme. There is massive funding from state and political institutions amplifying and promoting extreme political views. This has come a long way from some lonely dweeb in his basement just looking to get a rise out of someone online.


Hmm, I think that in this particular case ignoring it will make it go away... because in the end, the problem is that people pay attention to it.

suppose nobody paid attention to such content in the first place, would it really be a problem in that case?

I think the tricky part is that if I ignore it, but no one else does, the problem is still there.


Suppose cancer didn't have any negative effects, then getting cancer wouldn't be a big deal.


Not feeding the trolls is good advice for one's personal emotional health, but it has never made the trolls go away. There's too much infrastructure supporting them: they know what works, and they broadcast to everybody. It only takes one response to give them the burst of dopamine they're looking for.

Responding makes it worse for you. It gives them a second chance to aggravate you, this time with more precision. Downvote and move along is excellent advice -- especially if you're on a platform that helps you filter them out in the future. But don't fool yourself into believing that they'll go away, or that people aren't listening to their misinformation and incorporating it into their belief system -- even if they know it's false.


I'm only responsible for me. These people won't change, no matter how eloquently I phrase my fact-filled take-down of their arguments. I'll sometimes argue just for the audience, but I can't fix everyone that believes something (I think is) wrong. It's not my place, anyway. People are slowly becoming more "street-wise" on the internet. This is just one of those traps people can only avoid after they've learned their lesson first-hand.


yes, not feeding the trolls is like "don't respond to spam." doesn't really solve either.


I'd agree in some cases, and disagree in others.

In terms of things like 'fake news' and media outlets peddling absolute lies, ignoring it altogether may not be the wisest move. Might in those cases be better to calmly deconstruct the lies and prove the stories are wrong.

In terms of things like personal attacks, political toxicity on Twitter, etc? Ignoring the trolls is 100% the best thing to do. The folks that send threats and mock others online want a reaction, and the more reactions they'll get, the more the person reacting will get targeted by even more of them.


"Don't feed the trolls" has been my strategy for as long as I've been on the web, but what disconcerts me the most these days is the doxing and death threats. A reporter can simply retweet a news article and instantly be flooded with death threats and see their address posted to the web. There's no way I could tell that person "don't feed the trolls". That's a real problem with real life consequences, and I have no clue how you could begin to solve it, aside from total anonymity.


"Stop responding to it."

Then they win elections.


'responding to it' is engagement, and there is a financial incentive for that.

So the algos are boosting controversy, to boost engagement, to boost financials.


Completely off topic, but I love your HN handle!


When you stop responding, AKA reacting, you are not solving the problem, you are making it worse.

In fact I would say that "don't feed the troll" is the root cause of the huge spike in trolling that we have now.

"Do not feed the troll" meant, from what I understood back then, kill it; don't let it grow somewhere else where it's protected.


Exactly. "Don't feed the troll" means report it to the moderator so they can ban them. Letting trolls spew their message unopposed is how you lose your community.


"Do not feed the trolls" is what made Kathy Sierra leave the internet. It blames victims for daring not to be silent. It is like telling a bullied kid to "avoid attackers" or "not be shy" or "do what they want".

Ultimately, it rewards trolling behavior.


What's new is the political echo chambers.

Stop feeding partisan BS. You are not Left nor Right!


Even those have been around on the internet since the dawn of the internet.


And the Roman empire.


Some family members that once said "don't believe what you read on the internet" and were resistant to the early internet are now some of the biggest sharers of hyperpartisan alarmism online.


Absolutely. I'm afraid to admit how long it took me to figure out that you shouldn't argue with people that aren't willing to change their mind.


+1. These days, if someone tries to steer me into some sort of polarized political discussion, I give my hourly rate and ask to be paid upfront.


> Back in the old days, there was this saying: "Do not feed the trolls". Sadly, we've forgotten that.

It has never worked. Put yourself in the position of someone who is being maliciously harassed by trolls and getting no help from anyone else. It's like telling a weak kid that is being bullied to just ignore it; the reality is that bullies are determined and tend to escalate rather than abandon their aggressive behavior. At best they will move on to picking on some other weaker person.


"Stop responding" is a fool's argument. The game has changed a lot since the days of your average internet troll, because the line between internet and reality has become increasingly blurred.

For example, I don't use Facebook. But my parents do. I've had to talk with them on more than one occasion because they would get caught up reading objectively wrong things spread on Facebook by bots. Thankfully my parents trust me enough that I can be an authority on those sorts of things, but it's not something I can simply ignore. If I were to let it fester, eventually it could turn into things like 'vaccines contain brainwashing bleach' or 'the government wants to take away your spleen' levels of nonsense.

At a certain point it's a good idea to stop platforming people like anti-vaxxers but it's also irresponsible to say it's possible to just ignore it.


This!!!

There was even a "emoji" of beating a troll back in the days o ubb and vbb



