Negativity Bias


While scientists have only fairly recently gotten around to studying cognitive biases, philosophers have been teaching about them for centuries, typically in the form of various logical errors. However, it is good that the scientific attention to these biases is serving to attract additional attention to them.

Every one of us is, of course, loaded down with all sorts of cognitive biases. Some scientists even claim that such biases are hard-wired into the brain, thus making them part of our actual anatomy and physiology. If so, this would seem to suggest that people might be more or less biased depending on the specifics of their hard-wiring. This would help explain some of the variation among people when it comes to being able to reason well.

While we all suffer from cognitive biases (and other biases), we do have the capacity to resist and even overcome such biases and reason in a more objective manner. As this takes effort and training (as well as the will to think critically), it is not very common for folks to try to overcome these biases. Hence, bad reasoning tends to dominate.

One standard bias is known as negativity bias. While some people are more prone to focus on the negative than others, apparently we all have an inbuilt tendency to give more weight to negative information relative to positive information. This would help to account for the fact that people tend to consider a single misdeed to outweigh a large number of good deeds.

Of course, people do also have other biases that can lead them to weigh the positive more than the negative. For example, people tend to ignore or downplay negative aspects of people, causes, and things they like and weigh the positive more heavily. This often involves embracing inconsistency by applying different standards relative to what one likes or dislikes (see, for example, how Fox News and MSNBC in the States evaluate various political matters).

Interestingly, this bias seems to occur at the neurological level. The brain actually shows more neural activity when it is reacting to negative information than when reacting to positive information. Assuming these results apply generally, we are actually hard-wired for negativity.

The defense against this involves being aware of this bias and exhibiting even greater caution in assessing negative information, especially when it involves negative information about something we do not like. For example, folks who dislike the Tea Party will weigh negative information about them more heavily than positive evidence and will tend to make little effort to determine whether the evidence has been properly assessed. The same holds true for folks who dislike the Occupy Wall Street movement and its spin-offs. They will take any negative evidence as being quite significant and ignore or undervalue positive evidence.

This bias does help explain a great deal about how people see political events and assess them.


22 Comments.

  1. Good..concise! Now if we could just get more people to understand and apply so we can stop taking what the news and political pundits say as gospel!

  2. s. wallerstein (amos)

    This negative bias has a pay-off: we can feel morally superior or simply superior to others when we focus on the negative side of their character or their policies.

    Thus, negative bias serves to boost our self-esteem.

    Think of how malicious gossip functions, for example. Some people spend most of their day gossiping about the negative aspects of others and that allows them to feel great about themselves.

  3. “While scientists have only fairly recently gotten around to studying cogitative [sic] biases, philosophers have been teaching about them for centuries[—]typically in the form of various logical errors.” The second part of this statement is not true.

    First of all, cognitive (not “cogitative”) biases and logical errors are entities of different kinds. A logical error is an act—something that one commits—while a cognitive bias is a causal principle that manifests itself in one’s cognitive acts (one’s judgments).

    Second, there is no one-one or even many-one correlation between logical errors and cognitive biases. From the mere fact that someone commits such and such a logical fallacy one cannot conclude that it was due to such and such a cognitive bias, nor from the fact that someone’s judgment shows such and such a cognitive bias can one infer that it shows this or that logical fallacy. The grounds for attributing cognitive biases and the grounds for identifying logical fallacies are entirely distinct.

    I might be fair to say that psychological research on cognitive biases could never have begun had not philosophers done research on logical fallacies; for the identification of cognitive biases requires the ability to identify errors in reasoning. But the psychological research is novel and is not simply a reinvention of some logical or philosophical wheel.

  4. ^ Typing error: last paragraph should begin “It might be fair.”

  5. If A has the cognitive bias of wishful thinking then her reasoning will (non-explicitly) run along the lines of

    I want P to be true.
    Therefore, P is true

and, displayed thus, the logical fallacy of the same name is shown to have been committed.

    A cognitive bias simply consists in the tendency to commit a given fallacy, and showing the tendency to commit said fallacy shows you have the bias.

  6. Jim, the argument scheme that you present cannot plausibly represent any instance of wishful thinking, as it exposes the game that must be concealed for wishful thinking to take place. Even Gary Curtis, who classifies wishful thinking as a fallacy in his “Fallacy Files” website and uses the same scheme as you do (perhaps that is where you got it?), makes this comment:

    Of course, this type of thinking seldom ["seldom" seems to me an understatement--M. R.] takes the explicit form of an argument from a premiss about one’s belief to the conclusion that one’s wish is true. Such bald wishful thinking would be patently fallacious even to the wishful thinker. Rather, wishful thinking usually takes the form of a bias towards the belief in P, which leads to the overestimating of the weight of evidence in favor of P, as well as the underestimating of the weight against.

    Logical fallacies are represented in argument schemes; wishful thinking cannot be so represented, because no characteristic scheme of argument is associated with it. It functions precisely by not manifesting itself in any characteristic scheme of reasoning. It operates upon the content of reasoning and judgment, not in it. It has a psychological profile, not a logical one, and, so far as I am aware, is not among the logical errors that have been taught for centuries by philosophers, which are the concern of the claim of Mr. LaBossiere’s that I am contesting.

    You say that a cognitive bias is simply the tendency to commit a particular logical fallacy. What logical fallacy, then, is, say, anchoring a tendency to commit? Or attentional bias? Or the backfire effect? I am simply taking items from the list in the Wikipedia article “List of Cognitive Biases.” I don’t doubt that one could press these biases into the mold of an argument scheme; but the results will be ridiculous. Consider anchoring, for instance–”the common human tendency to rely too heavily, or ‘anchor,’ on [sic] one trait or piece of information when making decisions.” Are we to represent that bias as a tendency to reason as follows?

    (1) Piece of information A, which is pertinent to my decision about B, is present in my mind.

(2) Therefore, I will make my decision about B on the basis of excessive reliance on A.

    I don’t think so.

  7. No, I want what I said about wishful thinking to be true and therefore it is.

  8. :D ha ha. I think the main point is that philosophers and psychologists are answering different questions even when they are looking at a common subject matter, namely the ways in which human thinking goes wrong.

    The philosopher, as practical logician (on the model of the Aristotle of the Sophistical Refutations), wants to identify the tricks of reasoning that lead us to unwarranted conclusions in argumentation–the attempt by reasoning to persuade an audience of some conclusion. This requires that the reasoning be made explicit in words. The psychologist wants to identify the natural tendencies that make human beings susceptible to faulty judgment, with or without explicit argumentation.

    So a philosopher might ask of some bad argument, “What makes this a bad argument, and by what devices does the speaker manage to make it look like a good argument?” while a psychologist might ask of the same argument, “What general tendency makes us susceptible to this bad argument?” However, it is not likely that a psychologist will take an argument as the starting-point of investigation. He or she will more likely look at how people answer certain sorts of question or make certain decisions, and (under the “heuristics and biases” program) try to identify the patterns of faulty reasoning and judgment that they show.

    I thank you for your challenge to my initial comment, as I thank Michael LaBossiere for his article, as they have compelled me to think these issues through.

  9. Thanks Miles, you are, of course, quite right – I was talking gibberish.

    x desires p to be true, therefore x believes p to be true

There’s something that seems to be, loosely speaking, ‘invalid’ about this form of belief-formation, but it’s nothing to do with logic. How we would apply a ‘logic’ to propositional attitudes I don’t know.

It is interesting how some of us are biased towards cats or dogs.

And it’s good that most of us are biased towards the negative. It makes us cautious, especially in business. Nothing like a pessimistic optimist.

They’ve also done psychological studies on ‘belief bias’. Subjects are specifically asked to identify logically invalid arguments, but they exhibit belief bias by rejecting valid arguments with conclusions they believe to be false, and endorsing invalid arguments with conclusions they agree with. Presumably this is a phenomenon Mike must come across.

  12. Dave,

    Like you, I believe it would be great if people were more critical of the pundits and the folks on the news.

  13. S.wallerstein,

    In an interesting coincidence, I was just teaching Hobbes in my ethics class last night. Hobbes says (as you note) that people delight in speaking of the faults of others so as to feel better about themselves.

  14. Miles,

Logical fallacies can, as you note, be represented in argument schemes. However, a person committing a fallacy need not reflect on the scheme of the fallacy being committed. The scheme, rather, “formalizes” the error. Also, people do often follow fallacious forms quite explicitly (for example, when committing a post hoc fallacy) without being aware of their error. They know what they are concluding and from what evidence, but do not know they are making a mistake. As far as wishful thinking goes, I regularly see students explicitly commit the fallacy. For example, I have heard students say “I know I passed this test; my parents would kill me if I failed it!” The student “knows” what s/he is saying and is engaged in an inference; s/he just does not know it is an error.

    Some biases do not have (as of yet) associated fallacies. After all, biases can also be looked at in the context of various non-argumentative rhetorical devices. I am not claiming that all biases are just renamed fallacies.

Now, if your main concern is that I seem to have implied that psychologists are not doing anything new in regards to biases, then I apologize for creating that impression. Some of the studies of cognitive bias have been helpful, and it is useful to gain a more precise understanding of how our mental processes tick away.

  15. Jim,

I do see that quite often. Based on follow-up conversations, it often seems to come from the fact that people fail to distinguish between the quality of the reasoning and whether or not they think the claims are true. Students often think that valid means “all true premises and a true conclusion.”

  16. Jim,

    Your comments got me thinking about Hume. When discussing the problem of the external world, Hume says (roughly) that it is vain to ask whether bodies exist or not. Rather, what is to be asked is why we believe this. This in turn got me thinking more about fallacies and cognition. Consider the following example:

    Biff: “Those occupiers are nothing but filthy, violent hippies. So, all that crap they say about corporations having too much influence is bull!”

Looked at one way, Biff seems to be obviously committing an ad hominem. That is, he concludes the occupiers are wrong because he does not like them.

Looked at another way, it could be said that Biff simply believes that the occupiers are wrong because he also believes that they are filthy, violent hippies. That is, a psychological causal explanation is given of Biff’s belief (“Why does Biff believe what he does?”). Biff is, presumably, not explicitly aware that he is committing the fallacy, in that he did not consciously think “I dislike these people, therefore what these people claim is false.”

    However, I would say that Biff has still committed the fallacy, even though he might be ignorant of the actual pattern of his reasoning. That is, when his line of reasoning is assessed, it is, in fact, flawed.

  17. Hi Mike,

    Interesting thanks. Yes I’d been inclined to think that fallacies can be committed without people realising what the pattern of their reasoning is.

Logic does seem to concern itself with revealing bad reasoning – and showing its pattern – not just bad arguments. Maybe there’s some interesting work for you to do identifying ‘new’ fallacies by looking at cognitive biases?

  18. Jim,

    I agree with you on that point. Also, I would say that people can reason well using patterns that they are not actually aware are argument patterns. For example, my students use Modus Ponens all the time correctly and then, in my critical thinking class, often don’t recognize the pattern when I write it out on the board. I suspect people reason without reasoning about their reasoning. :)
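As a minimal illustration of the point (my sketch, not part of the original exchange), the Modus Ponens pattern that students follow without recognizing it can be written out and machine-checked in Lean:

```lean
-- Modus Ponens: given that P implies Q, and that P holds, Q follows.
example (P Q : Prop) (h : P → Q) (hp : P) : Q := h hp

-- A concrete instance of the same pattern:
-- "If it rains, the ground is wet; it rains; so the ground is wet."
example (Rains Wet : Prop) (h : Rains → Wet) (hr : Rains) : Wet := h hr
```

Both proofs are the same one-step application of the implication, which is exactly the sense in which a reasoner can use the rule correctly without ever naming it.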

I think you are on to something: I think that research into these biases will help us identify ‘new’ fallacies, perhaps by refining some existing fallacies. In any case, I am glad that psychologists are investigating this matter, if only because psychological studies get into the news in the US and we philosophers largely get ignored. :)

Yes, reasoners often won’t need to know the rules they are using as such. It’s rather like grammar, I suppose: native speakers might follow the rules impeccably but have difficulty enumerating them.

    Philosophers ignored?

    Plutarch records how, having been sent in a delegation of thinkers to Rome, “Carneades’ oratory.. gathered large and favourable audiences, and ere long filled, like a wind, all the city with the sound of it [and] impressed so strange a love upon the young men, that quitting all their pleasures and pastimes, they ran mad, as it were, after philosophy.”

    Times have changed somewhat…

  20. I think our cognitive bias for negativity comes from our instinct for survival. This comes from our past when self-survival was more of an issue, when we were more on the lookout for those things that could do us in.

    Rightly so that in the past we worried more about survival. It was a harsher world. Worrying is a negative. But worrying can have a positive effect. Worrying can be an indication that we are concerned about the future and preparing for it so that we will continue to survive. Here a negative turns into a positive. Minds do think perversely.

  21. Philofra,

I’m in complete agreement with the points you make. And I’ll go further and say that since the human race began, our negativity bias has evolved to ensure our survival and will continue to evolve into the future in our DNA, to serve the same purpose: survival.

And on the basis of this purpose, philosophers are more interesting than psychologists, but both serve a good purpose.
