For most people, their brain accounts for about 2% of their body weight. Not that much heft to it when you consider just how much that jelly-like hunk of gray-ish flesh is doing every single second. In fact, about 25% of the glucose in that chocolate bar you’re snacking on right now is going directly to your brain as fuel.

A study by the University of California-San Diego showed that the human brain is inundated with 34 gigabytes of information daily. Knowing that, it’s understandable that somewhere in there wires get crossed and errors are made when it comes to decision-making or how an event is remembered. That is where cognitive biases come into the picture.

The brain does whatever it can to process everything that gets thrown at it, but it can get overloaded. The brain’s built-in safety net? Fudge some facts in such a way that it’s not perceptible, which is why you somehow convinced yourself that having a candy bar instead of an apple while you read this was a good idea.
It was psychologist Bertram Forer who first penned the theory that people will believe that a vague but flattering description is an accurate description of themselves, without making the cognitive leap that the descriptions could apply equally well to the person next to them, and the person down the street, and the bag of flour in the cupboard. It’s properly called the Forer Effect, but mostly we call it the Barnum Effect after the famous showman and circus owner to whom we often attribute the observation that “there’s a sucker born every minute.”
Astrologers and mediums are masters at using the Barnum Effect.
There’s a story that really puts the idea into perspective. It tells of an American teacher who had a horoscope drawn up. It had actually been prepared for one specific person, but he gave a copy to everyone in his class and told each student it was theirs. Every single one of them thought the horoscope painted a strikingly accurate picture of themselves, so much so that some of the students who had previously thought horoscopes were just a big con playing upon people’s innate vanity changed their minds. Then they were told they’d all been given the same horoscope. Some took this news poorly; they’d really liked how believing in the horoscope had made them feel about themselves. Finally, they were told that the horoscope had actually been made for Edmund Kemper III, or “Big Ed,” a serial killer active in California in the 1970s who by the age of 15 had shot both his grandparents. Later, he killed and dismembered six female hitchhikers before murdering his mother and her friend. He liked to have sex with dead people.
That’s the Barnum Effect.
Source: Barnum Effect
Sunk Cost Fallacy
The Sunk Cost Fallacy describes the phenomenon where people continue along a path they know isn’t a good one simply because it’s a path they’ve already started upon. It seems like the path of least resistance, but really it’s the road to nowhere.
Economists use the term sunk cost to describe money that has been paid for something and can’t be recovered. If a business invests in new software, that software has been paid for and the money’s gone. Even if the software is crap and they wish they’d never bought it, the fact that they spent the money shouldn’t factor into the decision about whether or not they should continue to use it.
On a personal level, the sunk cost fallacy describes how we irrationally cling to things that have already cost us something. Like how if we’ve spent an hour watching a movie we don’t like, we’ll likely still spend another hour to watch the rest of the crappy flick for no other reason than “well, we’ve come this far…”
We do this because we are emotionally invested in whatever money, time, or any other resource we have spent so far. The trick to stopping ourselves from making poor decisions based on sunk costs is to recognize that even though we can’t get that first hour back, by wasting a whole other hour to watch the rest of that shitball movie we’re just making it worse. Acknowledge and recognize the logical fallacy, and maybe we can spend that second hour doing something we will actually like.
You know, like looking up our horoscopes.
It was Aristotle who said “The more you know, the more you know you don’t know.” This describes one side of the Dunning-Kruger Effect, where experts in a given subject will undervalue their knowledge, because they know enough to know there’s so much more they have to learn. But in looking towards all the things they don’t know yet, they forget the huge amount they do already know.
The flip side of this is that people who don’t know a lot tend to think they know a lot more than they do, which is, sadly, the more commonly seen aspect of the Dunning-Kruger Effect.
Here’s how David Dunning, one half of the duo who identified the effect, described it:
“The scope of people’s ignorance is often invisible to them… Poor performers in many social and intellectual domains seem largely unaware of just how deficient their expertise is. Their deficits leave them with a double burden—not only does their incomplete and misguided knowledge lead them to make mistakes but those exact same deficits also prevent them from recognizing when they are making mistakes…”
The theory is often put to the test with a variation on this experiment: participants are tested on a task and asked to assess their own performance. Consistently, those who score highest rate their performance as only just above average, while the worst performers think they did far better than they really did. It doesn’t matter what’s being tested (humour, logic, grammar, whatever), it always comes out the same way. As cognitive talent worsens, so too does ‘meta-cognition,’ the ability to assess ourselves accurately. Conversely, as you become increasingly skilful at a task and begin to appreciate how little you really know, you start to rate your ability less favourably than others do.
If you follow the undercurrent of the news, it won’t surprise you to learn that around 70% of people think the world is going to shit. The popularity of slogans like “Make America Great Again” feed into a belief in some abstract golden age in the past where everything was better than it is now.
Even though on just about every measurement that matters, we’ve never had it so good: life expectancy is on the increase, violence is declining, and we’re getting healthier and smarter. There is still suffering in the world, to be sure, but there’s a lot less now than there has been before. Ever.
Yet people always seem to remember things being better than they were, and nostalgia about the past makes us think the future won’t be nearly as good.
Psychology offers us two insights into why people might think things aren’t as good, and getting worse. The first is called the “reminiscence bump.” This describes how people tend to remember best the things that happened between the ages of 10 and 30, as well as after the age of 60. Those middle years between 30 and 60 just aren’t as clearly remembered. Combine this with what’s known as the positivity effect, where as people get older they experience fewer negative emotions and are more likely to remember positive things over negative things, and the concept of Declinism begins to make sense.
As we age we remember more positive things, and what we remember is from our childhood, so we end up thinking that things were better back in the good old days. Our biased brain tells us that this can only mean that things are on the decline.
We’ve all seen it happen. The local sports team that has always sucked and which people never cared about suddenly has a good season and all of a sudden everyone hops on the bandwagon and is all “Go Leafs Go!”
This is a pretty harmless example of the Bandwagon Effect, but the same cognitive bias that makes it happen — essentially our inability to remain uninfluenced by societal pressure, trends, propaganda, and other outside factors that cloud or otherwise impact our decision-making processes — can also lead to some stupid, regretful, embarrassing, and even dangerous behaviour.
Our tendency to go with the flow isn’t just for sports. The likelihood of our accepting a belief increases if a large number of other people have already accepted it. This is akin to group-think, where peer pressure affects decision-making in members of a group, but is more like herd mentality. It cuts across all ages, and is something that everyone around the world has in common, to varying degrees, though young people seem especially prone to it.
Where it can be seen most starkly is in politics, where we start following a particular campaign only because it is popular, and we tend to support those candidates who are likely to succeed. It doesn’t matter what the candidate believes in; it must be ok if everyone else likes it.
Everyone likes to be on the winning team. Even if they’ve always sucked.
One of the biases a lot of us have is the idea that if it’s beautiful it must be good. We meet someone, we like them, so we are prone to think that they are smart and clever and funny and all sorts of other great things just because we had a positive first impression. Our overall impression of someone influences our other judgements on them.
It’s called the Halo Effect, and it tells us some important things. For starters, first impressions are vital. But right up there in importance is the realization that we believe a lot of things with absolutely nothing to back them up.
Celebrities benefit a lot from the Halo Effect. We see them as attractive, successful, and often likable, so we also tend to see them as intelligent, kind, and funny. We might also like their music or acting or political views solely because we first found them beautiful to look at; had we been exposed to the other things first, we may not have been inclined to like them so much. One study even found that jurors were less likely to believe that attractive people were guilty of criminal behavior.
On the flip side, we may view a categorically unattractive person as desirable just because we like their art, which explains why the groupies of hideous musicians would do things they’d never do with an ordinary person who looked like that.
The Halo Effect even works through association. When a celebrity we like endorses a product, we will like the product even if we don’t know a single other thing about it.
Source: Halo Effect
Don’t tell me what to do!
We all like our freedom, and none of us likes being told what to do. The knee-jerk reaction we all have to being ordered around is called psychological reactance: an unpleasant feeling that results when our freedom to choose is threatened.
It goes further than just not doing what we’re told, but actually doing the exact opposite to prove that our free will has not been compromised. There’s a lot going on here, cognitively speaking, for reactance highlights our need for freedom of action and choice, but also our lust for control as well as a desire to preserve as many options as possible.
If you have kids you may be familiar with the terrible twos. This is the developmental stage where independence is starting to be explored, and every little thing becomes a battle of free will versus just putting your damned pajamas on and getting ready for bed already.
You may have found that reverse psychology can be effective, where you predict the inevitable power struggle and circumvent it by demanding the opposite of what you want in order to get the behaviour you want. But you can only get so far by deception, and it can very easily backfire.
A far more effective way around reactance is to just stop demanding things. Stop being a damned control freak and try to actually work with people. This doesn’t mean tricking them into thinking what you want is actually their own ideas (although this is brilliant when you can pull it off), but involves building cooperation and buy-in from the start by openly inviting contributions and participation.
But hey, maybe you’re just not smart enough to do that. It’s ok, not everyone can be a good leader.
Source: Don’t Tread On Me
Confirmation Bias – We interpret facts to confirm our beliefs.
Once upon a time it used to be that objectivity was something to value in the news. Fair and unbiased reporting was something journalists were supposed to strive for. Nowadays, that’s nearly impossible to find.
Instead, we are inundated with news that comes pre-filtered to jibe with a specific world-view. It makes for really shitty news, but it sure is popular.
This has happened because of confirmation bias, where people interpret facts to confirm their own beliefs.
Let’s say someone is robbed at gunpoint. The people who are anti-gun see this as proof that guns should be banned so people can’t use them to rob other people, while at the same time the people who are pro-gun see this as proof we need more guns so that we can carry them to protect ourselves from people who use them to rob other people. Same story, but two polar-opposite biases are confirmed by it.
We think that our beliefs are rational, logical, and objective, but really our ideas are self-supporting. We tend to pay attention only to information that supports our existing beliefs while ignoring anything that challenges them. We can rationalize anything that supports our existing beliefs, which totally explains things like Fox News and conspiracy theories.
By not seeking out objective facts, by interpreting information in ways that support our existing beliefs, and by remembering only the details that uphold those beliefs, we often miss important information that could help us form more accurate opinions about the world around us.
Source: Confirmation Bias
Do you often walk into a room and feel all eyes on you? Do you know that they’re seeing all the things that are wrong with you, like the stain on your shirt and the pimple on your face and the piece of lunch stuck in your teeth?
Well, the good news is they aren’t. The bad news is you’re seriously self-conscious. But don’t worry, you’re not alone. It’s perfectly normal to feel that way, so normal we have a name for it. It’s called the Spotlight Effect: the sense that there’s a spotlight always on us, making everyone else see all our flaws and problems. But the truth is, we all vastly overestimate how much people notice how we look and act. That’s because we forget that everyone else in that room is also caught up with their own problems, and they’re too busy worrying that you’re noticing everything that’s wrong with them to really notice what’s going on with you.
We are, when you get right down to it, egocentric monsters. We exist in the center of our own universes, viewing existence from our own experiences and perspectives. This is how we evaluate the world around us, including other people.
Next time, just remember that those people have no idea there’s lettuce in your teeth, because they are far too busy buzzing around at the center of their own universes and wondering how much you are judging them for all the cat hair stuck to their pant legs.
Source: Spotlight Effect
Bias Blind Spots
Like a blind spot in a car, a bias blind spot is often something of which we’re unaware. We all have them, and the ironic thing is we can easily see them in other people while at the same time thinking we’re the rare person who doesn’t have any.
Whenever someone makes a choice there are a lot of factors that go into weighing the options, so many that they aren’t even aware of some of them. And these unconscious biases — because that’s what they are — impact the choices people make in ways they may not have considered. This is the bias blind spot, and can lead to distorted thinking and bad choices.
Naturally, you can see how this affects other people, but surely it doesn’t happen to you, right?
The desire to see yourself positively as a rational, logical thinker is normal. We all agree that being biased is not the best thing so we all try to make our decisions based on logic. But that’s why they call this a bias blind spot: we’re blind to our own biases. If we knew they were impacting our choices, we’d factor that in and likely choose differently.
The ironic thing is studies show that the more biased you are, the less biased you believe you are. And what’s more, you’re also more likely to ignore the input of experts and more likely to resist efforts to reduce your biases.
Frankly, it’s a miracle we as a species do as well as we do.
Source: Bias Blind Spot