Fundamental Attribution Error
We tend to judge other people on their character, but ourselves on our situation. When we're late to something, the world is at fault; when someone else is late, we tend to think they're lazy and disorganised.
Self-Serving Bias
We think our failures are situational, but our successes are our own doing. For example, when you do well in a test it's because you worked hard, but if you fail, it's because you didn't get enough sleep.
In-Group Favouritism
We favour the people who are in our group. In recruitment this could be favouritism towards a candidate who went to the same school or university as you, or just showing favouritism to candidates who have more in common with you.
Bandwagon Effect
We all love to jump aboard the latest fad – and the more people who jump on, the more likely we are to join them. Be honest… did you spend money on a fidget spinner?
Groupthink
In a group, our default is to try to minimise conflict, which means we often go along with irrational decisions if the rest of the group agrees.
The Halo Effect
The halo effect is what happens when we take one positive attribute or achievement of a person and let it skew our perception of all their other qualities more positively.
Moral Luck
Moral luck describes circumstances when a moral agent is assigned moral blame or praise for an action or its consequences even if it is clear that said agent did not have full control over either the action or its consequences.
False Consensus Effect
We like to believe that more people agree with us than they actually do. So next time you say “but everyone agrees with me!” ask yourself what evidence you have to back this up…
Curse of Knowledge
Once we’ve gained a piece of knowledge we tend to think everyone else now knows it too.
Spotlight Effect
We love to believe that we're the main character, don't we? The spotlight effect refers to our belief that people are much more interested in our actions and appearance than they actually are.
Availability Heuristic
A mental shortcut where we rely on the immediate examples that come to mind when evaluating a specific topic, concept, method or decision.
Defensive Attribution
We don't want to think that bad things will happen to us, so when bad things happen to other people we often attribute them to a false cause to lessen the perceived threat.
Just World Hypothesis
We believe that the world is inherently just – as a result we can believe that acts of injustice are deserved.
Naïve Realism
We believe that we observe objective reality and that other people are irrational, uninformed or biased.
Naïve Cynicism
We believe that we observe objective reality and that other people have a higher egocentric bias than they actually do.
Forer Effect (Barnum Effect)
It's very easy for us to see our own personalities in vague statements, even though those statements could apply to a wide range of people. Our love of horoscopes is a great example of this.
Dunning-Kruger Effect
The less you know, the more confident you are because you don’t know what you don’t know. The more you know, the less confident you are, because you know about all the things you don’t know yet.
Anchoring
We rely heavily on the first piece of information that we’re given when we make a decision.
Automation Bias
We believe that technology is smarter than us, so we trust it to be right even when we actually know that is not the case. Feel free to correct your phone when it puts an apostrophe in “it’s” when you don’t need it!
The Google Effect
Otherwise known as Digital Amnesia, the Google effect refers to the fact that we tend to forget information that is easily looked up on the internet.
Reactance
We do the opposite of what we’re told – especially when we perceive it to be a threat to our personal freedoms.
Confirmation Bias
We’re more likely to find and remember information that supports ideas we already believe.
Backfire Effect
Disconfirming evidence can sometimes have the unwarranted effect of strengthening our beliefs. Conspiracy theorists, for example, take debunking evidence as proof that the evidence was faked – which makes them believe the conspiracy even more.
Third Person Effect
We think that other people are more affected by mass-media consumption than we are.
Belief Bias
We judge an argument's strength not by how strongly it supports the conclusion, but by how believable we find the conclusion in our own minds.
Availability Cascade
An availability cascade is a self-reinforcing process where a certain stance gains increasing prominence in public discourse, which increases its availability to people and which therefore makes them more likely to believe it and spread it further.
Declinism
We romanticise the past and view the future pessimistically, believing that societies and institutions are in decline.
Status Quo Bias
We want things to stay as they are. Changes to the baseline are often considered to be a loss not a gain.
Sunk Cost Fallacy
We keep investing in things that have already cost us something, rather than cutting our losses. This has also been called the "Concorde fallacy", after the UK and French governments used their past expenses on the costly supersonic jet as a rationale for continuing the project rather than abandoning it.
Gambler's Fallacy
We think that future possibilities are affected by past events. But if a die has rolled a 6 twice in a row, the next roll is no more or less likely to be a 6.
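The die claim is easy to check empirically. A minimal Python sketch (the roll count and seed are arbitrary choices): simulate a fair die and compare the overall frequency of a 6 with its frequency immediately after two consecutive 6s. Under independence, both should sit near 1/6.

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

# Simulate a fair six-sided die.
rolls = [random.randint(1, 6) for _ in range(1_000_000)]

# Overall frequency of rolling a 6.
overall = sum(r == 6 for r in rolls) / len(rolls)

# Frequency of a 6 immediately after two consecutive 6s.
after_two_sixes = [rolls[i] for i in range(2, len(rolls))
                   if rolls[i - 2] == 6 and rolls[i - 1] == 6]
conditional = sum(r == 6 for r in after_two_sixes) / len(after_two_sixes)

print(f"P(6) overall:         {overall:.3f}")
print(f"P(6) after two sixes: {conditional:.3f}")
# Both frequencies land near 1/6 ≈ 0.167: past rolls don't change the odds.
```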
Zero-Risk Bias
We prefer to reduce small risks to zero, even if another option would reduce more risk overall.
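To make the trade-off concrete, here is a toy comparison in Python. The probabilities are made-up numbers for illustration, not data:

```python
# Two independent risks, with illustrative made-up probabilities.
risk_small = 0.01   # a rare failure mode
risk_large = 0.10   # a common failure mode

# Option A: eliminate the small risk entirely (feels like "zero risk").
option_a_reduction = risk_small - 0.00
# Option B: halve the large risk.
option_b_reduction = risk_large - 0.05

print(f"Option A removes {option_a_reduction:.1%} of absolute risk")
print(f"Option B removes {option_b_reduction:.1%} of absolute risk")
# Zero-risk bias pulls us toward Option A, even though Option B
# removes five times as much total risk.
```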
Framing Effect
We can draw different conclusions from the same set of information depending on how it is presented to us.
Stereotyping
We adopt generalised beliefs about members of a certain group despite having no information about the individuals themselves.
Outgroup Homogeneity Bias
We perceive our ingroup as more diverse, and members of an outgroup as more homogeneous.
Authority Bias
We’re more likely to trust information delivered to us by figures of authority. We’re also more likely to be influenced by their opinions.
Placebo Effect
If we believe a treatment will work, it's likely to produce a small psychological benefit even when the treatment itself is inert.
Survivorship Bias
We focus on the things that have survived a process and ignore the ones that failed. If a strategy worked once but failed many times, we remember the one time it worked.
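A quick way to see why this misleads: simulate strategies whose results are pure coin flips, then look only at the "survivors". A hedged Python sketch (the counts and cut-off are arbitrary assumptions):

```python
import random

random.seed(1)  # fixed seed so the sketch is reproducible

# 1,000 strategies whose "wins" are pure coin flips (true win rate 50%).
n_strategies, n_trades = 1000, 10
records = [[random.random() < 0.5 for _ in range(n_trades)]
           for _ in range(n_strategies)]

# Keep only the strategies that won at least 8 of 10 trades --
# the "survivors" we would naturally focus on.
survivors = [r for r in records if sum(r) >= 8]

survivor_rate = (sum(sum(r) for r in survivors)
                 / (len(survivors) * n_trades))
print(f"{len(survivors)} survivors out of {n_strategies}")
print(f"apparent win rate among survivors: {survivor_rate:.0%}")
# The survivors look skilled (at least 80% wins, by construction of the
# cut-off) even though every strategy was a coin flip; the failures we
# ignored carry the missing information.
```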
Tachypsychia
It's important to remember that our perception of time is not objective reality. After all, time flies when you're having fun!
Law of Triviality
Also known as the bike-shedding effect, the law of triviality refers to the way we give disproportionate weight to trivial issues often overlooking more complex ones.
Zeigarnik Effect
A complicated name for a simple process – we're more likely to remember incomplete tasks than completed ones!
IKEA Effect
We place a higher value on items we've partially helped to build. Who knew that's why you love your Billy Bookcase so much?
Ben Franklin Effect
Counterintuitively, doing someone a favour makes us like them more – so having helped someone once, we're much more likely to help them again.
Bystander Effect
The more people there are around, the less likely we are to help someone.
Suggestibility
We, especially children, tend to mistake leading questions for actual memories.
False Memory
We can sometimes mistake imagination for real memories. Sorry to tell you that time you managed to fly… it’s probably not real.
Cryptomnesia
The opposite of false memory: sometimes we attribute our real memories to our imagination.
Clustering Illusion
We like to find patterns and clusters in random data.
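This is easy to demonstrate: in a long run of fair coin flips, streaks that look like patterns appear reliably. A small Python sketch (the flip count and seed are arbitrary):

```python
import random

random.seed(2)  # fixed seed so the sketch is reproducible

# Flip a fair coin 200 times and find the longest run of identical
# outcomes. Long streaks feel like a "pattern", but they are expected
# in random data of this length.
flips = [random.choice("HT") for _ in range(200)]

longest, current = 1, 1
for prev, cur in zip(flips, flips[1:]):
    current = current + 1 if cur == prev else 1
    longest = max(longest, current)

print("".join(flips[:40]), "...")
print(f"longest run: {longest}")
```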
Pessimism Bias
We sometimes over-estimate the likelihood of bad outcomes.
Optimism Bias
We sometimes over-estimate the likelihood of good outcomes.
Blind Spot Bias
Fundamentally we don’t want to believe that we have bias, and we’re more likely to spot bias in others than in our own actions.