The human mind is a remarkable organ. It can be precise beyond belief, yet prone to disastrous failures of cognition. We process information at lightning speed, but that speed relies on mental mechanisms that can lead us astray.
A common goal in the floating community is to become more self-aware. The modalities we use are our main weapon in this task. But wouldn't it also be nice to understand how our minds fail? Thanks to current research we can.
For a long time, a rationalist model of the mind prevailed. We believed that, given the choice, humans would act with rational self-interest. Today, it's understood that this isn't the case. We don't always or even usually act rationally. And we often self-sabotage our judgements using oversimplified heuristics that fail us.
What Are Heuristics and Why Do We Use Them?
A heuristic is "any approach to problem solving, learning, or discovery that employs a practical method not guaranteed to be optimal or perfect, but sufficient for the immediate goals."
We regularly and subconsciously use these shortcuts. And it's worth asking why, considering how much error they introduce into our problem solving. Well, as it turns out, processing information, thinking, and problem solving are difficult. These mental processes are far more demanding than we intuitively understand. To say it another way, these are cognitively expensive tasks.
Psychologists have figured out that we use two different types of thinking. We can imagine these as two separate systems used for different tasks.
The first system is fast beyond belief and works automatically. We don't have to turn these capacities on. They work without our intentional engagement. For example, if someone were to ask you the answer to 2 + 2 you'd say 4 immediately. You wouldn't have to stop and think to arrive at the correct answer.
You're so quick and adept at answering this question that you could do it while cooking dinner or walking in the forest. You'd answer immediately without breaking stride. We use this first system far more often than we do the second system.
The second system is slow, deliberate, and mentally taxing. In fact, we can't use it at all unless we're fully focused. For example: instead of asking you a simple question like "What's 2 + 2?", what if I asked you, "What's 263 + 845?"
Most adults would be able to answer this question. But most, if not all, of us would have to deliberately think about the answer. We'd actually have to stop doing anything else for a moment and bring all of our cognitive capacity to the question. And if someone interrupted us mid-calculation, we'd have to start over.
And yes, all of this has been studied by researchers. Answering a question with the slow thinking system places so much demand on our mental capacity that many people will literally stop walking to think it through.
The point is this: using the second, slower system requires effort. Effort uses energy. In our distant past, energy was hard to come by. It appears that in our evolution we developed ways to avoid using so much effort.
Which brings us back to heuristics. Our minds have developed little mental pathways to help us avoid difficult, second-system work. Rather than engaging the slow, deliberate thinking system, we use shortcuts. These save energy and generally work, but they aren't perfect. And, as we now know, they are themselves the source of much cognitive error.
There's no way to live using only the second, slower system. That's not the point here. The point is that the modern world is incredibly complex, and most of us could get far better results in decision making and judgement. But doing that would sometimes require engaging our slower system.
Here's an example: In ancient times, any person in a group not our own was a danger. A heuristic saying, "avoid people from other groups," was probably beneficial to our ancestors. But it's easy to see how this mental shortcut isn't helpful in today's complex society.
It's a failed heuristic. And when heuristics fail, we call it a cognitive bias. Wouldn't it be nice to employ the slower thinking system to overcome such cognitive biases where possible?
It would be powerful, of course. But it's important to note that the first system, with its immediate response time isn't a bad thing. In fact, it's mostly good. We couldn't live if we had to engage our slow-thinking system before every decision and action.
We need these heuristics to survive. But wouldn't it also be nice if we could turn them off and use our slow-thinking capacity when suitable? "Turning it off" is the hard part. We use these heuristics subconsciously, automatically, and regularly. No human will ever completely overcome these heuristics, nor should we.
But with self-awareness, we can at least begin. We can question some of the automatic judgements that aren't serving us or the world. We can, with awareness, insert deliberate choice where it would benefit us.
So, let's take a look at some of the most common ways that our heuristics cross the line into cognitive biases. Let's look at a couple of the (many) ways our minds fail us.
Current Moment Bias (AKA: Hyperbolic Discounting)
You'll recognize this as one of the most common ways we sabotage our own growth. Any seeker after self-awareness, optimization, or a better life knows that intention and action are sometimes at odds.
Self-improvement isn't a straight line. We sabotage ourselves in many ways, but one of the most common is via the current moment bias.
We know that doing something difficult now will provide benefit long term. Yet, we struggle to take this beneficial action, because we're constantly seeking immediate gratification. We have a bias to the current moment.
Take the example of investing effort in a daily meditation practice. Meditation has been shown to provide lifelong benefit. But in the moment it feels difficult, other tasks seem more pressing, and you can get more immediate benefit from any number of things.
Studies on Current Moment Bias show that most people will take $100 now instead of $110 tomorrow. Yet, if you ask them to choose between $100 in 30 days or $110 in 31 days, most will then choose the latter.
The trade-off is identical in both cases: wait one day longer for 10% more money. But when the smaller reward is available right now, our bias kicks in and we take it. Money in the current moment feels worth more than money later.
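This preference reversal is the signature of hyperbolic discounting, which is often modeled as V = A / (1 + kD), where A is the amount, D is the delay, and k is a discount rate. The sketch below uses an illustrative k value of my own choosing (the article doesn't specify one) to show how the same formula picks $100 in the immediate case but $110 in the delayed case:

```python
def hyperbolic_value(amount, delay_days, k=0.2):
    """Perceived present value under hyperbolic discounting: V = A / (1 + k*D).

    k is an illustrative discount rate chosen for this sketch,
    not a figure from the studies mentioned above.
    """
    return amount / (1 + k * delay_days)

# Immediate choice: $100 now vs. $110 tomorrow
now = hyperbolic_value(100, 0)       # 100.0
tomorrow = hyperbolic_value(110, 1)  # about 91.7 -> the $100 now "feels" bigger

# Distant choice: $100 in 30 days vs. $110 in 31 days
small_later = hyperbolic_value(100, 30)  # about 14.3
large_later = hyperbolic_value(110, 31)  # about 15.3 -> now the $110 "feels" bigger
```

The reversal happens because hyperbolic curves fall steeply at short delays and flatten out at long ones, so an extra day of waiting costs a lot today and almost nothing a month from now.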
To improve our lives, we must learn to overcome Current Moment Bias. We must learn to delay gratification to build ourselves a better tomorrow.
Negativity Bias
It's widely known that legacy news media is a bloodbath. The old maxim "if it bleeds, it leads" still holds true.
Those who took part in the social media revolution believed something new and better was coming. We believed that by interacting with other humans we could mitigate the legacy news media bloodbath.
But something interesting happened. It turned out that outrage was the most viral emotion. Today, social media has devolved into a nightmare show. The carnage and tribalism are more pronounced on social media platforms than anywhere else, including legacy news media.
"If it bleeds, it leads" isn't a clever idea invented by an evil media genius. It's part of the human condition.
Studies have shown that if two things (emotions, thoughts, social interactions, events) are of the same intensity, the more negative will always be more memorable.
This tendency is a throwback to our evolutionary past. And it makes sense if you think about it. A whole, productive day gathering berries is pretty much wiped out by a single lion sighting. You want to be able to remember dangerous things. Knowing the whereabouts of a pride of lions is information that could save the entire tribe tomorrow.
But in our current society this negativity bias isn't leading to a positive result. Does it really benefit you to know that someone across the continent was horrifically murdered?
Negativity bias predicts that once you hear that news, you'll dwell on it. Meanwhile, you've likely never seen a murder in your life, nor ever had to defend yourself against one.
It's very likely that negativity bias is making our lives worse.
The Halo Effect
Have you ever met an attractive, charismatic, or successful person and then put them on a pedestal?
The truth is you have done this. We all have. We select leaders, for example, often based purely on body language or convincing stories.
For example, are tall men necessarily the most competent people in the world? Not at all, yet tall men are heavily overrepresented among CEOs and successful politicians.
Attractiveness, charisma, and/or success produces a subconscious halo effect in our minds. We see these positive traits and believe these individuals are good in other domains. And we do it without ever stopping to think.
One famous example is Tiger Woods. He's incredible at golf. But that's the only rational conclusion his golf allows us to draw about him.
We never had any evidence to show more than that. Yet, most people believed much more about him. We believed he was wholesome, a family man, mentally healthy, and more.
After his fall from grace some years ago, it now appears that he was none of those things except an incredible golfer. But the Halo Effect had us believing them all.
Anchoring
When presented with a set of information, we respond most strongly to the first and last pieces.
For example, when shopping for a used car you'll respond to the sticker price. The actual value of the car might be much lower, but most people will negotiate from the sticker price. It's the "anchor."
The same is true of experiences. One positive way to think about anchoring is the example of group fitness training. Studies show it's more effective than training alone.
This, in part, is due to anchoring. At the beginning of class you show up and socialize with new, interesting people. At the end you share the camaraderie of having been through a difficult experience.
Soon you find yourself returning because you're anchored to these positive experiences. Meanwhile the entire middle of the experience was difficult and painful.
Availability Heuristic
The availability heuristic means we over-emphasize whatever is most immediately available to our minds.
Donald Rumsfeld, former United States Secretary of Defense, famously said, "there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns – the ones we don't know we don't know."
It's a lot to swallow. But Rumsfeld was trying to find a way around the availability heuristic. He was pointing to the danger of drawing conclusions from incomplete information.
What if, for example, your friend was setting you up on a blind date and told you the following, "He/she is a 10/10, an absolute stunner. He/she has a golden personality. You will talk, laugh, and have endless fun."
That information becomes what's available to our mind. So, we begin making judgments based on that availability.
But, it's not that much information. What if the person is also a serial killer? Or, what if at the end of every date they soil their pants? Would you still date them?
The point is that we prioritize the information available to us. But often it's the information that isn't available to us that's most important.
Of course, complete information isn't always, or even usually, available. And when it's not, we rely on what little we know. In doing so, we assume that a certain outcome is more likely than it actually is.
This is another example of where social and traditional media confuse us. Because we're exposed to certain dangers they are more available to our mind. So, we spend much time thinking about and protecting ourselves against unlikely outcomes.
Most of us are unlikely to ever face an active shooter or other terrorist attack, for example. Car accidents and heart disease, on the other hand, are real, common problems. Yet when was the last time you saw a divisive argument on Facebook about car accidents or heart disease?
The far less likely outcomes (active shooters, terrorists) are available to our minds, so we focus on them.
The World of Cognitive Bias is Vast
The five cognitive biases mentioned in this article merely scratch the surface.
There are dozens if not hundreds already identified. Researchers are discovering and studying new ones as we speak. To reiterate an earlier point, we can't escape cognitive bias and the use of heuristics. Trying to do so would result in total paralysis.
But in your quest for better self-awareness, this knowledge of cognitive bias can aid you. Used in combination with modalities like floating and meditation, it will help you see your own patterns. Knowing how you've fooled yourself in the past will help you un-fool yourself in the future.