
This is the fourth and final installment of a series of articles about cognitive biases and other irrational tendencies—in other words, why and how you err, make decisions, get emotional, and occasionally feel out of control. I recommend reading the previous three articles if you haven’t already; this article will make more sense if you do.
Biases, Illusions, Delusions, Fallacies & Other Cognitive Errors — An Introduction
(pt 4)
Money, and other Self-Serving Delusions
Sunk Cost Fallacy
We become attached to things we invest in. Be it time, money or energy, if we have invested any part of ourselves into an idea, a project or a relationship, our resistance to letting go can be monstrous—even when all the signs are clear, even when holding on is actually harming us.
Endowment Effect
If we have had possession of something, we tend to value it more. An interesting example: if you conceive of an idea, and someone else conceives of exactly the same idea, you will feel your idea is more valuable—just because you conceived of it. A less interesting but more relevant example: if you decide to sell a gifted or purchased possession, you will likely give it a higher-than-reasonable price, purely because YOU have owned it. Personal ownership of ‘X’ results in a higher perception of the value of ‘X’.
Gambling | Einsteinian IQ | Truth Distortions
Deprival-Super-Reaction Effect
We only know what we have when it is about to be taken away from us—be it love, possessions, money, family, friends, property, status, territory, opportunity, or life itself. We will go to extreme lengths to avoid losing things we place value on—and as it turns out, we place a lot of value on a lot of things. Psychological studies have shown that losses have twice the impact on our disposition as gains do (see the Negativity Bias); we don’t like losing anything, full stop, but if the loss is something we value, the effect can be personally devastating—sometimes for a very long time.
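That two-to-one figure can be made concrete. Below is a minimal sketch in Python, loosely following the shape of prospect theory’s value function (the model behind the loss-aversion finding); the coefficient of 2 simply mirrors the rough ratio cited above.

```python
# A minimal sketch of loss aversion, loosely following prospect theory's
# value function (Kahneman & Tversky). The coefficient lam ~ 2 reflects
# the finding cited above: losses weigh roughly twice as much as gains.

def subjective_value(outcome, lam=2.0):
    """Felt impact of a gain or loss of `outcome` units."""
    if outcome >= 0:
        return outcome        # gains count at face value
    return lam * outcome      # losses count roughly double

# Winning 50 then losing 50 leaves you flat financially,
# but the felt experience is negative overall:
print(subjective_value(50) + subjective_value(-50))  # -50.0
```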
It doesn’t even matter if the thing is something we own or have a history with; it only has to be something we value, and we humans can attribute value to something very, very quickly. If you follow sports, consider a time when your favourite team (or sportsman/woman) lost either at the very last minute or against all the odds, and recall the emotions you felt at the time. Very likely, they were horrible. How is it, then, that you can experience such painful emotions, when for all practical purposes the result does not directly affect you?1 The answer lies in the value you place on the team/person, which, in turn, determines your expectations; and expectations that are ripped apart just when they were about to be fulfilled are—well, just imagine Donald Trump and Kim Jong Un playing ping pong with your emotions as the ball.
The Deprival-Super-Reaction Effect is amplified—and exploited—in gambling. Horse racing is a good example:
Bob bets on the underdog, Billy; should it win, he is in for a nice holiday. The race begins, and against all the odds, Billy bursts past all the other horses and looks set to take 1st place; at this point, Bob is thinking about two weeks in San Antonio Bay (he bet £5 on a 100/1 horse, after all). But then Babe, the favourite, comes from nowhere and pips Billy to the post… Devastated, confused and hopeless, Bob places another bet…
There are all sorts of forces at play here—Reinforcement, Negativity Bias, Social Comparison, etc—but perhaps most important is the reaction: just as Bob could taste the glory in his mouth, it was whipped away, and he reacts, emotionally, by placing another bet. He reacts this way because he feels like he has lost £500 (how much he would have won), when in fact, he only lost £5. Gambling is a very slippery slope.
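To see why Bob’s feeling misleads him, it helps to run the numbers. Here is a quick sketch; the ‘true’ win probability is an assumption of mine (100/1 odds imply roughly 1%, and the bookmaker’s margin means the real chance is lower still).

```python
# Why Bob's reaction misleads him: he *feels* he lost the £500 payout,
# but the expected value of the bet was negative all along.

stake = 5.0              # Bob's £5 bet
odds = 100               # 100/1: winnings are 100x the stake
payout = stake * odds    # £500 if Billy wins
p_win = 0.008            # assumed true probability, below the implied 1/101

expected_value = p_win * payout - (1 - p_win) * stake
print(f"Expected value of the bet: £{expected_value:.2f}")  # about -£0.96
```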
Just like the gambling industry, supermarkets and brands also take advantage of the Deprival-Super-Reaction Effect; they’ll intentionally limit stock, release ‘limited edition’ products, put up offers that make no sense—all to get you to buy more. The Deprival-Super-Reaction Effect goes hand-in-hand with the Scarcity Effect: the shortage of an item or service is the deprival phase; the rush once that shortage is fixed is the super-reaction.
Gambler’s Fallacy
Randomness is actually a thing. How gambling becomes addictive is down to how it distorts our perception of randomness: by continually putting in money and receiving nothing for it, a gambler, thinking they are living in a just world, continues to put in money because, well, it’s only a matter of time before they get their owed reward—right? Wrong. Nobody owes anybody anything when it comes to gambling.
Also at work is what I call illusory omniscience: a gambler, given enough playing time, will start to think he is on to some sort of pattern; through all the random noise he believes he’s spotted a predictable sequence. Of course, there is no predictable pattern—the gambling industry is not stupid. What’s worse, he will continue to update his equations as long as he is not winning! Deluded, he will keep going, updating as he goes along, until eventually he strikes lucky. All too often his reaction will be something along the lines of “See! There is a pattern!”, instead of “Wow, I am so lucky!—best go home before I lose it all!” At this point the wise gamblers2 do go home; most, however, try to apply their Einsteinian equations to the next game, and inevitably, they lose all their money. But this doesn’t mean the equation is wrong, no—it just needs updating, of course!
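The fallacy is easy to demonstrate with a simulation. The sketch below assumes a simple red/black roulette bet (win probability 18/37 on a single-zero wheel) and checks whether a five-loss streak makes the next win any more likely.

```python
# A small simulation of the Gambler's Fallacy: in an independent game,
# a losing streak tells you nothing about the next outcome. We count how
# often a win follows five straight losses and compare it to the base rate.
import random

P_WIN = 18 / 37  # assumption: red/black bet on a single-zero roulette wheel
TRIALS = 1_000_000

random.seed(42)
outcomes = [random.random() < P_WIN for _ in range(TRIALS)]

wins_after_streak = total_streaks = 0
for i in range(5, TRIALS):
    if not any(outcomes[i - 5:i]):       # five losses in a row
        total_streaks += 1
        wins_after_streak += outcomes[i]

print(f"Base win rate:           {P_WIN:.4f}")
print(f"Win rate after 5 losses: {wins_after_streak / total_streaks:.4f}")
# Both numbers agree (within noise): the wheel owes nobody anything.
```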
The gambling industry also benefits, rather tremendously, from the Sunk Cost Fallacy: the more money gamblers put in, the more compelled they feel to keep putting money in—because they don’t want to lose what they have already put in…
Gambling is a very slippery slope indeed. There is a profound lesson here.3
Apophenia, and Truth
We have a tendency to make connections between events that in truth have no connection whatsoever. It is easy at this point to get into a discussion about truth, but I won’t do that. What is important to know, though, is that people do have different definitions of truth. One person may consider truth as ‘that which works’, or ‘that which serves the intention’; this is pragmatic, or functional truth. Another person may consider truth as ‘that which is’, or ‘whatever is “objectively” so’; this is materialist, or scientific truth. Religious people typically identify with the former, and scientists with the latter; again, whether this is what they actually believe is a different matter.4
Despite these differences, we have managed to evolve into highly sophisticated and competent beings, and have, through our capacity for cooperation, built a vastly complex and multifaceted civilisation; and the world is still spinning; and life, especially if you’re reading this, is for the most part good. It is therefore difficult to take the different interpretations of truth as anything but hearsay: if they mattered as much as they claim to, it is hard to see how we would have made it this far, or how we still manage to make it all work today. Whatever this truth we all seem to ‘get by’ with is, it matters not here; what matters is that we do believe connections exist where, as far as we know, there are none.
The word epiphany is associated with moments—usually bursts—of insight, realisation and enlightenment; and, most importantly, truth. We do not, however, typically associate an epiphany with false insight—‘error!’ or ‘stupidity’ are more commonly used. We could also call such falsities apophanies, which refer to instances of apophenia.5
Some events are connected, but just because there seems to be a connection does not mean there is. Mother nature has her ways and cares not an ounce for our ideas of morality, truth, cause and effect, randomness, ESP, or whatever else; she does what she does without giving reason.
Whilst apophenia is seeing false connections, the opposite of apophenia—or rather, its equally troublesome sister—is just as important. Essentially, it means calling ‘coincidence!’ by mistake; better put, it is not seeing connections that are in fact there. Take the phenomenon of noticing a car that has recently made an impression on you. Maybe you’re considering purchasing a new car—let’s say a yellow Mini. As if by chance, yellow Minis start popping up everywhere you go; what the hell is happening here—coincidence? Well, kind of…
Brace yourself. The event itself is a coincidence, but your noticing it is not; you have started noticing yellow Minis not because God is playing games with you, but because yellow Minis currently have some significance to you, which raises their prominence in your mind, which causes you to notice more Minis. In other words, you are just noticing more of what was always there!
This is also how intuition works: knowledge becomes almost instinctive when the situation is one we are familiar with, and will make itself known by giving rise to that ‘gut feeling’ with which we are all familiar. Such moments may feel magical, mysterious, coincidental, unexplainable; but there is a cause, and it is no less magical once you understand it.
Perception Illusion
Mistaking perception for reality: this is a problem science and philosophy have lost sleep over since their birth. In a nutshell, it concerns our tendency to think that what we perceive is what actually is. By now, the verdict is largely in: our perception of reality is far from actual reality. Nowhere is this more obvious than in the way we perceive people. Consider someone whom you met recently: you probably think you know this person quite well, but this is almost certainly not true; most likely, you know jack.6
What happens when we meet someone for the first time, no matter who they are, is that we form an almost complete idea in our mind of who they are; our brain pays special attention to the markers it deems important (skin colour, height, muscle mass, facial expressions, firmness of handshake, tone of voice, language usage, etc), combines this information with that which it already has (world view, personality traits, goals, initial impression, etc), and conveniently places the person into the category it deems appropriate. We take the small experience we have of their character and automatically form a complete image of who we think they are. Although this image gets updated with every subsequent experience, we carry this incomplete (complete, in our minds) perception into each of those experiences, so it can actually influence how we behave around them, which naturally influences the way they behave, and so it goes on.
An interesting question is: is this discriminatory? Perhaps. But it is our biology. Our perceptive capacities have been an essential part of our evolution from our primate ancestors to modern humans, and are, therefore, inescapably ingrained in our DNA. We are biologically ill-prepared for the civilisation we have built and continue to try and build; our ideas of morality, good, evil, truth, competition, and so on—they are not so compatible with how we are wired as human beings. Fortunately, though, we have the ability to think abstractly—and this means everything.
Accepting our ill-equippedness for the modern day is the first step to overcoming it. Our perceptual powers are very important: they help us make sense of the world, learn, meet people, empathise, create, survive; but they can also lead us into many unnecessary mistakes, and they lie at the heart of irrationality.
Scope Insensitivity
We humans are shockingly bad with probabilities. This stems from our poor ability to grasp numbers that have little-to-no immediate importance to us; these are usually large numbers, which we cannot represent using our most powerful weapon, emotion. Absent emotion, the only way to properly process ‘unimportant’ numbers (or quantities) is with concentrated thought. Who has time for concentrated thought? Unfortunately, it is the only way; and so you should make time, because this is one of the most crucial and potentially devastating biases.
Consider the example used in the intro, in which a UNICEF ad asks you for a donation—a small donation that will give X number of kids clean water. Because of our ineptness with scope, an ad featuring one child persuades us more than an ad featuring ten children—even though the problem being highlighted is exactly the same, or, you could say, worse. This is so counterintuitive that it almost seems unreal; how is it that we do not notice ourselves being persuaded more, or less, by differences in scope? As is the case with most of the biases, even when we know of their effects we still have to work hard to notice them, again and again; Scope Insensitivity is one of the hardest to notice, but one of the most crucial to notice, and then to overcome. Why? Our ignorance of climate change is one example.
Comparison, and How To Gift Properly
Evaluability
It’s your best friend’s birthday, and you’re unsure what to buy. You could get her favourite perfume, but what if someone buys the same one? You could get her a bottle of wine, but you’ve done so the past ten years—and the same bottle, too! You want to surprise her, but not with something trivial. You think about it, and two things come to mind: a blouse, and a bottle of wine, obviously. For the former, you’re thinking about a £45 blouse you came across in the catalogue the other day; for the latter, you’re thinking about every bottle of wine but the one you have gifted her the past ten years. After chatting with your friend and sommelier, Zjohn Pierre, you realise that if you’re going to pick wine, it has to be an expensive bottle—Zjohn recommends a £45 bottle of Cabernet Sauvignon, French, of course.
Now for the question: which gift is better—a £45 blouse or a £45 bottle of wine? To make deciding harder, you know your friend loves that blouse, and loves French wine. Turns out there is an answer—but it depends on your intention. If you’re interested in what is best for your friend (in this case, her health), you should pick the blouse. If you’re interested in your friendship, and you want the gift to be appreciated (of course you do), you should pick the wine. But why?
The reason lies in our skewed evaluation capacities. Your friend regularly goes clothes shopping, and is therefore more aware than anyone that £45 for a blouse is not really that expensive. She also buys wine, but doesn’t spend much per bottle, because she doesn’t need to—the wine she likes is only £8 a bottle. She wouldn’t dream of spending more than £20 on a bottle—unless she was rich. The key here is the value your friend will place on the gift she receives, or rather, the value she places on different CLASSES of objects. For her, wine is a relatively cheap product; a blouse, however, could cost a lot, especially if it was very nice. Consequently, if she receives a very expensive bottle of wine, her perceived value of it will be much, much higher than that of a blouse of the same price; to her, £45 for a blouse is normal, but for a bottle of wine, well, it is ridiculous!—which is exactly the point.
Generally speaking, the cheaper the class of objects, the more expensive a particular object from that class will appear.
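To make the principle concrete, here is a toy scoring model (my own illustrative assumption, not an established formula): a gift’s perceived extravagance as its price relative to a typical price for its class.

```python
# A toy model of the evaluability point above (an illustrative assumption,
# not an established formula): perceived extravagance as the ratio of an
# item's price to the typical price of its class.

typical_price = {"blouse": 40.0, "wine": 8.0}  # assumed class norms (£)

def perceived_extravagance(item, price):
    """How lavish a gift of `price` feels, relative to its class norm."""
    return price / typical_price[item]

for item in ("blouse", "wine"):
    print(f"£45 {item}: {perceived_extravagance(item, 45.0):.1f}x the norm")
# £45 blouse: 1.1x the norm -> unremarkable
# £45 wine:   5.6x the norm -> 'ridiculous!', which is exactly the point
```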
Ignorance of True Norms
Shifting Baselines and Base-Rate Neglect
We are inclined to think we pick things up at the beginning of their life timeline—trends, memes, architectural designs, statistics. Better put, when we come to a new discovery we have a tendency to discard its history and account only for our history with it; this history could be our own personal history with it (e.g. we jump on trends we mistakenly think are ‘new’) or its longer, collective history (e.g. our understanding of the biases, or of species decline). Discarding the historical accounts of anything is generally a bad idea.
The example commonly used in the literature is the issue of declining fish stocks in fisheries, and how the scientists who study them have different ideas of what normal stock levels (the baselines) actually are. A researcher who begins his career in the 1950s, for example, will have a different notion of the baselines to a researcher who begins 50 years later. Another example is Climate Change: if Professor Donald T. Runt begins studying climate change today, his conclusions about what is normal will be considerably different from those of his grandfather, Professor Donald T. Runt Senior, who studied the field in the early 20th century. To make things worse, baselines may never have been correct in the first place; that is, Donald’s7 grandfather will have had different baselines to his grandfather, and likewise his grandfather, and his grandfather, and so on.8 What this means is that baselines can get progressively less accurate as time passes.
Attention span is another example. The attention span of the average person today is, some say, much worse than it was one hundred years ago9—technology being the main culprit. Presuming this, the average child born today grows up in a society of short attention spans; he believes this to be normal, but—and this is the crucial part—like everyone else, he will get progressively worse over the course of his lifetime too, which means the next generation of children grow up with an even worse idea of attention span. And so it goes on.
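A short simulation makes the mechanism clear. Using the fisheries example with illustrative numbers (an assumed steady 1% annual decline), each generation of researchers measures the loss only against the stock level at the start of their own career.

```python
# A sketch of how shifting baselines compound across generations: the stock
# declines steadily, but each generation of researchers treats the level at
# the start of their own career as 'normal'. All numbers are illustrative.

initial_stock = 100.0
annual_decline = 0.01  # assumed 1% decline per year

def stock_at(year):
    return initial_stock * (1 - annual_decline) ** year

for career_start in (0, 50, 100):
    baseline = stock_at(career_start)
    now = stock_at(career_start + 30)  # what they observe 30 years in
    print(f"Start year {career_start:3d}: baseline {baseline:5.1f}, "
          f"perceived decline {100 * (1 - now / baseline):.0f}%")
# Every generation measures roughly the same ~26% decline against its own
# baseline, while the true decline from year 0 keeps compounding.
```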
Elsewhere in this article I have spoken about our poor capacities for probabilistic thinking, our tendency to want—and find—answers fast, and our inclination to fall for sophistication and extra details; these shortfalls are each nested inside something called Base-Rate Neglect. Self-evident though it is, it simply refers to our jumping to conclusions whilst simultaneously being ignorant of important facts and/or what those facts mean. Perhaps the most relevant example today is the following:
Fact one: Almost 100% of Al-Qaeda members are Muslim.
+
Fact two: Almost 0% of people who are Muslim are in Al-Qaeda.
+
Fact three: Mohammed is a Muslim.
=
David’s Conclusion: ‘Mohammed is in Al-Qaeda.’
There are, without doubt, other factors which could weigh in on this conclusion—such as David’s exposure to the news, his religious beliefs, whether Mohammed has a beard, etc—but to consider them all is not necessary (see the Single-Cause Fallacy), because the conclusion can be proven wrong by the facts given alone. Devastating though its actions are, Al-Qaeda represents a nanoscopic percentage of the world’s population. Facts one and two are base-rates; when we neglect them, we are likely to jump to conclusions as inaccurate as David’s ‘Mohammed is in Al-Qaeda.’
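Bayes’ theorem does the arithmetic David skips. The population figures below are rough, illustrative assumptions; any reasonable values lead to the same conclusion.

```python
# The arithmetic David skips, via Bayes' theorem. All population figures
# are rough, illustrative assumptions.

p_muslim_given_member = 1.0      # fact one: ~100% of members are Muslim
muslims = 1.8e9                  # assumed number of Muslims worldwide
members = 2e4                    # assumed membership, generously high
world = 7.8e9                    # assumed world population

p_member = members / world       # prior: any given person is a member
p_muslim = muslims / world       # prior: any given person is Muslim

# P(member | Muslim) = P(Muslim | member) * P(member) / P(Muslim)
p_member_given_muslim = p_muslim_given_member * p_member / p_muslim
print(f"P(member | Muslim) = {p_member_given_muslim:.6%}")  # ~0.001%
```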
Shifting Baselines and Base-Rate Neglect go hand-in-hand, which makes them a threatening combination. For example, two parents bringing up their children today may have derogatory and completely false beliefs about Muslims—they’re totally ignorant of the base-rate. Their beliefs will likely be instilled in their children, who will then grow up with dangerously distorted baselines. This is a micro example, however—and such errors usually get straightened out sooner or later.
A macro example, on the other hand, can be devastating—and can be so for a long time. Misbeliefs about Climate Change, Artificial Intelligence and Animal Populations are amongst the most dangerous examples in the modern day, perhaps the most dangerous; what makes them so is not so much the fatal predictions of noted scientists, engineers and philosophers, but the almost blanket ignorance of them as problems worth thinking about. The ignorance stems from cognitive biases and is fed by laziness. If the problem were staring us right in the face—like the 9/11 attacks did, like Cancer can and often does, like winter becoming summer—then we would pay attention, and we would start making noises, and we would start telling others; but by then, more often than not, it is too late, and irreversible damage is the price we pay.
Ignorance is not the problem; ignorance of ignorance is the problem. Once you know of your ignorance, it half removes itself—which is the all-important first step; once you know you don’t know something, it lingers in your consciousness, and the only way to remove it is by finding out. You can fight it, but that will only make it linger around longer.
What about forgetting? Barring head trauma or illness, the only awareness of ignorance you can forget is ignorance that has no significance. Climate Change, AI, and Declining Animal Populations have immense significance; once you know of the dangers, you can forget only by taking to your own head with a baseball bat. Don’t do that.
The Future Can Wait
Immediacy Bias
Thinking past our immediate concerns is difficult, because it requires a different thought process to the one we spend most of our time in—the one that reminds us of hunger, thirst, discomfort, sexual needs, what we’re doing after this task, what time we must pick the kids up, what to get from Costco, and so on and so forth. In other words, our default mode is one concerned with immediate matters, and we must make a conscious effort to overcome it. Rationality, long-term thinking, planning, visualising, creating—what makes us human, simply, is our ability to overcome an inescapable default focus on what matters most at this point.
Focusing on the immediate is a good thing when you’ve got crippling pain in your leg and are unable to walk, have a 10,000-word dissertation deadline tomorrow morning and only 5,000 words written down, or have signed up for a marathon next week and have never run a marathon in your life; if something is bugging you to the point that you cannot think of anything else, there may be a good reason for it. On the other hand, sacrifice and painful endurance are virtues in the world we have created; the goals and missions that we set for ourselves and as a society mean that we have to get into situations and states that don’t come naturally to us, that don’t have any immediate benefit, and, sometimes, that don’t have any obvious benefit at all. Unfortunately, nature doesn’t care much for our long-term thinking; in these uncomfortable, sometimes painful situations, we’ll be constantly reminded of what our body really wants, and what could be achieved if we just stopped making it so hard for ourselves—and just ate that donut, had that fling, slept in all day, skipped that workout, etc.
Plan your life or life will plan it for you. Sitting down to make a plan for your life, for your week, for your day, isn’t easy—there are always more ‘important’ things to do—but you have a much better chance of achieving what you want to achieve if you do. Much of who we are and who we will go on to become has been determined already—by our biology, upbringing, tastes, etc—but there is still some ounce of our destiny that we can change, however small, and it can only be realised if we take responsibility for it. Make a plan; don’t worry about making it correct or perfect—that can come later, and besides, who says you can’t edit it further down the line, when new opportunities arise or your desires change? What matters is that you make one.
Why Your Colleagues (or you) Don’t Work Hard
Kakonomics
A fascinating phenomenon that is most evident in the workplace—specifically team environments. Consider the following situation:
You and your work colleague, Jimbo, both report to the same boss. You popped over to chat with Jimbo about something you are unsure about, and it suddenly becomes starkly obvious to you how slow, inefficient and unproductive Jimbo is at his work, and, what’s more, he is unable to help you; you ask another colleague and discover, again, that they, too, are slower than you. That night you think about today’s events, and how hard you have been working since you started (which was only recently), and come to the conclusion that you are no longer going to work as hard, because, well, it’s not right that everyone but you should be able to slack off, and besides, you are only doing it for the money anyway. For the next few months you work slower and less productively, and even chat with Jimbo about how ‘easy’ the work is; and Jimbo does the same with his friends, and them with their friends, and so it goes on… The result? Progressively degrading work quality throughout the team, maybe even throughout the company.
This is Kakonomics.
What would happen if Jimbo started working harder all of a sudden, and the others noticed? Well, Jimbo would then be a threat—because his hard work may ‘show the others up’. One of two things will then happen: either the others step up their game, through fear of being exposed or outshone; or they reject Jimbo as an outcast and rationalise their nastiness as a problem on his part, when really, they are punishing him for his hard work—because if he works hard, it means they have to work hard, too.
Kakonomics is the result of a lack of incentive—i.e. no real reason to work hard—and once it takes hold, it can only be eradicated by the introduction of better incentives, or by personnel change.
It all relies on the following line of thought:
“As long as they aren’t working hard, I don’t have to work hard”,
“but if they start working hard, I also may have to start working hard”,
“—so I will make it clear to them I’m not working hard, so they, too, do not feel compelled to work hard.”
All it takes is two people with this attitude to create an epidemic. Moreover, this phenomenon is unfortunately not limited to the workplace; it exists everywhere effort is required—and what in the world that matters does not require effort? Need some examples? Look to the health, fitness, business, relationship and financial aspects of life—starting with your very own.
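A toy simulation (my own illustrative model, not a formal result from the literature on kakonomics) shows how quickly this line of thought drags a team down once everyone adjusts toward the least productive colleague.

```python
# A toy simulation of the kakonomic slide: each period, every worker drifts
# toward the lowest effort they observe in a colleague; nobody wants to be
# the one working hardest. Starting levels are illustrative assumptions.

efforts = [0.9, 0.5, 0.7, 0.8]  # assumed starting effort levels (0 to 1)

for period in range(5):
    floor = min(efforts)
    # Everyone drifts halfway toward the slacker's level each period.
    efforts = [e - 0.5 * (e - floor) for e in efforts]
    print(f"Period {period + 1}: " + ", ".join(f"{e:.2f}" for e in efforts))
# Effort converges on the minimum: a couple of 'low-quality' exchanges are
# enough to drag the whole team toward the floor.
```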
When Biases Team Up
Cognitive Dissonance and Lollapalooza
Cognitive Dissonance refers to the unsettling state that results from holding two or more contradictory beliefs or opinions. Whilst we can conjure up this state in ourselves—through exposing ourselves to opposing viewpoints and differing beliefs—it never fails to make itself known in debate environments, arguments, and in any conversation involving strong opinions.
The truth is that we all hold contradictory beliefs of one sort or another, but we do a very, very good job of burying, denying, and, when need be, defending them.10 Nobody likes being proven wrong or having their irrationalities laid bare, especially in front of others, so the discomfort is completely normal—and unavoidable. The problem is that Cognitive Dissonance can get out of control if the person fights or ignores it. Inexperienced debaters, politicians and dogmatists, for example, will react badly to someone picking holes in their opinions—usually by holding onto them even more strongly: a polarisation effect.
Charlie Munger calls this Lollapalooza. It refers, quite simply, to bias-hysteria: when several or more serious cognitive flaws are at play, to such a degree that it becomes difficult to pinpoint any particular individual flaw. Cognitive Dissonance and Lollapalooza go hand-in-hand.
Conclusion: ‘We’, and Some Remarks on Science
Throughout this discussion, ‘we’ has been the primary pronoun, for which there are a few reasons. First, without sounding flowery, we are, inescapably and undeniably, all in this together; every human who has ever lived, and who will ever live, shares, in some distant, biologically undetectable manner, the same common ancestor. For all our independence, we are inherently reliant upon one another. Without each other we are nothing. We are all drinking a potion from the same well; a potion that inspires as much life as death; a potion that over time becomes not an elixir but a poison; a potion that we have no choice but to keep drinking. We are all headed for the same destination; we may take different paths—some shorter, some longer, some treacherous, some smooth—but eventually we all end up in the same place.
On a lighter note, the second reason for my pronoun choice is that most scientific writing turns me off. Many scientists and science writers are not interested in telling stories, using analogies or digressing; in fact, any unnecessary discussion, poeticism or storytelling is, as far as I can tell, actively discouraged. The problem is, tangents, stories and first-person narrative are powerful explanation tools; they give the reader different perspectives to play with, which not only improves the reading (or listening) experience, but significantly aids memory. This roundabout way of addressing ideas can be found in the works of Carl Jung; unquestionably, it makes his work much harder reading, but it also makes it much more memorable, interesting, and symbolic. An adage comes to mind here which I think has deep relevance, and is one, I am sure, that you are more than aware of, namely, ‘The more you put in, the more you get out.’—there is a reason novels have such cultural significance.
Shakespeare, Homer and J.K. Rowling aren’t loved because of their knack for logic and rationality, but because of their ability to bring the page to life, because they give their writing spirit! There are some Science Fiction writers who have done a brilliant job of merging storytelling with stone-cold science: Neal Stephenson, H.P. Lovecraft, Isaac Asimov, H.G. Wells, Frank Herbert—just to name a few. As gifted and profound as our favourite sci-fi authors are, they nevertheless cannot bear the huge responsibility of bringing science and the world of ideas to the masses—especially at the pace information flows in the modern day.
What is needed in the field of academic science is not more straightforwardness, stone-cold logic, rationality, and data—of these we have enough—but far more engaging, personal, digressional, story-based, roundabout—even poetic—writing. We cannot do science without the former; but minus the latter it is boring, monotonous, cold, grey, stodgy, and frequently tedious. A scientific paper or book may be scientifically very deep, absolutely true in its generalities, 100% accurate in its specificities, and may be discussing a very, very important idea, but unless it comes across in an engaging manner, to the masses it means virtually nothing; it may go down well in the scientific community—if the ideas are important enough—but the likelihood of it going down well outside science is virtually non-existent. Of course, a scientist may have no interest in his ideas gaining popularity outside of science; however, even he will benefit from taking up a more compelling style of writing.
I am not saying discard the rationality and plain prose; rather, add to it: give it context, tell stories, go off on a tangent, use the first person, use personal anecdotes. When writing or speaking lacks spirit11 it makes almost no impression on the recipient; in order for an idea to catch on, to make a difference, it must must must have meaningful context, relevance—life! As impressive as our cognitive abilities are, we need a framework to attach ideas to; there are many major and game-changing ideas out there, but because people lack a framework to attach them to, they go in one ear and promptly out the other. Injecting narrative, story and digressions into scientific writing will not only make it more interesting and valuable, it will also benefit the writer himself; in the act of digressing, the communicator constructs within themselves new layers of understanding, they reach unfamiliar angles of perspective, their understanding deepens.12 It should be noted, though, that what I am suggesting here—which some would call better usage of persuasive methods—can be used for both good and evil; in this sense, we should be grateful transfixing communication isn’t so easy.
The esteemed psychologist, Abraham Maslow, finished his book Toward a Psychology of Being with an interesting piece about this very matter, and one with which I concur completely. Here are some quotes from that piece:
‘…Since it is my custom to think on paper, I have the whole thing written out. My temptation then was to throw away the rather professorial paper I was preparing for this meeting. Here was an actual, living peak-experience caught on the wing, and it illustrated very nicely (“in color”) the various points I was going to make about the acute or poignant “identity-experience.”’
‘And yet because it was so private and so unconventional, I found myself extremely reluctant to read this out loud in public and am not going to.’
‘However the self-analysis of this reluctance has made me aware of some things that I do want to talk about. The realization that this kind of paper didn’t “fit,” either for publication or for presentation at conventions or conferences, led to the question, “Why doesn’t it fit?” What is there about intellectual meetings and scientific journals that makes certain kinds of personal truth and certain styles of expression not “suitable” or appropriate?’
‘The answer that I have come to is quite appropriate for discussion here. We are groping in this meeting toward the phenomenological, the experiential, the existential, the ideographic, the unconscious, the private, the acutely personal; but it has become clear to me that we are trying to do this in an inherited intellectual atmosphere or framework which is quite unsuitable and unsympathetic, one which I might even call forbidding.’
He continues…
‘It is obviously foolish to try to do the work of personal science in a framework which is based on the very negation of what we are discovering. We cannot hope to work toward non-Aristotelianism by using a strictly Aristotelian framework. We can not move toward experiential knowledge using only the tool of abstraction. Similarly, subject-object separation discourages fusion. Dichotomizing forbids integrating. Respecting the rational, verbal, and logical as the only language of truth inhibits us in our necessary study of the non-rational, of the poetic, the mythic, the vague, the primary process, the dreamlike. The classical, impersonal and objective methods which have worked so well for some problems, don’t work well for these newer, scientific problems.’
He even goes on to remark how he believes the very issue about which he speaks is better conveyed in a series of cartoon sketches by Saul Steinberg! It is quite something for an academic of this esteem to say such things, and it highlights precisely the problem I have been discussing here, and which I will say again: Most Scientific Writing Is Dull.
He does, however, end with a word of caution about how too much free-reign could be counter-productive:
‘Most difficult of all, however, judging by my own inhibitions, will be gradually opening up our journals to papers written in rhapsodic, poetic or free association style. Some communication of some kinds of truth is best done in this way, e.g., any of the peak-experiences. Nevertheless, this is going to be hard on everybody. The most astute editors would be needed for the terrible job of separating out the scientifically useful from the great flood of trash that would surely come as soon as this door was opened. All I can suggest is a cautious trying out.’
To bring it back, the reason I have used ‘we’ should now, I hope, be quite evident. I could have used ‘you’, but that might portray the sense that the biases and shortfalls of which I have spoken affect only you, or a small portion of the population—and this couldn’t be further from the truth: they affect everyone. I could, alternatively, have been more technically abstract and used words such as ‘humans’, ‘the brain’, ‘sapiens’, or the like, but this would remove the personal and contextual element, which, as I have just discussed, makes an immense difference to memory and understanding. ‘We’ works for me.
*
You probably saw a number of connections in the cognitive flaws above, and some probably sounded repetitive; this was unavoidable, given how many flaws psychologists and social scientists have discovered, and how many of them have everyday significance in our lives. I have explained some tendencies, such as Shifting Baselines and Base-Rate Neglect, in the same passage; and others, such as Social Comparison and Jealousy/Envy, I have explained separately. For the former, I did so because either their explanations complement each other or they are difficult to disentangle; and for the latter, because I regard them as either having important dissimilarities or deserving of their own clarifications. Some passages, however, are slight variations, children, or opposites of others; and there is a good reason for this: there are a measureless number of different situations we can find ourselves in, each having its own cognitive requirements and, consequently, its own irrational expressions.
Conclusion: A Bias-Free World?
Illusions, delusions, flaws, fallacies, biases, errors: whilst these are, by technical definition, irrational tendencies, they are fundamentally deep-rooted in the structure of being, which means they are fundamentally deep-rooted in the world we have created. Our cognitive shortcomings have helped us make it this far along the evolutionary timeline, and, as I always like to point out, evolution doesn’t stop. The difference today lies in the way evolution happens: up to this point, Natural Selection (nature selecting the organisms which can reproduce most successfully) and Mutation (variations in an organism that increase its fitness and adaptability, making it more capable and successful at reproducing) have been the two principles driving evolution; today we intelligent humans have added another, multi-faceted principle: our ability to cooperate, to modify biology, and to manipulate our environment.
It’s easy to think this third principle is free of bias, irrationality, illusion and error, and maybe even that it transcends evolution, but in my opinion this is a massive mistake; evolution is the intangible and irrepressible thread that runs through everything we know about life, from bacteria to adult humans, from the sun to the galaxy, from the universe to everything outside it, from everything that ever was to everything that ever will be.
Our flaws, despite the pickles they may get us in—as individuals or as civilisations—matter, and they forever will. Even if, with our technological innovations, we transcend our human form, whatever results will still be susceptible to error—because it will be our creation. There are arguments against this view—for instance, that we are building computers much smarter than us which, because they will be self-modifying, will rise above everything that today prevents us from being 100% rational actors—but none of them are convincing. The upshot is, we cannot debias completely, and nor should we try to. An utterly bias-free person would, I imagine, be not just extraordinarily boring and tedious, but not human. Go into your garden and find an ordinary-looking pebble: that is about as interesting as such an individual would be. Absolute dispassion, complete impartiality, utter neutrality, categorical objectivity, total detachment: these states have no spirit, which means they have no life, which means they are dead.
You may be tempted to ask at this point: why learn about biases, then, if we cannot remove them, and if they matter so much? It helps to consider our relationship with music: music not understood sounds chaotic, abysmal, and useless; but understood, it is deep, beautiful, and can be life-changing. It is through this understanding of music, of sound and of rhythm that we are able to create music in which we’re able to find beauty, strength, and meaning. Albeit less so, the same applies in film and theatre; these forms of art can be entertaining for reasons we don’t understand, but when we make an attempt to understand them, their relevance can take on a different meaning.13
Take a film that you really like, but don’t really know why—perhaps it made an impression on you in your younger days, perhaps it made you cry—and rewatch it, but this time do so with a set of questions: ‘What is this film trying to say?’, ‘Why did the writer/producer/director make this film?’, ‘What can I learn about human nature?’, ‘Are there any other profundities that I am not seeing—or that the producers don’t even know about?’ and ‘Why exactly does this film impress me?’ Do this and your appreciation for the film—and the other films you watch—will take on a different meaning; you will look at films entirely differently in the future.
I digress. Becoming aware of why and how you, and every other person on the planet, are irrational and error-prone automatically gives you a leg up in life: you’ll be more accepting—and necessarily so—of the trivial events that arise day-to-day, which have no significance until you give them significance, which, of course, often means they escalate (e.g. an argument over who has the remote, or who should do the shopping, or with a bad driver, or with your children…). Learning about how and why humans are flawed creatures will make you a better communicator; you will understand why people do the things they do; you will understand more of the world you live in, and of crucial subjects like history, politics, and the market; and most importantly, you will become a better thinker.
Kierkegaard once said, ‘You become what you understand.’ Fight long enough with the cognitive shortcomings discussed above until you understand them deeply. Once you understand them deeply, you will literally not be able to stop yourself noticing them in yourself and in others. When you notice them in yourself you can stop, and you can think again, which can make a profound difference—in any moment, of course, but especially those in which the stakes are high, such as a business deal or a relationship crisis. It works the other way too: knowing how vulnerable humans are to irrational tendencies, you can question before you see them; better put, instead of relying on your knowledge to help you identify flaws, you can consciously look for them, in the reasoning of both yourself and others.
Whilst you can be as ruthless as you wish with your own flaws, you do so at your peril when it comes to the flaws of other people. You may be spot-on in your identification of their errors, and you may have the perfect solution, but unless they’ve read this article, or are open to critique, they will most likely take offence; therefore, unless you want a busted lip, a toned-down approach is necessary. For example, if you have identified flaws in the rationale of your boss, it probably isn’t a good idea to lay them bare on his office table; better options include weighing up the consequences, letting him remain irrational, leaving the job, slipping your critiques between two pieces of praise (for which the hyper-technical term is ‘criticism sandwich’), or asking someone closer to him to do it. When pointing out flaws in the reasoning and logic of others, context matters; make sure that you give examples, your own valid reasoning, and, preferably, solutions. Remember also that you could indeed be wrong, so in addition to solidifying your own reasoning, make sure you remain open to criticism. The advancement of civilisation is fundamentally dependent upon the criticism of ideas.
Conclusion: The Role Of Belief
The imperfections in cognition pointed out in this article are, I believe, by no means comprehensive; there is undoubtedly a great score of other influences that affect how we think and act—some that we know about, some that we don’t yet know about, and some that we can never truly know about. Tons of books and articles have been written on the subject at this point, and tons more will be written in the future—perhaps disproving everything we know today. But that is unlikely. More likely, what we understand today will be that upon which tomorrow’s understanding is built: this is generally how progress in knowledge occurs.
To shine light on one example, I did not discuss in this article the role of religion, or how we think about God. There is good reason for this: such a discussion is bottomless, controversial, and the antithesis of straightforwardness. Thankfully, however, a few brilliant thinkers have already shouldered the burden of such an undertaking and written some fantastic books on the subject of God and the role of the many religions.14 Still, a few things are worth mentioning.
The beliefs that we hold are, literally, who we are; this is another variation of the Kierkegaard quote I gave earlier,15 and it refers, quite profoundly, to how the highest ideas we hold act as guiding stars as we plough our way through the chaos and struggle, through life. The connection between religion and the Doubt Removal Tendency is interesting to think about: namely, that the reason we humans created religion in the first place, and why it has had and continues to have such a momentous role in our history, is that it allows us to quell doubt; it provides a mechanism with which to satisfy our need for certainty about the cosmos, death, morality, good and evil, meaning, pain and suffering, consciousness, and all the other facets of existence that are apt to make us worry. The stories about the afterlife, Noah, Allah, Heaven and Hell, Adam and Eve, Karma, Purgatory, Sin—whilst they may not be true in the scientific, objective sense, they have been and continue to be true in the pragmatic sense; humans originally created these stories and fictions to decrease their uncertainties about, well, everything, and as they developed, their real value began to show: they could be used to find meaning, to encourage good, to form coalitions, to make sense of struggle and suffering, to find reason in pain, to see the value in sacrificial practices, to establish order. That they are not factually true doesn’t matter; their value lies in how they help adopters live good, meaningful lives.
We are slowly moving on, here, to the central problem of new-age atheism and a large proportion of Western Philosophy, namely, their failure to actually help people in any way, and to solve the central problem of existence: meaning. Why people cling on to religion today, in spite of its obvious falsehoods and many culturally ancient practices and ideologies, is precisely because of meaning; religion gives them purpose, keeps them sane; with it they are able to make sense of life, with it they are enlightened. Enlightened is a big word, and maybe the wrong one; but when you think about where many people would be without religion, it is precisely the right word. ‘Enlightened’ is not a term regularly used to describe atheists, new-agers, agnostics, and illuminati-fans; more often used are ‘confused’, ‘depressed’, ‘lost’, ‘uncertain’, ‘dogmatic’ or ‘deluded’. This is not to say meaning cannot be discovered outside of religion and ideas about God, no; there is certainly a fair share of atheists and non-believers who care nothing for religion, but get by just fine, and perhaps even better than if they were religious—perhaps they used to be religious, but on realising its flaws, are no longer, and feel much ‘better off’ for it. Scientists, teachers, dedicated practitioners of meditation, and philosophers are good examples of the types of people who live good lives as non-believers. The problem, though, is obvious: most people are not scientists, philosophers, or meditators.
You see, it is not that a disbeliever in God or religion cannot live a beautiful, rich, meaningful life; they can, but unless they are very lucky they will need to completely restructure their belief systems, rethink their ideas about the world and reality, and reorganise their lives around mindful practices like meditation and exercise; they will, literally, have to turn their inner world upside down. And the fact of the matter is, most people are either not intelligent, patient, caring or interested enough to do this, especially those who believe their beliefs are serving them just fine. Religion and the concept of God are the glue that holds civilisation together, the brackets upon which our world rests; the alternative is a world we irrational and still-young humans are utterly ill-equipped to handle; the alternative, as far as I can see, and as modern-day horrors repetitively show, is chaos, is nightmarish, is hell.
As pessimistic as this sounds, there is much reason to be optimistic. More and more people are not abandoning religion or the idea of God outright, but adopting what they find useful and dropping what is not; and they are finding much of the personal benefit they got from religion through other practices, such as mindfulness, meditation, and yoga. For thousands of years, going against religion in any way was a punishable offence; today it is not only legal but encouraged, and, if done properly, can be life-changing. There are many possible reasons for this, science being one (strong) possibility—for it is through science that we have been so mercilessly confronted with the many falsehoods of religion; further, the scientific method is built on top of rational thinking, and rational thinking is the tool people need to find what is useful and discard what is useless.16
Here is the thing: the highest idea you hold is your God. Therefore, when God no longer is—when, as Nietzsche said, ‘God is dead’—what is left is not some higher purpose or deep sense of meaning, or a far-reaching desire to do good and to love thy neighbour, but uncertainty, purposelessness, despair, anarchy, which are themselves strong symptoms of looming rebellion, totalitarianism, and evil. Any previously devout believer in God will know all about the deprivation, emptiness and floundering that follows the exorcism of a strongly held belief—and there is arguably no stronger belief to hold than that of a master creator, of a miracle worker, of a God or group of Gods, of a Holy One, of a being superior to all other beings.
Tying together the scientifically recognised flaws in cognition with the influence of the beliefs we hold is not an easy task; however, I do think this is something the individual has to do for themselves—that is, they must think deeply about how their beliefs and errors are influencing how they live, and make changes where necessary, however painful. If this answer does nothing but increase your uncertainty, good!—let this be your inspiration to do your own investigation. Yes!—analyse every line of reasoning you offer up for any irrational tendencies and misbeliefs and determine whether you are hindered or helped by them, that is, whether they actually matter or not; and go through what you believe about the world, your place in it, and about others, and identify how these beliefs are controlling you, and, if need be, modify them; and keep yourself on sharp guard for irrationality and harmful ideas—yes, in others, but most of all, in yourself.
Our highest ideas are our Gods; we do not have hold of our beliefs but they us; ideas and beliefs are not our possessions, but our masters. Believing in the importance of understanding the biases and other flaws in human thinking is also a belief that will control how you think and act; but it is, for all practical purposes, a belief that can only serve to help. In this sense, helpful beliefs are not so much our masters, but our ever-present guardian angels; harmful beliefs, on the other hand, have us as their submissive slaves.
(footnotes below)
Footnotes
1. Unless you are a shareholder, of course.
2. There is really no such thing as a ‘wise’ gambler.
3. Don’t gamble.
4. Most people don’t know what they believe. Asking somebody what they believe—and why and how their beliefs serve them—will very rarely result in an accurate response; unless the individual has done a lot of thinking about their ideas of truth and belief, they are apt to be wrong. Why? Read about cognitive biases—there are lots of them. It is a good idea to articulate, to yourself, what truth means to you and why; and how your beliefs serve you and why you hold them.
5. Technical definition of apophenia: ‘A process of repetitively and monotonously experiencing abnormal meanings in the entire surrounding experiential field.’
6. Not the name Jack; the other jack.
7. Donald, if you’re interested, was the first name that sprang to mind. I don’t know why.
8. If the study of climate change were as old as humans…
9. I agree with this only so far. Talking about attention span without context only tells you so much—and not a lot. I, for one, can focus easily on some tasks, whilst in others I feel totally agitated. It boils down to interest.
10. This is why we can never be 100% rational.
11. By ‘spirit’ I mean life, interestingness, engagement, energy, relevance, context.
12. Another way to think about this is as building new neural pathways inside your brain—building new bridges between input and your conscious understanding. This is not just a pertinent metaphor, either; when you learn something new, your brain literally, physically changes.
13. Music is much easier to create without the creator actually having a clue about what they’re doing; whereas the producers of films tend to have a reason, that is, a deep understanding of the message(s) they’re trying to put across. But it even works when the creator has no other intention than entertainment; meaning can be extracted from many different places, and entertainment is no exception.
14. A few examples: Maps of Meaning, Jordan Peterson; The Beginning of Infinity, David Deutsch; Antifragile: Things That Gain From Disorder, Nassim Taleb; The Moral Landscape, Sam Harris. I by no means agree with everything said in these books, but with most I do; and the main value lies not in whether one agrees or not, but in the development of one’s own understanding—these books are examples of writing that does just that, by opening you up to whole new worlds of perspective.
15. ‘You become what you understand.’
16. As Bruce Lee liked to say.