Economics has a lot to do with studying how we, businesses, and nations make decisions in various settings, natural or experimental. Finance has a lot to do with making decisions about investments, savings, and the allocation of capital under uncertainty and across time. Accounting has a lot to do with giving us a way to understand whether those decisions had good or bad outcomes.
Hence, in order to properly understand the basics of the above-mentioned disciplines — and the multitude of places in practice and theory where they overlap, and where the lines between them blur — I believe we need to understand more about decision-making in and of itself. This is also incredibly valuable in everyday life.
Let us start with what a good decision is. And let us do that with an example a Cambridge professor once used in a management class when I studied there. Funnily enough, the example he used is one from Swedish history.
(Picture from Warfare History Network)
After a long and grueling march, the Swedish forces reached the outskirts of the Estonian city of Narva. On November 20, 1700, the 18-year-old Swedish king Charles XII (kung Carl) — leading a vastly outnumbered army of 10,500 soldiers — ordered an immediate assault on the Russian fortifications, despite facing a force of over 35,000. A snowstorm, with the wind at the Swedes’ backs and directly in the faces of the Russian troops, created confusion and disorder in the Russian ranks. Following a chaotic retreat, the Russian army surrendered. The Swedes lost 667 men, while Russian casualties totaled around 9,000, and the entire Russian high command was captured.
Was this a good decision by King Charles? Based on the outcome, yes. And I would dare say that this is the most common way we evaluate decisions — based on their outcomes. But that might not be the best way. King Charles’s choice to engage a much larger force, after a long and exhausting march, against a fortified city, and in poor weather conditions, is at the very least highly questionable. His success likely did not rest on superior planning or sound military reasoning, but on something as uncontrollable as the wind direction during a snowstorm. This sounds like a bad decision, but one that resulted in a good outcome, what we might call dumb luck.
Have a look at the table below for a straightforward way to categorize and name combinations of good or bad decisions and their corresponding good or bad outcomes.
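Since tables do not always render well, here is the same two-by-two spelled out as a small Python sketch. Only the "dumb luck" label comes from the discussion above; the other three labels are placeholders of my own choosing, not established terms.

```python
# A minimal sketch of the decision-quality vs. outcome-quality grid.
# Only "dumb luck" is a label used in the text; the other three are
# illustrative placeholders.

labels = {
    ("good decision", "good outcome"): "deserved success",  # placeholder
    ("good decision", "bad outcome"): "bad luck",            # placeholder
    ("bad decision", "good outcome"): "dumb luck",           # named in the text
    ("bad decision", "bad outcome"): "just deserts",         # placeholder
}

def categorize(decision: str, outcome: str) -> str:
    """Look up the label for a decision-quality / outcome-quality pair."""
    return labels[(decision, outcome)]

# King Charles at Narva: arguably a bad decision with a good outcome.
print(categorize("bad decision", "good outcome"))  # -> dumb luck
```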
If a good decision is not simply one with a good outcome, then what is a good decision?
Well, probably something along these lines:
A good decision is one that is made through a thoughtful process using the best available information, careful reasoning, and clear alignment with one’s goals and values — regardless of whether the outcome turns out to be favorable. It accounts for uncertainty, weighs risks and trade-offs, and remains open to revision as new information emerges. In essence, a good decision is judged by how it was made, not just by what happened afterward.
Yes. I mean that does sound like a good decision. But we do not really make decisions that way now, do we?
No, we do not. At least not all of our decisions. We are prone to a lot of mental traps, or biases. These often appear when we apply mental shortcuts or rules of thumb to our decisions — what are called heuristics.
Many of you will have heard of, or perhaps even read, the book Thinking, Fast and Slow by Daniel Kahneman. In it, he categorizes our thinking into two different systems: System 1 thinking, which is intuitive, associative, holistic, automatic, undemanding of cognitive capacity, and relatively fast; and System 2 thinking, which is reasoning, rule-based, analytic, controlled, demanding of cognitive capacity, and relatively slow.
We often fall for biases, and hence make bad decisions — or at least fool ourselves a bit. And it usually happens when we rely on System 1 thinking. Let us go over some of the most common biases.
One of the most well-known is confirmation bias. This is when we seek out or interpret information in ways that support what we already believe. We see what we expect to see, what we want to see. For example, if we believe a certain diet is healthy, we will focus on articles and testimonials that support it — and dismiss studies that raise concerns as biased or flawed. And we tend to apply much stricter scrutiny to evidence that challenges our beliefs than to evidence that supports them. It is part of why opinions get so entrenched, even in the face of new information.
Then there is the sunk cost fallacy, also known as escalation of commitment. This happens when we continue down a path just because we have already put time, money, or effort into it — even if that path no longer makes sense. I have seen this play out many times in practice, where investors continue to pour money into a bad investment simply because so much has already been committed, and they struggle to let go or accept the losses. Another example: imagine spending hours watching a movie you are not enjoying, yet refusing to stop halfway through because you "already spent so much time on it." Even though walking away would free up your evening, you stay and keep watching — just to justify the time already lost.
Another common trap is anchoring. This is when the first piece of information we receive becomes a kind of mental benchmark — even if it is completely irrelevant. For example, James Montier conducted a test with over 300 fund managers. First, he asked them to write down the last four digits of their phone number. Then, he asked whether they believed there were more or fewer doctors in London than that number. Interestingly, the fund managers with higher phone digits (over 7000) estimated, on average, just over 8,000 doctors — while those with lower digits (under 3000) guessed closer to 4,000. A completely irrelevant number impacted their estimates.
We also struggle with framing effects — where the way a choice is presented can shape what we choose, even when the underlying numbers are the same. In one classic example — there is an outbreak of an unusual disease, expected to kill 600 people — participants were asked to choose between two health interventions. When the scenario was framed as saving lives, nearly 80% of people chose the certain option (200 people will be saved, which is the same as 400 people dying). But when the exact same scenario was framed in terms of people dying, only 56% went for the sure thing (400 people will die, which is the same as 200 people being saved). This example also comes from James Montier’s study. The math had not changed — only the wording. This is a clear example of a preference reversal, where people change their choice based on how the options are framed. And that is a serious problem for economic theory, which assumes that preferences are stable. Remember? We touched upon this in an earlier post: The Rational Man Is A Fable.
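To make the equivalence explicit, here is the arithmetic behind both frames. The sure option is stated above; the risky alternative shown below is the one from the classic Tversky and Kahneman version of this problem, which Montier's test mirrors, so treat it as context rather than a quote from his study.

```latex
% The two frames describe the same facts:
\text{saved} + \text{dead} = 600
\quad\Longrightarrow\quad
200\ \text{saved} \iff 400\ \text{dead}

% In the classic version, the alternative is a gamble with a 1/3 chance of saving everyone:
\mathbb{E}[\text{saved}] = \tfrac{1}{3}\cdot 600 + \tfrac{2}{3}\cdot 0 = 200
```

Both options, in both frames, are identical in expectation; only the wording changes, and yet the choices flip.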
This brings us to loss aversion, another concept explained in the book Thinking, Fast and Slow. Losses hurt about twice as much as equivalent gains feel good. We are not just sensitive to outcomes — we are disproportionately afraid of losing. And that fear shapes far more of our decisions than we might like to admit.
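For those who like to see this written down, one common way to formalize loss aversion is the value function from Kahneman and Tversky's prospect theory; the parameter estimates below are the ones usually quoted in the literature, not numbers from this post.

```latex
v(x) =
\begin{cases}
  x^{\alpha}                & \text{for gains } (x \ge 0) \\
  -\lambda \, (-x)^{\alpha} & \text{for losses } (x < 0)
\end{cases}
\qquad \alpha \approx 0.88, \quad \lambda \approx 2.25
```

With a lambda a bit above two, losing 100 feels roughly twice as bad as winning 100 feels good, which is exactly the asymmetry described above.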
Then there are the more subtle but equally powerful action-oriented biases, like overconfidence and overoptimism. We tend to believe our judgments are more accurate than they actually are. We overrate our ability to predict things, spot trends, or avoid mistakes. And we often think the future will work out better than it realistically might — especially when it comes to our own plans. Again, in James Montier’s study Behaving Badly (2006), 74% of fund managers believed they were above average at their jobs. Some even added comments along the lines of: “I know everyone says they are, but I really am”. Sound familiar? Let me ask you this: are you an above-average driver? Most of you will most likely answer yes. Am I right?
And finally, there are the so-called stability biases. These are biases that make us stick with what we know — even when change would be better. Status quo bias makes us favor whatever the current state is. It is easier to keep things as they are, even if they are not ideal. Think of someone who stays in a job they no longer enjoy, simply because it is familiar, the routines are in place, and change feels uncertain or exhausting. Combine that with sunk costs — all the time and effort already invested — and you get inertia: staying stuck.
Each of these biases on its own can push us off course. And in real decisions, they often appear together — reinforcing each other in subtle ways. And that is what makes decision-making so tricky. We are not flawed thinkers — we are just human. But recognizing these traps is the first step to spotting them more clearly when they arise.
And remember, many of the examples above came from a study where the participants were professional fund managers — people we can reasonably assume to be highly intelligent, well-educated, and trained in making decisions under uncertainty. Yet even they fell into these mental traps.
But I do not want to leave you here. While awareness of these biases is the starting point for improving decision-making under uncertainty — and you already have a leg up on many just by knowing about them — there are also specific techniques we can use. These are called cognitive repairs. The idea behind cognitive repairs is to “de-bias” our thinking — to reduce the errors and distortions that these mental shortcuts can cause.
These cognitive repairs can take different forms. Some are more informal — simple yet powerful mental habits we can build into our decision-making. For example, considering the opposite of what you believe, or deliberately asking yourself to think of an alternative explanation. You might use the 5 Whys to dig deeper into the root of a problem, or run a pre-mortem to imagine why a decision might fail before it is made. Some teams use red teams or devil’s advocates — people specifically tasked with challenging the dominant view. Even a good pros and cons list can help clarify your reasoning.
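As a small illustration of how little machinery this takes, here is a toy version of a weighted pros-and-cons list in Python, using the stay-or-leave-a-job situation from earlier. The items, signs, and weights are invented for the example, not a method anyone prescribes.

```python
# A toy weighted pros-and-cons list for the stay-or-leave-a-job situation
# mentioned earlier. Items and weights are invented for illustration only.

considerations = [
    # (description, sign: +1 pro of staying / -1 con of staying, weight 1-5)
    ("familiar routines and colleagues", +1, 2),
    ("steady salary", +1, 3),
    ("no longer enjoy the work", -1, 4),
    ("few chances to learn new things", -1, 3),
]

def net_score(items):
    """Sum of sign * weight: positive leans towards staying, negative towards change."""
    return sum(sign * weight for _, sign, weight in items)

print(f"Net score: {net_score(considerations):+d}")  # -> Net score: -2

# The number itself is not the point: having to write the items down and
# weigh them is what drags the decision from System 1 into System 2.
```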
And here I need to bring up the fascinating and funny example of Darwin’s pros and cons list on marriage, which he wrote before proposing to his cousin Emma Wedgwood in 1838. The picture below shows his first note on marriage.
(The original is held in the Darwin Archive at Cambridge University Library. Source here)
On the pro side, he noted the appeal of having “a constant companion (& friend in old age)… better than a dog anyhow.” He envisioned a comforting domestic scene with a “nice soft wife on a sofa with good fire, & books & music perhaps.” Marriage, he thought, would offer “someone to take care of the house,” and might improve his health and social life by forcing him to “visit and receive relations.” Most of all, he feared the alternative: “Imagine living all one's day solitarily in smoky dirty London… No, no — won't do.”
On the con side, his tone shifted. He worried about the “expense and anxiety of children,” the “loss of time,” and not being able to “read in the evenings.” He disliked the idea of having to “visit relatives and bend in every trifle,” and was wary of “fatness and idleness” setting in. Worst of all, he feared marriage might mean “banishment and degradation into indolent, idle fool.”
Some of it reads today as unintentionally funny, or perhaps refreshingly honest — like weighing the value of a wife against that of a dog, or dreading the loss of solitary intellectual life as though it were a terminal condition. But what makes it enduring is not just the list itself, but the spirit behind it: Darwin treated marriage as a serious decision worth reflection, balancing emotion with clear-headed analysis.
And in the end, despite the cons, he decided to marry, which he and Emma did on the 29th of January 1839.
I think this is as good a place as any to end. Of course, as always, there is much more to say. But I hope you have learned something — or at the very least, been a bit amused or entertained. I certainly enjoyed writing this. Thanks for stopping by, and if you feel like it, please share the post or follow Deeponomics on the socials linked below.
Podcast on Spotify and Apple Podcasts
References:
Kahneman, D. (2011). Thinking, fast and slow. London: Penguin Books.
Montier, J. (2006). Behaving badly. Global Equity Strategy, Dresdner Kleinwort Wasserstein. [online] SSRN. Available at: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=890563