(Un)recipe for decision-making

When it comes to decision-making, our choices aren't always rational. But is there a good recipe to make better decisions?

by Veronica Pelazzini


It’s a rainy day in the ’90s. You just got home after a long day, and all you want to do is unwind by watching a movie in the comfort of your apartment. The answer to your need is simple: Blockbuster — the most popular video rental store, founded by David Cook in 1985. Flash-forward to 2023, and you’re still on your couch, rain pattering just outside. Your eyeballs are glued to the screen, but the grainy frames of standard-definition TV have given way to Netflix, in all its real-time streaming glory.

A lot of things have happened in the intervening years, and some of them led to Blockbuster’s demise. Although debt and changes in the industry contributed to its bankruptcy, bad (or, some would argue, irrational) decisions were made. The most famous: when Netflix co-founder Reed Hastings offered to sell Netflix to Blockbuster for $50 million in 2000, the latter failed to conduct a comprehensive analysis of the deal and turned it down. The consequences of this decision were disastrous.

Maybe you remember Balenciaga’s unfortunate November 2022 campaign, “Gift Shop” — a more recent example of a decision-making short circuit. The campaign featured what many saw as sexualized children, and the outrage on social media was followed by an official message on Balenciaga’s Instagram profile, in which president and CEO Cédric Charbit explicitly admitted that the Gift Shop campaign “was a wrong choice by Balenciaga”.

These are two of countless cases where better management decisions could have been made. The questions we want to address here are, first, whether the best decisions are rational decisions, and, second, whether there is a recipe for making optimal decisions, especially when organizations are involved.

First, I did what everyone does these days: I summoned ChatGPT and asked it what makes a decision “rational”. The answer: “Rational decision-making is a structured and objective process that is based on available evidence and is in the best interest of the decision maker or the organization.” We are taught that sound decisions can only come from a clear, logical mind, and that emotions must be excluded if we are to find the best solution to a problem. A rational decision, therefore, appears to be objective, informed, and consistent: decisions are considered rational when they are based on logical reasoning and evidence. In real life, however, many of our behaviors seem to defy logic, and if we look closely, we can also see that contradictions tend to repeat themselves, following recognizable patterns and giving rise to heuristics and biases.

To understand why and how this happens, we need to look at how we make decisions: the better we understand the process, the better we can manage it. And, maybe, get closer to answering the question of whether there is, in fact, a recipe for good decision-making.

The evolution of our brains has allowed us to reduce complexity by simplifying, using a set of mental shortcuts called heuristics — which, in some cases, can lead to bias.

Neuroscience research, using functional magnetic resonance imaging (fMRI), has shown that the prefrontal cortex, which covers the front part of the frontal lobe, plays a critical role in decision-making, allowing us to take a long-term view when evaluating rewards and risks. But how is the prefrontal cortex relevant to our decisions? We can gain insight by looking at the stories of people whose conditions, accidents, or injuries have allowed researchers to better understand the brain and its impact on behavioral and cognitive functions. Phineas Gage is a name you will almost certainly encounter when approaching the study of the nervous system.

The event that made Gage’s reputation as the man who inaugurated modern neuroscience dates back to 1848, when he was a 25-year-old railroad worker. That year, he suffered a severe accident in which an iron bar entered under his left cheekbone and smashed through his left frontal lobe. Gage survived but, according to the notes of John Harlow, the doctor who followed his case, his personality and behavior were dramatically affected. The case is remarkable for neuroscience because it demonstrated the role of the prefrontal cortex and, needless to say, how strongly Gage’s decision-making was altered by the injury to this area.

A similar story is that of Antonio Damasio’s patient Elliot, who, as Damasio reports in “Descartes’ Error” (1994), went from being an exemplary businessman to losing track of his work responsibilities. A brain tumor pressing into his frontal lobe was later removed, leaving Elliot unable to choose and decide effectively: he would analyze every possible perspective of any problem in endless detail. The damage did not affect his rationality — his IQ, memory, language, and so on remained satisfactory; rather, his emotions were altered, illustrating the essential link between emotions and decision-making.

The prefrontal cortex, though, is a recent evolutionary development — and yet our ancestors made decisions long before it appeared. Researchers have found that patients with damage to the limbic system (a more ancient group of brain structures essential to generating emotion) also struggle with decision-making. People, indeed, use two types of mental processing when making decisions: a fast, unconscious, often emotion-driven system (named “System 1” by Daniel Kahneman), and a more analytical, deliberative system that “rationally” weighs benefits and costs (“System 2”). Formal logic, then, is not the only guide to our decisions: our emotion-driven, older brain is heavily involved in every decision we make, even the rational ones. No matter how sophisticated our frontal lobes are, our choices are not always optimal, and rules of thumb inevitably guide at least some of our decision-making.

Moreover, the environment in which we live is not only different from the one in which our cognitive mechanisms evolved: it is far more complex, and gathering and analyzing all the information around us would require endless time and resources — too much information, too little time. The evolution of our brains has allowed us to reduce complexity by simplifying, using a set of mental shortcuts called heuristics — which, in some cases, can lead to bias. A typical example of a heuristic critical to decision-making is framing: the way a problem is framed (or presented) strongly influences our reactions, as when we prefer a container of yogurt that says “80% fat-free” to one that says “contains 20% fat”. The consequences of framing may be more significant than we expect: most people would be willing to undergo surgery with a “95% success rate”, but would hesitate if told that the procedure has a “5% failure rate”.
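The two surgery framings describe exactly the same odds. A minimal Python sketch, using only the figures above, makes the equivalence explicit — the numbers are identical even though our intuitions treat the statements differently:

```python
# Framing: "95% success rate" and "5% failure rate" are the same fact.
success_rate = 0.95                  # framing A: emphasizes success
failure_rate = 1 - success_rate      # framing B: emphasizes failure

print(f"Framing A: {success_rate:.0%} of procedures succeed")
print(f"Framing B: {failure_rate:.0%} of procedures fail")

# The two statements are logically equivalent (they sum to 100%)...
assert abs(success_rate + failure_rate - 1.0) < 1e-9
# ...yet framing research consistently finds people react to them differently.
```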

All of these elements influence not only decisions made in private life but can have major consequences for organizations — decisions about whether to retain or let go of an employee, enter new markets, merge with other companies, or delegate responsibilities. Looking at successful business cases of the past, we would say that the experience and intuition of leaders have been the most critical driving force behind decision-making processes. It is also true that abysmal decisions have been made when business leaders — the very ones who led their companies to success — relied solely on instinct and experience. Again, Blockbuster’s leadership made a series of decisions that led to the company’s downfall, and the failure to vet the content (and context) of the Gift Shop campaign put Balenciaga at the center of a firestorm.


We mentioned that biases can arise from heuristics, and in organizations biases can have disastrous consequences. Cognitive biases can be defined as a set of predictable mental errors that lead to irrational decisions, causing people to misjudge risks and threats — for example, confirmation bias makes us seek information that supports our beliefs while rejecting data that contradicts them. And if biases are at the root of many irrational beliefs — and, as a consequence, irrational decisions — one might argue that automated decision-making, which relies on algorithms, data, and machines, could do a better job than executive decision-making.

Organizations can indeed use data to make crucial decisions that may increase revenue, improve internal processes, and help acquire customers and retain them over time. We should not forget, however, that wrong decisions can be made even when data and analysts are involved. An organization may have a lot of data that is simply not relevant — perhaps because it omits key variables, carries an intrinsic bias, or comes from unreliable sources. Or it may have very good data that is poorly analyzed: in this scenario, good data leads to poor decisions — data-driven mistakes.

Possibly, the best-case scenario involves a reasonable balance between data and human factors. To achieve it, leaders and managers should cultivate emotional self-awareness — for instance, the ability to recognize how their moods affect their decisions. This is especially true for excitement and fear, which are closely tied to reward-seeking and loss aversion. Decisions are, more often than not, linked to leaders’ attitudes: are they risk-averse, or do they thrive on chasing risky deals? These are key elements of emotional self-awareness, which fosters an understanding of one’s emotions and their effects on performance. Decisions, then, even when supported by data, are better when combined with leadership qualities such as emotional intelligence, the ability to weigh evidence against intuition, and the capacity to deal with uncertainty.

Leaders’ decisions are not only fundamental to business success: they also have a powerful impact on the people around them and on the decisions those people make – a kind of cascade effect – as well as on the workplace itself. This last point is not limited to physical spaces: leaders have the power to establish an environment that fosters and supports diversity, understood as a variety of skills, profiles, personalities, and backgrounds. A diverse environment can act as an antidote to the risk of standardization, whether rooted in shared biases or in decisions resting on a single person’s judgment. In addition, leaders can foster a stimulating and challenging work environment: one that promotes and cultivates divergent opinions and ultimately pushes people out of their comfort zones – in short, a workplace where confirmation bias is kept in check and empowering challenges are encouraged.

So… how can we avoid a Blockbuster-like catastrophe? Even though there is no single recipe for decision-making, people — and leaders in particular — can focus on its key ingredients: using data analysis to assess background information, cultivating social circles that include people who offer critical viewpoints, and listening empathetically to their perspectives, even — and above all — when they convey unwelcome information, while not forgetting, at the same time, to train their emotional self-awareness.