Decision making lies at the heart of our personal and professional lives. Every day we make decisions. Some are small, domestic, and innocuous. Others are more important, affecting people's lives, livelihoods, and well-being. Inevitably, we make mistakes along the way. The daunting reality is that enormously important decisions made by intelligent, responsible people with the best information and intentions are sometimes hopelessly flawed.
Consider Jürgen Schrempp, CEO of Daimler-Benz. He led the merger of Chrysler and Daimler against internal opposition. Nine years later, Daimler was forced to virtually give Chrysler away in a private equity deal. Steve Russell, chief executive of Boots, the UK drugstore chain, launched a health care strategy designed to differentiate the stores from competitors and grow through new health care services such as dentistry. It turned out, though, that Boots managers did not have the skills needed to succeed in health care services, and many of these markets offered little profit potential. The strategy contributed to Russell's early departure from the top job. Brigadier General Matthew Broderick, chief of the Homeland Security Operations Center, was responsible for alerting President Bush and other senior government officials if Hurricane Katrina breached the levees in New Orleans. Despite multiple reports of breaches, he went home on Monday, August 29, 2005, after reporting that the levees seemed to be holding.
All these executives were highly qualified for their jobs, and yet they made decisions that soon seemed clearly wrong. Why? And more important, how can we avoid making similar mistakes? This is the topic we've been exploring for the past four years, and the journey has taken us deep into a field called decision neuroscience. We began by assembling a database of 83 decisions that we felt were flawed at the time they were made. From our analysis of these cases, we concluded that flawed decisions start with errors of judgment made by influential individuals. Hence we needed to understand how these errors of judgment occur.
In the following pages, we will describe the conditions that promote errors of judgment and explore ways organizations can build protections into the decision-making process to reduce the risk of mistakes. We'll conclude by showing how two leading companies applied the approach we describe. To put all this in context, however, we first need to understand just how the human brain forms its judgments.
How the Brain Trips Up
We depend primarily on two hardwired processes for decision making. Our brains assess what's going on using pattern recognition, and we react to that information, or ignore it, because of emotional tags that are stored in our memories. Both of these processes are normally reliable; they are part of our evolutionary advantage. But in certain circumstances, both can let us down.
Pattern recognition is a complex process that integrates information from as many as 30 different parts of the brain. Faced with a new situation, we make assumptions based on prior experiences and judgments. Thus a chess master can assess a chess game and choose a high-quality move in as little as six seconds by drawing on patterns he or she has seen before. But pattern recognition can also mislead us. When we're dealing with seemingly familiar situations, our brains can cause us to think we understand them when we don't.
What happened to Matthew Broderick during Hurricane Katrina is instructive. Broderick had been involved in operations centers in Vietnam and in other military engagements, and he had led the Homeland Security Operations Center during previous hurricanes. These experiences had taught him that early reports surrounding a major event are often false: It's better to wait for the "ground truth" from a reliable source before acting. Unfortunately, he had no experience with a hurricane hitting a city built below sea level.
By late on August 29, some 12 hours after Katrina hit New Orleans, Broderick had received 17 reports of major flooding and levee breaches. But he also had gotten conflicting information. The Army Corps of Engineers had reported that it had no evidence of levee breaches, and a late afternoon CNN report from Bourbon Street in the French Quarter had shown city dwellers partying and claiming they had dodged the bullet. Broderick's pattern-recognition process told him that these contrary reports were the ground truth he was looking for. So before going home for the night, he issued a situation report stating that the levees had not been breached, although he did add that further assessment would be needed the next day.