Deception and Estimation: How We Fool Ourselves
Scientific research suggests that humans are biased, not-very-rational decision-makers. We believe we see things clearly when the evidence shows otherwise.
I teach classes on influence. After introducing a particular strategy, I ask the participants whether they believe they would be swayed by it, and almost all of them say they would not. We hold such strong beliefs about our own abilities and intentions.
Here’s one of my favorite examples. Wharton School marketing professor Jonah Berger asks participants in his studies whether they would drive a few miles out of their way to get an item on sale when a nearby store sells it at the regular price of $75. Surprisingly, the answer depends on how the sale price is framed: as 25% off or as $18.75 off! Berger has found that for an item regularly priced under $100, the percentage discount looks larger and draws more customers. The reverse is true for items regularly priced over $100—for an item regularly priced at $375, a $93.75 discount will attract more customers than the equivalent 25% off. He calls this the “Rule of 100.”
How can this be? For rational decision-makers, there should be no difference between the percentage and the actual dollar discount. This result holds true even for very smart people—like you!
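The arithmetic behind the Rule of 100 can be sketched in a few lines. This is purely an illustration of the framing effect Berger describes, not his methodology; the function name and threshold logic are my own shorthand:

```python
def larger_looking_frame(regular_price, discount_fraction):
    """Illustrative sketch of the 'Rule of 100' framing effect:
    below $100 the percentage number is the bigger number; above $100
    the dollar number is bigger, even though the saving is identical."""
    percent_number = discount_fraction * 100           # e.g., 25 (as in "25% off")
    dollar_number = discount_fraction * regular_price  # e.g., 18.75 (as in "$18.75 off")
    return "percent" if percent_number > dollar_number else "dollars"

# 25% off a $75 item: 25 > 18.75, so the percentage framing looks bigger.
print(larger_looking_frame(75, 0.25))   # percent

# 25% off a $375 item: 93.75 > 25, so the dollar framing looks bigger.
print(larger_looking_frame(375, 0.25))  # dollars
```

Either way the shopper saves exactly the same amount—which is the point: only the size of the number shown changes.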
Take this built-in bias that we all have and throw in a big dose of optimism, and it’s easy to see how estimating our own software work can be problematic.
Research shows that the best estimates come from a high-level comparison of the current project to others of a similar nature. Yet most estimation in software development is bottom-up: we break the work into components, judge the complexity of each, and add up the pieces to produce a view of the whole. Not only do we feel this gives us a better understanding of the current project, but we often intentionally ignore history, discounting earlier performance by pointing out how those projects were different.
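The history-based approach described above is often called reference-class forecasting: instead of trusting the bottom-up sum, you scale it by how similar past projects actually turned out relative to their own estimates. Here is a minimal sketch; all the numbers are hypothetical, chosen only to show the mechanics:

```python
from statistics import median

# Bottom-up view: estimated weeks per component, summed (hypothetical numbers).
bottom_up_estimate = sum([3, 5, 2, 8])  # 18 weeks

# Historical view: (estimated weeks, actual weeks) for past similar projects.
history = [(10, 16), (20, 30), (12, 21)]

# Typical overrun ratio observed in the reference class.
overrun = median(actual / estimated for estimated, actual in history)

# Adjust the optimistic bottom-up sum by what history says really happens.
reference_class_estimate = bottom_up_estimate * overrun
print(round(reference_class_estimate, 1))  # 28.8
```

The point isn’t the particular numbers—it’s that the adjustment comes from comparing whole projects to whole projects, rather than re-arguing the complexity of each new component.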
We need to recover the rational skills it takes to make informed estimates based on history, but that’s not so easy to do.
It seems we are loaded with thinking biases, and there’s no way of tossing them aside for the various decisions we make in our lives. So in order to make the most informed choices—whether personal, team, organizational, or global—we must construct diverse groups of people who see the world differently. These groups should include members who see the big picture, members who can focus on details, and members who are familiar with history.
We do not and cannot see the world clearly, and our biases are so strong that they will always prevent us from being objective. Our best hope for better decision-making is to work together and to really listen to what others tell us.
This article was originally published October 22, 2019 on TechWellInsights.com.
Linda Rising is an independent consultant who lives near Nashville, Tennessee. Linda has a Ph.D. from Arizona State University in object-based design metrics. Her background includes university teaching as well as work in telecommunications, avionics, and tactical weapons systems. She is an internationally known presenter on topics related to agile development, patterns, retrospectives, the change process, and the connection between the latest neuroscience and software development. Linda is the author of numerous articles and five books.