Save Your Software from the Start: Overcoming Skewed Thinking in the Project Planning Stage
In the context of a software project, crisis management is much more expensive than crisis prevention. Yet, despite this knowledge, few people develop a rational approach to spotting trouble early. Some waste time looking for problems in the wrong places, while real danger lurks in areas they least expect. Still others spend little or no time trying to spot trouble ahead, choosing instead to remain blissfully in the present.
Two types of obstructions prevent us from looking ahead for things that might cause a future crisis:
- We don't plan time to look ahead. The best way to overcome this issue is simply to plan time to think about the future. Some of the best project managers specifically schedule a little "think time" every week. They use this time to get away from day-to-day tasks and rethink their strategy. They also try to identify potential problems that may threaten the project.
- Cognitive illusions prevent us from assessing situations accurately. This kind of obstruction is much more pernicious. We all have a host of these kinds of illusions. To give you a flavor of these shortcomings in our mental processes, let's examine just two.
Two Common Cognitive Illusions
One kind of bias, anchoring, works as follows: We start out with an assumption. From that point on, all thinking revolves around that anchor. This problem commonly occurs in business planning. Executives select an objective and then write a plan. All assumptions that go into the plan—and all actions to be taken—magically support that objective. In this situation, planners tend to ignore anything that proves the objective unreasonable.
Anchoring was demonstrated in its purest form by an experiment [1] in which participants were asked to write down the last four digits of their social security numbers. Then they were asked to estimate the number of physicians and surgeons listed in the yellow pages of their city. The estimates strongly correlated with the last four digits of the participants' social security numbers. (Go figure.)
The availability heuristic is another common source of bias in problem identification, causing us to do things such as support a one-sided argument. Brenner, Koehler, and Tversky nicely demonstrated this cognitive shortcoming in their study. [2] Experimenters gave four groups of participants different pieces of information from a real jury trial and then asked them to estimate what the jury decided. One group heard both the defendant's and the plaintiff's arguments. Two of the other groups heard the arguments presented by only one of the sides. The fourth group got both sides of the story plus some background information.
The two groups with information from both sides were most accurate in estimating what the jury decided. That makes sense. But some findings were less rational. The two groups with arguments from just one side estimated that the jury decided in favor of the arguments heard by that group. Even though these groups knew they had heard only one side, they were unable to compensate. Worse still, when experimenters asked the participants how confident they were in their estimate of the jury's decision, the answers were surprising: Those who heard only one side were much more confident in their analysis than those who heard both sides. That is, the groups that knew their information was skewed were still more confident in their conclusions. This scary finding shows how flawed our thinking can be when we assess a situation.
Some Research on Rational Problem-Solving
In the 1950s, Charles Kepner and Benjamin Tregoe were commissioned by the United States military to investigate problem-solving. [3] Through their research, Kepner and Tregoe found that many people were really bad at identifying root causes. These people ignored important facts and were generally biased from the outset.
On the flip side, Kepner and Tregoe found that some people were very good at spotting the root cause and selecting solutions. These people thought through the challenge in front of them. They were conscious of their approach, which they recognized as repeatable; they had a firm understanding of their problem-solving process; and they were able to explain their system.
The two researchers concluded that poor problem-solvers were not methodical, and good problem-solvers were very methodical. Only people who applied a rational approach could overcome the inevitable skews in the human thinking process.
Daniel Kahneman also developed techniques we can all use to overcome some of our cognitive shortcomings. A renowned psychologist and winner of the Nobel Prize in economics, Kahneman writes about two ways of viewing a problem: the inside view, in which you look only at the problem in front of you; and the outside view, in which you view the problem as part of a category. [4]
By taking the outside view, Kahneman says, you can eliminate some of the inevitable biases of considering just the challenge at hand. The outside view allows you to compare the problem to similar problems, take a more objective view of how best to approach it, and estimate how much time and effort will be required to solve it.
Six Steps to Heading Off Problems Early
Let's put these ideas to use in the context of software projects. By following six steps, you can identify risks and take measures either to prevent problems from occurring or to minimize their impact.
For each of these steps, keep in mind the cognitive skews that might affect your perception. Being aware of how your thinking might be biased will take you a long way toward compensating for any illusions.
Step 1: Identify all possible risks. One rational approach to risk identification is to take the outside view: Ask people who have worked on similar projects what surprises they encountered.
Another method for minimizing cognitive bias in risk identification is to think about risks by category. For example, software project areas of concern might include project management, budget, and technical categories:
- Project management risks include problems caused by miscommunicating requirements, or problems resulting from poor planning.
- Budget risks include overspending due to delays, or financial hits due to unexpected licensing costs.
- Technical risks include bugs in third-party software, or hardware bugs.
For each category, sit down with team members and discuss potential problems that might occur in each area. This approach will help you to structure the process of risk identification.
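As a concrete illustration, here is a minimal sketch in Python (the language, category names, and example risks are all assumptions made for illustration) of how a team might capture the output of such a session in a simple risk register:

```python
from dataclasses import dataclass

@dataclass
class Risk:
    """One entry in a simple project risk register."""
    category: str      # e.g., "project management", "budget", "technical"
    description: str   # what might go wrong

# Hypothetical examples gathered during a team risk-identification session
risk_register = [
    Risk("project management", "Key requirement misunderstood by the team"),
    Risk("budget", "Third-party licensing costs higher than expected"),
    Risk("technical", "Bug in a third-party library blocks integration"),
]

for risk in risk_register:
    print(f"[{risk.category}] {risk.description}")
```

Even a flat list like this makes it harder to overlook an entire category of risk.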
Step 2: Assess the probability of each risk. Few activities are more prone to skews than assigning probabilities to future events. Again, one of the best ways of minimizing bias is to seek advice from people who have an outside view on the project. Try to find out how often certain problems arose in similar projects.
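One rough way to turn that outside view into a number is to use the historical frequency directly. In this hypothetical sketch, colleagues report that a given problem occurred on three of ten comparable projects (figures invented for illustration), which assumes your project resembles theirs:

```python
# Hypothetical outside-view data: the problem occurred on 3 of 10 similar projects
similar_projects = 10
projects_with_problem = 3

estimated_probability = projects_with_problem / similar_projects
print(f"Outside-view probability estimate: {estimated_probability:.0%}")  # 30%
```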
Step 3: Assess the potential impact of each risk. Start out by thinking about how the problem might affect your project: How will the problem change your schedule? What impact will it have on your budget? How might it affect the quality of the software you produce?
When you know each area of impact and its relative importance, you can get a good idea of the overall impact the risk might have on your project. Now assign a value to that impact—in dollar amounts, if possible.
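One simple way to roll those separate areas into a single figure is to estimate each area in dollars and sum them. The numbers below are invented purely for illustration:

```python
# Hypothetical dollar estimates for one risk's impact, by area
impact_by_area = {
    "schedule": 20_000,   # cost of a two-week delay (assumed figure)
    "budget": 5_000,      # direct extra spending (assumed figure)
    "quality": 10_000,    # rework and support costs (assumed figure)
}

total_impact = sum(impact_by_area.values())
print(f"Estimated total impact: ${total_impact:,}")  # $35,000
```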
Step 4: Based on the combination of probability and impact, prioritize your risks. You might simply multiply the values for probability and impact to get an expected cost. The risks with the highest expected costs are your highest priorities.
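The arithmetic of this step is easy to sketch. Assuming each risk now carries the probability and dollar impact estimated in the previous steps (the figures here are again hypothetical), prioritization is just a sort by probability times impact:

```python
# Hypothetical (probability, impact) estimates for three risks
risks = [
    {"name": "Requirements misunderstanding", "probability": 0.30, "impact": 35_000},
    {"name": "Unexpected licensing costs",    "probability": 0.10, "impact": 15_000},
    {"name": "Bug in third-party library",    "probability": 0.20, "impact": 50_000},
]

for risk in risks:
    risk["expected_cost"] = risk["probability"] * risk["impact"]

# Highest expected cost first = highest priority
for risk in sorted(risks, key=lambda r: r["expected_cost"], reverse=True):
    print(f'{risk["name"]}: expected cost ${risk["expected_cost"]:,.0f}')
```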
Step 5: For each high-priority risk, think about how you can prevent it from occurring. Assess the cost of prevention and weigh it against the expected cost should the risk occur. Is it worthwhile to take measures to prevent the potential problem, or should you just live with the risk?
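Continuing with the same hypothetical figures, the prevent-or-accept decision boils down to comparing the cost of prevention against the expected cost of the risk. The sketch below deliberately ignores real-world subtleties such as partial prevention, risk aversion, or non-monetary impacts:

```python
def worth_preventing(prevention_cost: float, probability: float, impact: float) -> bool:
    """Return True if prevention is cheaper than the expected cost of the risk."""
    expected_cost = probability * impact
    return prevention_cost < expected_cost

# Hypothetical figures: spending $4,000 on an extra requirements review
# to head off a 30% chance of a $35,000 problem
print(worth_preventing(prevention_cost=4_000, probability=0.30, impact=35_000))  # True
```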
Step 6: For risks you can't (or won't) prevent, think about how to minimize the damage should that problem arise. Based on the cost of preventing certain risks, you may decide not to take preemptive measures in those areas. You also might encounter unforeseen risks that couldn't be prevented. In both cases, think of what you can do to spot the problem, should it occur, and then minimize the damage.
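For risks you decide to live with, one lightweight way to stay alert is to record, for each one, a trigger to watch for and a contingency action. The fields and values below are hypothetical:

```python
# Hypothetical contingency plan for an accepted (unprevented) risk
contingency_plan = {
    "risk": "Bug in third-party library blocks integration",
    "trigger": "Integration tests fail against the library for two consecutive builds",
    "contingency": "Switch to the fallback library evaluated during planning",
    "owner": "Integration lead",
}

for field, value in contingency_plan.items():
    print(f"{field:>12}: {value}")
```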
Summary
For software projects, which can range from just plain complicated all the way up to mind-bogglingly complex, the old adage rings as true as ever: Project managers and software engineers who take a rational approach to planning consistently find that "an ounce of prevention is worth a pound of cure."
References
[1] Timothy D. Wilson, Christopher E. Houston, Kathryn M. Etling, and Nancy Brekke, "A New Look at Anchoring Effects: Basic Anchoring and Its Antecedents," Journal of Experimental Psychology: General, Volume 125(4), December 1996, pp. 387–402.
[2] Lyle A. Brenner, Derek J. Koehler, and Amos Tversky, "On the Evaluation of One-sided Evidence," Journal of Behavioral Decision Making, Volume 9, 1996, pp. 59–70.
[3] Charles H. Kepner and Benjamin B. Tregoe, The New Rational Manager, Princeton Research Press, 1981.
[4] Daniel Kahneman and Amos Tversky, Choices, Values, and Frames, Cambridge University Press, 2000.