Strive for Excellence, not Perfection

March 2, 2012

Our mental models can get us into trouble. For example, managing very dynamic, highly variable efforts like software development through a series of crisp, well-defined, sequential phases seems to be very – to quote Mr. Spock – “logical.” Yet as many of us can attest, more often than not this logic breaks down rather quickly in actual practice.

The arguments for a sequential approach are sound, at least on the surface. You define your requirements, do your design, implement that design (code), and then verify (test) that everything is correct. Everything is in a nice, neat, orderly, logical sequence of events.

What makes everything work well is the perfection of the outputs from each phase: better execution in one phase leads to greater efficiency and effectiveness in the downstream phases. In fact, studies have shown that the cost of finding and correcting problems (defects) increases significantly the deeper into the development phases you go.

What breaks down, and what can we do about it?

Software projects are most unpredictable in their early stages, because that is when variability is greatest. How much variability, you ask? Consider a couple of statistics dealing with requirements, the very first phase:
  • Requirements defects in software projects account for approximately 50% of product defects.
  • The percentage of the re-work on software projects due to requirements defects is greater than 50%.
(Sources: Software Requirements 2 by Karl Wiegers and Code Complete by Steve McConnell)
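A quick back-of-the-envelope calculation shows why those numbers hurt so much. The multipliers below are illustrative round numbers of my own, not figures from Wiegers or McConnell, but they capture the well-documented pattern that a defect gets more expensive the later it is found:

```python
# Illustrative sketch: how late discovery inflates the cost of fixing
# a requirements defect. The multipliers are hypothetical round numbers
# chosen for illustration, not data from the sources cited above.
FIX_COST_MULTIPLIER = {
    "requirements": 1,    # cheapest: reword a sentence in the spec
    "design": 5,
    "construction": 10,
    "system test": 40,
    "post-release": 100,  # most expensive: patch, redeploy, support
}

def relative_fix_cost(phase_found: str) -> int:
    """Cost of fixing a requirements defect, relative to catching it
    immediately in the requirements phase."""
    return FIX_COST_MULTIPLIER[phase_found]

for phase in FIX_COST_MULTIPLIER:
    print(f"{phase:14s} {relative_fix_cost(phase):4d}x")
```

Under these assumed multipliers, a requirements defect that slips through to system test costs forty times as much to fix as one caught while the spec was still on the whiteboard.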

That doesn’t sound too promising, does it? One thing that you don’t want – but can easily get with a sequential process – is the “Now that I see it…” response when demonstrating the finished software. That means a change in requirements after you’ve built the feature. And that is a very expensive way to build software.

Fortunately, variability does decrease over time, and Steve McConnell represents this graphically for software projects with his Cone of Uncertainty (from his book Software Estimation).

The cone is widest early in the traditional project life cycle, and it narrows from uncertainty toward certainty as successive stages are reached – as the team learns more about the effort they’re involved with. McConnell has observed that software projects typically work out much of their variability by about 20 to 30 percent of the way into the project, because of the learning that takes place.
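The cone can be sketched numerically. The range multipliers below approximate those McConnell reports in Software Estimation; the 100-day nominal estimate is a hypothetical number of my own for illustration:

```python
# Sketch of the Cone of Uncertainty: the same nominal estimate carries
# a narrower best-case/worst-case range at each later milestone.
# Multipliers approximate those McConnell reports; the 100-day
# nominal estimate is hypothetical.
CONE = [
    ("initial concept",          0.25, 4.0),
    ("approved definition",      0.5,  2.0),
    ("requirements complete",    0.67, 1.5),
    ("UI design complete",       0.8,  1.25),
    ("detailed design complete", 0.9,  1.1),
]

def estimate_range(nominal_days: float, low: float, high: float):
    """Best-case and worst-case bounds around a nominal estimate."""
    return nominal_days * low, nominal_days * high

for milestone, low, high in CONE:
    lo, hi = estimate_range(100, low, high)
    print(f"{milestone:26s} {lo:6.1f} .. {hi:6.1f} days")
```

At the initial-concept stage a “100-day” estimate could plausibly mean anything from 25 to 400 days – a 16x spread – while by detailed design the spread has collapsed to roughly 90 to 110 days.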

McConnell makes another vitally important point about the Cone of Uncertainty. In an effort to obtain a reliable end date, a manager might ask, “If we give you an extra week to work on your estimate, can you refine it so that it contains less uncertainty?” Unfortunately, the answer is “no”: improving an estimate requires working out the variability that drives the uncertainty, and that means performing the actual work of the project.

If you can’t drive out variability without working on the project, you can try as hard as you like to be absolutely perfect early on, but you’re destined to fail. You have to go beyond the requirements phase and get into the actual work. Gaps will emerge as developers instruct a very literal computer just what it needs to do.

Even Agile development, with its quick, iterative approach, doesn’t completely solve the problem. After all, iterating is still a re-work strategy. But at least Agile development gets people working on a project very quickly, without wasting a lot of time on a Big Requirements Up Front and Big Design Up Front effort before people start getting really involved and learning what they truly need to know.

The lesson is simple: Don’t overinvest in seeking perfection in those early phases – the output won’t be worth the paper it’s written on. Seek excellence instead.

Excellence in software development execution starts with an acknowledgement that things can and will go wrong. The requirements might look right on paper, for example, and they may be “signed off” on as being correct. But they might be wrong for reasons that become apparent later.

I’m not making an excuse to be sloppy – excellence means being as crisp and clear as possible within reasonable time frames. But excellence is also about everything that goes into understanding software development and seeking to continuously improve – improving in the ways that I described in my last post, You Don’t “Do Agile.”

And excellence shouldn't be measured by who can write the longest requirements document, as Dan Naden just wrote about on VersionOne's Agile Management Blog. As Dan says, we should, "Deliver value in the most efficient way possible."

One way to achieve excellence – and to solve the requirements problem – is to invest in prototyping, following Marty Cagan's advice: “The majority of the product spec should be the high-fidelity prototype… I have long argued that requirements (functionality) and design (user experience design) are intertwined and should be done together.”

Seeking perfection can be very expensive, and it can drive you nuts, because perfection doesn’t happen often with software projects. The pursuit of excellence, however, is a noble, satisfying, and rewarding experience.