If you answered “yes” to these questions, you may be in for a surprise. Automated code coverage isn’t measuring everything you need to know about your code base any more than the speedometer in your car measures how safely you are driving. Your speedometer is a gauge, providing a metric that you need to consider along with other factors, such as:
- Are you driving at the posted speed limit?
- Are you driving in inclement weather?
- Are you texting while driving?
Can this actually happen? Let’s assume that you are a manager, and you genuinely want to see your organization improve. You’ve heard the phrase, “If you can’t measure it, you can’t manage it,” so you set an aggressive goal of 100 percent automated unit test coverage, reviewing statistics weekly with the clear directive that you expect to see improvement. You remain resolute despite developers pushing back, complaining that the current code base was never designed for automated testing.
Despite the objections, the needle starts to move. Before long, you observe that automated code coverage is increasing at an amazing rate. But there are no other noticeable improvements, like a reduction in defects or the ability to add new features even moderately faster than before.
You realize that you have been gamed. Behind the scenes, some creative developers have added automated tests that invoke code without asserting (testing) anything! Others added virtually meaningless lines of code already covered by automated tests to increase the code coverage metric, giving you the illusion that you were moving in the right direction.
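To make the gaming concrete, here is a minimal sketch of the pattern. The `apply_discount` function and both tests are hypothetical, not from the original post; they simply show how a test can execute every line of a function, so coverage tools report it as fully covered, while asserting nothing:

```python
def apply_discount(price, rate):
    """Production code: compute a discounted price."""
    if rate < 0 or rate > 1:
        raise ValueError("rate must be between 0 and 1")
    return price * (1 - rate)

def test_apply_discount_gamed():
    """A 'gamed' test: it invokes the code, so the coverage metric
    goes up, but it asserts nothing -- a bug in the discount math
    would still pass."""
    apply_discount(100, 0.2)  # result is silently discarded

def test_apply_discount_honest():
    """An honest test: the assertion fails if the math is wrong."""
    assert apply_discount(100, 0.2) == 80
```

Both tests produce identical coverage numbers; only the second one actually tells you anything about correctness.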
You consider adding more metrics to counter this problem. Measuring cyclomatic complexity comes to mind, because the lower the cyclomatic complexity, the easier it is to understand, modify, and test the code. And you consider mandating a practice like Test-Driven Development (TDD) to ensure that any new code is covered by automated tests.
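For readers unfamiliar with the metric: cyclomatic complexity counts the independent paths through a function (roughly, the number of branch points plus one), and each path is something a test should exercise. A hypothetical illustration, not from the original post:

```python
def classify(n):
    # Three branch points (if / elif / elif) give a cyclomatic
    # complexity of 4: four independent paths, so four test cases
    # are needed just for full branch coverage.
    if n < 0:
        return "negative"
    elif n == 0:
        return "zero"
    elif n < 100:
        return "small"
    return "large"
```

A function with ten nested conditions needs far more tests than ten functions with one condition each, which is why lower complexity makes code easier to test.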
This is an unhealthy dynamic. The focus is on the metrics and on compliance. Software development is an exercise in product development, a collaborative, people-oriented endeavor. If we want people to do the right thing, we need their understanding, buy-in, and support. And in order to accomplish this, we need to focus on the desired outcomes, not the metrics in and of themselves.
Metrics have their place, and that is to help us gauge whether we are making progress along our chosen path while taking the realities of our individual circumstances into consideration. Code coverage isn’t our goal. What we want is the ability to continually add new features to the system quickly and easily, with the confidence that we aren’t breaking other things in the process. This is why one of the principles of the Agile Manifesto states, “Continuous attention to technical excellence and good design enhances agility.”
There are some basic principles that come into play here:
- Begin with a simple design, or refactor to “make space” for a new feature in a way that keeps the existing design simple and direct.
- Write clean code.
- Leverage automated unit tests and a practice such as TDD that help us to write well-designed, clean code that gives us immediate feedback on our efforts.
I can’t emphasize the last part enough. I’ve updated code in the past that I’ve felt absolutely compelled to review with the original author because, as Michael Feathers once said, [the code] “looks like it was written by someone who cares.” (Martin, 2009)
Have you ever had the experience of working with someone else’s code that elegantly turns a complex problem into something simple without being simplistic? Where the execution of the code worked as you would expect it to from reading it? It happens all too infrequently, in part because we’re always in a rush to get things done. But with agile, we’re trying to get things done well, creating an opportunity to implement the best design possible.
This is where a practice such as TDD can be a help. From an expectations standpoint, don’t equate TDD with automated unit tests. Unit tests are a type of test; TDD is a practice that has three distinct steps:
- Write a test that fails
- Make the code work
- Refactor to eliminate redundancy
Following this discipline yields benefits beyond test coverage:
- The developer considers how his or her code will be used, helping to drive the design and implementation of clear, understandable interfaces.
- In order to write an automated unit test, the code must do one thing well and not many, which reduces coupling and complexity.
- Writing a test first means that automated tests aren’t negotiable; it can be tempting to skip writing an automated test once working code is in place, particularly if there is pressure to “get right to work on the next feature.”
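The three steps above can be sketched in a few lines. The `leading_digits` helper is a hypothetical example invented for illustration; the point is the order of operations, test first, then the simplest passing code, then a tidy-up while the test stays green:

```python
# Step 1 -- Red: write a failing test first. At this point
# leading_digits does not exist yet, so the test cannot pass.
def test_leading_digits():
    assert leading_digits("agile42x", 2) == "42"
    assert leading_digits("no digits here", 2) == ""

# Step 2 -- Green: the simplest code that makes the test pass.
# Step 3 -- Refactor: clean up the implementation (here, a
# comprehension instead of an index-juggling loop) while the
# test continues to pass after every change.
def leading_digits(text, count):
    digits = [ch for ch in text if ch.isdigit()]
    return "".join(digits[:count])

test_leading_digits()  # runs clean once steps 2 and 3 are done
```

Because the test exists before the code, skipping it is never an option, which is exactly the third benefit above.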
Is TDD for everyone? Probably not; I’ll concede that writing failing tests first and then implementing working code can go against the grain of how some individuals think so greatly that it can interfere with their productivity. However, ask yourself if this is truly the case for you, or if it’s merely the change in approach, diverting cognitive cycles, that makes you temporarily less productive and a little uncomfortable. If it’s the latter, then what you need is to set the expectation that you won’t crank out as much code until you get used to doing something new and different.
Even if you reject TDD as a practice that’s suitable for you, the discipline behind TDD shouldn’t be rejected. We want to design first, then code. And the design and implementation of the code should capture the qualities mentioned above.
In the final analysis, we want to produce the same thing at the technical level with agile development as we do at the business level: a valuable outcome. Our ability to quickly adapt and modify our code is a key contributor to our overall ability to be responsive. Practices and tools help us to get there, along with various measurements that keep us informed, but don’t manage strictly by metrics or by mandating practices. Apply them in the context of that bigger-picture outcome and your own unique circumstances.
This post is a draft of content intended for an upcoming ebook: Agile Expectations: What to Expect from Agile Development, and Why. Blog posts will be organized under the “Agile Expectations” content on the left-hand side for easy reference. I welcome your comments and feedback! – Dave Moran
Bibliography

Martin, R. C. (2009). *Clean Code: A Handbook of Agile Software Craftsmanship*. Boston, MA: Pearson Education, Inc.