
How to Take Unit Testing (and Test-Driving) Seriously

08.22.2012

So we write some test code, then make it pass, and restart. If we still have some minutes left in the current Pomodoro, let's refactor a bit and extract some methods. Easy, right?

Well, Test-Driven Development has a simple definition but lots of implications and assumptions, especially about the refactoring part. Here are some tips from my experience of several years of TDD in PHP and Java, on web and machine learning applications.

Just good writing

The most often ignored refactoring technique is renaming. In a bit of BDD style, try to shape your tests and test names with the Given/When/Then pattern; prefer domain-specific terminology over technical terms, as it is more difficult to get wrong.
public void increasesItsHeightWhenANewRowIsAdded()
is more readable than:
public void processingNewRow()
or even
public void testAddMethod()
When you read out the names of the test methods, the behavior of the class should be clear:
public void changesItsHeightWhenANewRowIsAdded()
public void reducesItsHeightWhenARowIsRemoved()
public void anEmptyRowCantBeAdded()
Think also about the order in which you place the methods. When running a test in verbose mode, you are usually able to list the method names and spot missing test cases, provided they're organized with symmetry and naming consistency.
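
For instance, the first example above could take this shape in PHPUnit, with the Given/When/Then phases visible in the test body (the Table class and its addRow()/getHeight() methods are hypothetical, just for illustration):

/** @test */
public function increasesItsHeightWhenANewRowIsAdded()
{
    // Given: a table that already contains one row
    $table = new Table();
    $table->addRow(array('name' => 'Alice'));
    $initialHeight = $table->getHeight();

    // When: a new row is added
    $table->addRow(array('name' => 'Bob'));

    // Then: the height has increased
    $this->assertGreaterThan($initialHeight, $table->getHeight());
}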

Error cases

A problem with naive TDD is test-driving the design of the API (method names and arguments) but not the error cases. Really thorough testing does not cover happy paths only, but also the management of errors; errors that will happen, especially when dealing with external entities.

So the interface that you drive should also be shaped by error cases:

  • return values. No return value is obviously trivial; otherwise, is it of a consistent type? Do you return false in some cases? Can you return a Null Object? There could be an interface to define for the result that covers both good and bad outcomes, like a Maybe type.
  • side-effects and notifications. The first kind are the intended, mandatory effects of the calls you make. The second includes logging, events, and subscribers (like views). Try to reduce the efferent coupling (plainly speaking, the number of objects you have as collaborators).
  • exceptions are part of the API; @throws annotations and throws clauses model them. Some tests should be dedicated to raising these exceptions and checking that the specific class is actually thrown, instead of a more generic Exception or a NullPointerException.
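
For example, a test dedicated to an error case can check that the specific exception declared in the API is raised, not a generic one. Here is a minimal PHPUnit sketch, assuming the same hypothetical Table class and an InvalidRowException for empty rows (setExpectedException() is the older PHPUnit call; recent versions use expectException()):

/** @test */
public function anEmptyRowCantBeAdded()
{
    // Then: the specific exception type is expected, not a generic Exception
    $this->setExpectedException('InvalidRowException');

    // Given: an empty table
    $table = new Table();

    // When: an empty row is added
    $table->addRow(array());
}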

Refactor the tests

Extracting duplication is a good practice for tests too; a whole bunch of patterns help you organize test code if you lack imagination:

  • Base test cases (which will only scale up to one or two parent classes).
  • External objects like the FixtureLoader pattern to build a fake database instance and its population.
  • Constraint objects to check results or arguments passed to mocks.
  • Assertion libraries.
  • Object builders for the setup of complex entities or object graphs (see the sketch at the end of this section).

Refactoring the test code too means you won't throw away the suite at the first change in the specification, due to the sheer number of lines of code that needs to change in it. If you haven't the faintest idea of what the elements of this list are, take a look at the xUnit patterns book.

Incomplete stuff

Use incomplete tests for new functionality that comes to mind, to avoid multitasking; you should really finish the current test, its implementation, and its refactoring before moving on to the next feature.

In PHP (with PHPUnit), a test can be marked as incomplete with $this->markTestIncomplete(), while in Java you can simply throw an exception, depending on your test framework.
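
A placeholder for a scenario you haven't started yet can be as small as this (the method name is just an example):

/** @test */
public function reducesItsHeightWhenARowIsRemoved()
{
    // written down now so it shows up in the test list, implemented later
    $this->markTestIncomplete('Row removal is not implemented yet.');
}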

The generalization of this technique to all development tasks is writing on a notepad, which I've learned from TDD by Example and from James Shore's Let's Play TDD series. Paper works best for me if I'm not pairing remotely with someone, but a .txt file is conceptually the same.

When you have a notepad ready, you can write down everything that you have to do as it comes to mind, even if it's not related to your current task. Additional tests and features can be inserted as incomplete methods, but many times you can just write down a memo, so you don't even have to conform to a syntax or choose a name for the test right away.

Refactorings to do, and ideas to explore for them, have to go on the notepad: you can't write a test for a refactoring, by definition. I use sections like: next scenarios that are not tests yet, incomplete tests and possible variants of the scenario (happy path, error cases, notifications), refactorings to perform, and infrastructure tasks (such as inserting X into the build, or doing a spike with MongoDB).
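
A hypothetical notepad organized with these sections could look like this:

NEXT SCENARIOS
- a row with a duplicate key can't be added (error case)
- observers are notified when the height changes

REFACTORINGS
- extract a RowCollection class from Table
- rename processingNewRow() to increasesItsHeightWhenANewRowIsAdded()

INFRASTRUCTURE
- insert the new suite in the build
- spike with MongoDB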

The notepad ensures you do a good job by letting you work on one task at a time without forgetting anything that comes to mind. Every idea can be captured and evaluated in the next Pomodoro; maybe you will implement it, maybe you will judge it as not adding value. :)

Published at DZone with permission of Giorgio Sironi, author and DZone MVB.

(Note: Opinions expressed in this article and its replies are the opinions of their respective authors and not those of DZone, Inc.)