Thursday, November 20, 2008

UTs and more

Testing Levels
UTs (unit tests)
ITs (integration tests)
CTs (customer tests)
ETs (end-to-end tests)
ATs (exploratory acceptance testing)
STs (system tests)

Wow, I wish I saw a simpler way to ensure the system I'm building does what I expect. I remember a few years ago when a colleague from Agile Philly, Naresh Jain, mentioned that he uses quite a few levels of tests... a number that was unfathomable to me at the time. But now, after seeing a few bugs slip through my tests, I think we really do need 5 or 6 levels too (I don't clearly recall his names or levels).

I think one of the reasons I like so many levels is abstraction; another is ease of maintenance, in the ways James Shore talks about; yet another has to do with the 10-second feedback cadence so many Agilists talk about: we keep each UT shorter than 10 milliseconds, in the spirit of what Michael Feathers describes in Working Effectively with Legacy Code.
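As a rough sketch of what that speed budget can look like in practice (assuming JUnit 4; the add() function here is a made-up pure SUT), the timeout attribute fails any test that runs longer than the millisecond budget you give it:

import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class FastUnitTestExample {

    // Hypothetical SUT: a pure function with no collaborators to set up.
    static int add(int a, int b) {
        return a + b;
    }

    // JUnit 4's timeout attribute (in milliseconds) fails the test
    // if it exceeds the time budget.
    @Test(timeout = 10)
    public void addsTwoNumbersWithinTenMilliseconds() {
        assertEquals(5, add(2, 3));
    }
}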

Testing Goals
UTs (unit tests)
Test one responsibility of an object; this should be 3-5 (or maybe up to 15) lines of code, isolated as much as possible from collaborators and complicated initialization. This is easiest when you're following Robert C. Martin's SRP (single responsibility principle)... one easy way to get there is to make a member of the class static and push all parameters down to simple data types. UTs should be exhaustive... test everything with a significant chance of error. Ping-pong your way through any scenario you think will break the SUT (subject under test).
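Here's a minimal sketch of what I mean, assuming JUnit 4; the ShippingCalculator class and its surcharge rule are invented purely for illustration. One responsibility, a static method, simple data types in and out, and nothing to wire up:

import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class ShippingCalculatorTest {

    // SUT: a single-responsibility helper pushed to a static method
    // that takes only simple data types (names are illustrative).
    static class ShippingCalculator {
        static double surcharge(double orderTotal) {
            return orderTotal < 50.0 ? 4.95 : 0.0;
        }
    }

    @Test
    public void smallOrdersPayASurcharge() {
        assertEquals(4.95, ShippingCalculator.surcharge(49.99), 0.001);
    }

    @Test
    public void largeOrdersShipFree() {
        assertEquals(0.0, ShippingCalculator.surcharge(50.00), 0.001);
    }
}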
ITs (integration tests)
Maybe those 3-5 lines do just what you expected, but now that you have so many little methods, you've increased the risk that data won't be handed down the call stack correctly. You need tests that confirm data goes from one level to the next; such a test may even cover 3 or 4 levels... but beyond that, you're treading dangerously close to an expensive end-to-end test. We don't need to test all permutations here--we're just confirming the wiring is connected correctly.
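A sketch of that kind of wiring check, again assuming JUnit 4; the OrderService and OrderRepository names are invented for illustration. The test only confirms that the order id survives the hand-off from one layer to the next, not every permutation of input:

import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class OrderServiceIntegrationTest {

    // Lower layer: records what it was asked to save.
    static class OrderRepository {
        String lastSavedId;
        void save(String orderId) { lastSavedId = orderId; }
    }

    // Upper layer: hands the data down the call stack.
    static class OrderService {
        private final OrderRepository repository;
        OrderService(OrderRepository repository) { this.repository = repository; }
        void place(String orderId) { repository.save(orderId); }
    }

    @Test
    public void orderIdIsHandedDownToTheRepositoryIntact() {
        OrderRepository repository = new OrderRepository();
        new OrderService(repository).place("ORDER-42");
        assertEquals("ORDER-42", repository.lastSavedId);
    }
}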
CTs (customer tests)
Customer tests are new to me--I first read about them in The Art of Agile Development by James Shore and Shane Warden... but I think they line up dead-on with the older concept of end-user/client-comprehensible tests that prove the business functionality does what it's supposed to do. Really, I think we should start development with failing CTs, verified by the business user, since they'll be expressed in an xUnit framework with a DSL (domain-specific language). Once you've encoded the business requirements for a story card, go ahead and start a ping-pong session at the lower levels of abstraction.
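Something like the following is what I have in mind, assuming JUnit 4; the Scenario DSL, the GOLD status, and the discount rule are all made up for illustration. The point is that a business user could read the test method and agree or disagree with it:

import org.junit.Test;
import static org.junit.Assert.assertTrue;

public class GoldCustomerDiscountTest {

    @Test
    public void goldCustomersGetTenPercentOffOrdersOverOneHundredDollars() {
        Scenario scenario = Scenario.given()
                .aCustomerWithStatus("GOLD")
                .whoOrdersItemsTotaling(120.00);

        assertTrue(scenario.receivesDiscountOf(12.00));
    }

    // Minimal DSL sketch backing the readable test above.
    static class Scenario {
        private String status;
        private double total;

        static Scenario given() { return new Scenario(); }
        Scenario aCustomerWithStatus(String status) { this.status = status; return this; }
        Scenario whoOrdersItemsTotaling(double total) { this.total = total; return this; }

        boolean receivesDiscountOf(double expected) {
            // Stand-in business rule so the sketch is self-contained;
            // a real CT would call into the production code.
            double discount = "GOLD".equals(status) && total > 100.0 ? total * 0.10 : 0.0;
            return Math.abs(discount - expected) < 0.001;
        }
    }
}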
ETs (end-to-end tests)
End-to-end tests are yet another form of integration test, but they're expensive and slow... they require that you launch your real user interface, connect to all resources such as a database or web services, and do something that has meaning. I'd usually add an ET for every major subsystem, page on a web site, or user-interface screen. Some people call these smoke tests, because the point is only to confirm we didn't break some interoperability. True business functionality (CTs) and all conceivable permutations (UTs) have already been validated.
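As a rough sketch (assuming JUnit 4 and the standard java.net classes; the URL is a placeholder for wherever the application is actually deployed), an ET can be as simple as hitting one real page and checking that something answers:

import java.net.HttpURLConnection;
import java.net.URL;
import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class HomePageSmokeTest {

    @Test
    public void homePageRespondsWithHttp200() throws Exception {
        // Placeholder URL: point this at the deployed application, which
        // exercises the web server, the app, and the database behind it.
        URL url = new URL("http://localhost:8080/myapp/home");
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        connection.setRequestMethod("GET");
        assertEquals(200, connection.getResponseCode());
    }
}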
ATs (exploratory acceptance testing)
Once upon a time ATs were supposed to be like ETs... but thinking here has changed. Instead, we want to make sure that at the end of every iteration every developer has a chance to do some manual testing to try out what other members of the team built, to review the code behind the changes, and to let the customer experiment with the new functionality in a safe environment--a 'try it before you buy it' test. This gives developers practice thinking like a customer or end user, helps them better collaborate on and support the application, and gives the team a good feel for the quality of the product.
STs (system tests)
Many systems have performance requirements, external interoperability requirements, or other general requirements that wouldn't be exercised otherwise. Automate these if possible--but expect them to take hours to run.
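A sketch of what an automated ST might look like, assuming JUnit 4; the SearchService stand-in and the numbers are invented, and a real ST would exercise the deployed system rather than an in-process stub:

import org.junit.Test;
import static org.junit.Assert.assertTrue;

public class SearchThroughputSystemTest {

    @Test
    public void oneThousandSearchesFinishWithinTenSeconds() {
        SearchService search = new SearchService();
        long start = System.currentTimeMillis();
        for (int i = 0; i < 1000; i++) {
            search.find("agile testing");
        }
        long elapsed = System.currentTimeMillis() - start;
        // Crude performance check: assert an overall time budget.
        assertTrue("took " + elapsed + " ms", elapsed < 10000);
    }

    // Stand-in so the sketch compiles; replace with calls to the real system.
    static class SearchService {
        java.util.List<String> find(String query) {
            return java.util.Collections.singletonList(query);
        }
    }
}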

Thursday, November 6, 2008

JDUF - Just Enough Design Up Front

I was reading Michael Groeneveld's reactions to Agile 2008 and thought the Iteration-1 idea was quite interesting. He says it could be controversial because it increases the amount of unvalidated work, and because it seems like BDUF (big design up front) and therefore antithetical to Agile. However, he goes on to point out that there are sometimes more effective ways to get feedback from a client than demoing working software. I think that, early in a problem, I tend to thrash around with unit tests until I figure out how to solve the problem efficiently... and as a result I waste time that could have been saved if I had just stood at a whiteboard with a pair. Back in my job in Philly, we called this JDUF -- Just Enough Design Up Front. XP never advocated blind coding... it just asks us to wait to do this design and planning until the last responsible moment, and then not to let that design rot before releasing it to the customer.