Key Lessons from the Specification by Example Course, Day 1

I'm taking part in a Specification by Example course led by Gojko Adzic. Here I want to summarize the key things I've learned on the first day of this entertaining and fruitful course, thanks to both Gojko and my co-participants.

If you haven't heard about Specification by Example (SbE) before (really?!), then you should know that its main concern is ensuring that you build the right thing (complementary to building the thing right). This is achieved by specifying functionality collaboratively with business users, testers, and developers, clarifying and nailing it down with key examples, and finally, where it is worth the effort, automating checks of those examples. The result is not only automated acceptance tests but, more importantly, a "living documentation" of what the system does that never gets out of date. It's best to read the key ideas as described by Gojko himself or on the SbE Wikipedia page.

The course is a well-balanced mixture of theory, war stories, and practical and fun exercises, so there are plenty of opportunities to learn (and basically no opportunity to sleep, sad as that may be for some). Here are the key things I personally have learned today (in chronological order):
  1. Simplify the domain and examples by extracting individual concepts - when multiple concepts or concerns are mixed together, the specification and tests become too complex and too numerous. Specifying collaboratively and involving many different people helps to spot those concepts, which are often not obvious.
  2. When under time pressure, don't rush to implementation without communicating enough to build a shared understanding first. And, for God's sake, involve the customer! When time is limited, we must first of all agree on what to build and how. Everybody has different expectations, so if we don't communicate, everybody will do what s/he believes is needed, resulting in a lot of diverging effort and thus waste.
  3. Find the time to define a clear sprint goal and objectives. They are necessary to tell us where we are going and how far we've got. In each team, the answers to "how far do you think you are?" varied between 30% and 80%, with a few honest "I've no idea" answers.
  4. Simplify the domain and examples by choosing representations that fit the domain well. A representation should be as simple as possible - but not simpler. For example, in the domain of Blackjack, it's unwise to represent the total value of a player's cards as an integer, because 21 composed of three cards and 21 composed of two cards (i.e. blackjack) are quite different; also, for any value over 21, we only need to know that it's over the limit. Using just an integer makes the tests hard to understand and leads to unnecessarily many of them. On the other hand, a number is still a much better representation than a list of the individual cards, which contains a lot of unnecessary information (3+7 is no different from 7+3 or 2+8 or 2+2+6).
  5. Build a shared understanding of the domain, benefiting from the participation of different people to discover special and unclear cases. Successful teams manage to build a shared understanding of the business domain and scope and thus avoid the risk of building something other than what is expected. It's very surprising what different opinions people may have about even such a simple thing as the number of points in a five-pointed star, each considering their understanding to be the obvious one (10 and 5 were the most frequent, with an 11 from a vector-based graphic designer and some weirder ones). Thus talking together is really essential. Everybody brings something to the table - business users their domain knowledge, developers their knowledge of the possibilities and limitations of the technical solution, testers their expertise in "breaking" the system. The more people (up to a certain limit), the better.
  6. Discover implicit (and often rather different) assumptions by constructing examples for a specification together.
  7. Technique: set-based design for example representation. Split the participants into groups and let each work separately on creating the examples for 10-15 min, then compare the representations they've chosen for the examples and key concepts, and merge them into the best possible one. (We had three teams, each represented the examples differently, and at least two had important ideas that made it into the final representation, which was better than any of the original ones.)
  8. If you seem to need too many examples for a specification, then you either don't understand the domain well, or are mixing multiple concepts, or have chosen unsuitable representations - or any combination of these.
  9. A user story is a suggestion for a solution, and usually a suboptimal one (since it comes from an amateur solution designer). Always make the business objective clear first. War story: the business requested that their Bluetooth appliance be extended with video streaming, which would have been extremely difficult technically. It turned out that it was perfectly OK to deliver the video clips up front on a memory card and only play them from it on demand. Months of work saved.
  10. Specification by Example (collaborative specification, emphasis on the business domain) leads to aligning the business, software, and test models. Thus a small change in the business model requires only a small change in the tests and the software. The maintenance nightmare of acceptance tests unaligned with the application, where a small change on the business side may require weeks of work, is gone. Of course, this requires a lot of attention to creating the models aligned and keeping them so. See the Domain-Driven Design movement for more info.
  11. Decide at which layer (UI, service, ...) to test your application, and how much to isolate the tested module from its environment, based on the risk covered and the effort required. The further your tests sit toward the end-to-end side of the isolated-test-to-end-to-end scale, the more risk they cover, but also the more expensive they are to develop and maintain (and the slower they run). For the same reasons, it's rarely worthwhile to test the application via the UI, as opposed to going directly to the service or another layer. Decide where to test based on the risk covered/involved and the (long-term) cost.
  12. SbE has *nothing* to do with integration tests. It may test the system end-to-end but doesn't need to. And it only tests a few key examples for each specification, not all possible values.
  13. The key benefit of full SbE is that you gain "living documentation". (Not really a new thing for me but worth repeating. Regression tests usually catch only ~25% of bugs and thus aren't worth the effort on their own. The shared understanding and living docs are, though.)
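To make point 4 above concrete, here is a minimal sketch (in Python; the names and design are my own illustration, not something from the course) of representing a Blackjack hand's value as a small domain type instead of a bare integer. It assumes cards are already given as point values:

```python
from dataclasses import dataclass

# Hypothetical sketch of point 4: a hand's value as a small domain type, so
# that "blackjack" (21 from two cards) and "bust" (over 21) are explicit,
# while the exact over-21 total and the individual cards stay hidden as
# irrelevant detail. (Aces are simplified away: cards arrive as point values.)

@dataclass(frozen=True)
class HandValue:
    total: int        # capped at 22, meaning "bust"; the exact bust total doesn't matter
    blackjack: bool   # 21 from exactly two cards

    @classmethod
    def of(cls, cards: list) -> "HandValue":
        total = sum(cards)
        if total > 21:
            return cls(total=22, blackjack=False)  # all busts look the same
        return cls(total=total, blackjack=(total == 21 and len(cards) == 2))

    @property
    def busted(self) -> bool:
        return self.total > 21


# 3+7 and 2+8 are indistinguishable, as argued in the text,
# while a two-card 21 differs from a three-card 21:
assert HandValue.of([3, 7]) == HandValue.of([2, 8])
assert HandValue.of([10, 11]) != HandValue.of([7, 7, 7])
assert HandValue.of([10, 11]).blackjack
assert HandValue.of([10, 10, 5]).busted
```

With such a representation, each specification example needs to express only the distinctions the rules actually care about (an ordinary total, blackjack, or bust), which keeps the examples understandable and few.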
The tool to achieve many of the above is the Specification Workshop (or an alternative thereof), where business users, testers, and developers (notice the plurals) collaboratively specify functionality and derive key examples for the specifications, thus discovering hidden complexity and extracting individual (separately testable) concepts. War story: once, a business user/analyst said about a requirement that it was so simple it didn't need to be discussed. The team persisted, and after half an hour all three business people were shouting at each other, unable to reach an agreement. The moral: even if it seems simple, always give it the five minutes for examples. (And yes, there was a happy ending, some two weeks later.)

FYI: I should blog about days 2 and 3 in a couple of days.

Tags: testing methodology


Copyright © 2024 Jakub Holý