Tuesday, August 23, 2016

Acceptance Test Driven Development

The purpose of ATDD is to determine the acceptance criteria for a user story. The business and the development team work together to come to a shared understanding. This understanding is used to drive (guide) the design, development and testing effort of the team.

Some background

In an agile development environment, the user story (ideally represented by a 3” X 5” index card) is a lightweight reminder to have a conversation. The card does not contain requirements, but is a placeholder for a future conversation. Requirements are inventory, and excessive inventory is a form of waste. So, we have the conversation when a card has been selected for development, Just In Time.

Sometimes, the PO or BA has jotted a few notes on the back. In classic eXtreme Programming, the on-site customer would have expanded these notes into more detailed customer test cases before work began. However, with ATDD we now discuss and specify those tests collaboratively in a Three Amigos meeting.

Three Amigos (or cuatro or cinco)

Originally named for the three participants (Product Owner, Developer and Tester), this meeting of minds has outgrown its name. However, the purpose remains the same: bring the business and the development team together to collaboratively specify the value to be produced by a user story. This meeting includes a UX/Designer, Product Owner, Developer, Tester, and perhaps a Business Analyst or Technical Writer.

The primary output of this meeting is a set of acceptance tests for the story card. The most common way of specifying acceptance tests is to provide concrete examples of expected behavior. All parties agree that these tests represent the definition of done.

Acceptance Tests

Many teams choose to express acceptance tests as Cucumber scenarios. It is certainly not the only way to do so, but it offers many advantages.

Cucumber’s simple, expressive structure encourages the use of concrete examples of expected behavior. This improves the conversation and makes the desired behavior clearer. Here’s an example:

Instead of:
  User must provide Social Security number for loan application

You create this scenario:
  Given the applicant is using the online loan application
  When the applicant proceeds without supplying SSN
  Then the applicant is informed that SSN is required

Second, Cucumber can execute these scenarios against your software, the System Under Test (SUT). The development team can write code in Ruby [or another inferior language ;)] to execute each scenario and indicate whether the SUT performed as specified in the “Then” step. Now the tester can focus on more complex Exploratory Testing (ET), since the basic behavior is covered by automated acceptance tests (scenarios).
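
To make that concrete, here is a rough sketch of the kind of step code that sits behind the scenario above. I'm showing it in C# with SpecFlow (the .NET flavor of Cucumber I've written about before) rather than Ruby, and the little LoanApplicationDriver is a made-up, in-memory stand-in for whatever actually drives your SUT (a browser driver, an API client, etc.):

using NUnit.Framework;
using TechTalk.SpecFlow;

// Made-up, in-memory stand-in for whatever actually drives the SUT.
public class LoanApplicationDriver
{
    public string Ssn { get; set; }
    public string ValidationMessage { get; private set; }

    public void Submit()
    {
        if (string.IsNullOrEmpty(Ssn))
            ValidationMessage = "Social Security number is required";
    }
}

[Binding]
public class LoanApplicationSteps
{
    private readonly LoanApplicationDriver _app = new LoanApplicationDriver();

    [Given(@"the applicant is using the online loan application")]
    public void GivenTheApplicantIsUsingTheOnlineLoanApplication()
    {
        // Nothing to set up for this toy driver; a real step would open
        // the online loan application page.
    }

    [When(@"the applicant proceeds without supplying SSN")]
    public void WhenTheApplicantProceedsWithoutSupplyingSsn()
    {
        _app.Submit();
    }

    [Then(@"the applicant is informed that SSN is required")]
    public void ThenTheApplicantIsInformedThatSsnIsRequired()
    {
        StringAssert.Contains("required", _app.ValidationMessage);
    }
}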

As the number of scenarios grows, they begin to serve as the source of truth for the expected behavior of the system. Instead of reviewing documentation that may or may not reflect the current desired behavior, you have scenarios that execute against the real system.

Reading those scenarios is a great way for a new team member to get acclimated to the domain and the product. For this reason, the team must take care in using consistent domain language in the scenarios. With a little discipline, a suite of scenarios is a great asset to any team.

Since the suite of scenarios is the source of truth, most teams run them regularly as part of a Continuous Integration (CI) build. That way, a regression shows up as a failing scenario. Note that the goal is not to write a scenario for every possible input or edge case. In an environment with complex or nuanced rules, more scenarios with concrete examples will help document those rules. In other cases, just a couple of happy-path scenarios per story card will suffice. Each team finds the right blend.

Why wait for the Three Amigos meeting?

One question asked of this approach is, “Why wait?” Wouldn’t the development team get more done if they were handed completed Cucumber scenarios? Well, maybe, but usually not.

What must be emphasized is that the Three Amigos is not a “transfer of information” meeting. While the primary output is a fairly concise set of acceptance tests (scenarios), the Three Amigos serves a higher purpose. It is a collaborative discussion to specify the acceptance criteria. In the true spirit of the Agile Manifesto’s “Customer collaboration over contract negotiation”, the business and the development team are discussing options, asking questions, and seeking to understand the true value behind the requested feature. Don’t forget the other outputs: the input provided by UX, the low-fidelity screen drawn on the whiteboard during the meeting, the Tester’s notes on performing ET for the card, the other stories discovered (or split out) for the PO to consider, the clarification on validation rules, and so on. In fact, the team often discovers a cheaper way to get the desired results.

Finally, remember that “The most efficient and effective method of conveying information to and within a development team is face-to-face conversation.” You are coming to a shared understanding, building relationships, and offering everyone a chance to participate in what is being built. You simply won’t get that by turning your talented team into order takers.

ATDD makes a difference

ATDD means pulling the team together to get a clear definition of done in a face-to-face meeting. Documenting it concisely (preferably with Cucumber scenarios) will mean less rework due to misunderstood requirements. Using ATDD, the whole team can now produce only what is necessary to meet the acceptance criteria. As a bonus, your software system gets a set of automated regression tests. This approach to communicating acceptance criteria is an effective addition to any software development team.

Wednesday, February 16, 2011

Book Review: The Agile Samurai

In November, I finished The Agile Samurai by Jonathan Rasmusson.  In short, I loved it.  This struck me as a bit strange at first because the book contains very little revelation about the craft of software development.  So what makes this book so great?  Well, I see a short answer and a long-ish answer.  For those who know me well, you'll understand why I won't bother attempting the short answer.

Agile Values and Principles 
To help you understand my perspective on this book, let me give you a little background.  The reason I sought to work for LeanDog (and I think the reason they sought me) was values.  While the industry, in its response to the agile buzz, clamored to adopt superficial versions of agile practices in the form of books, tools, training classes, web sites, consulting fluff and slick marketing campaigns, Cheezy and Jon Stahl consistently and unapologetically promoted the agile values through LeanDog.  To us it is clear that while the agile manifesto is not prescriptive, any claim to adherence must not arbitrarily choose which of the manifesto's values and principles to uphold.  The practices (which produce most of that buzz) adopted by various schools of agile thought are valuable only if they are firmly rooted in the realization of these values and principles.

A Book Emphasizing Principles
The Agile Samurai is an introductory-level book.  Consequently, I think it was a bit courageous for Jonathan to write this book.  Though the agile movement is maturing, the original books written by the pioneers of the movement are far from dated or irrelevant.  So why rehash the material?  Well, that is not what this book does.  In my opinion, it approaches the topic with a fresh take on how agile is now interpreted.  Yet, it is carefully rooted in the agile manifesto and its principles.  Sure, the practices are discussed, but in the proper context.  Yes, he helps the reader understand what agile teams look like and how they behave, but again rooted in the principles that inspired craftsmen to take the industry there.  By providing constant reminders of these principles, the book does greater justice to the pioneers than the most thorough acknowledgment page ever could.


The writing is clear, fun and concise.  The use of a dialog with Sensei to summarize each chapter is clever and a refreshing change from the typical bullet points. I consider The Agile Samurai the best introduction to agile I've read.  Even if you have done agile for a while, this book will bring you back to the roots and re-inspire you on why you have the habits that you do.  It will re-equip you to have conversations with skeptics and newbies.   Have a copy available for anyone who joins the team.

Saturday, January 1, 2011

Relationships and Results

As I am working on my review of The Agile Samurai, a thought occurs to me.  A team's ability to achieve results is inextricably related to the relationships in which the team participates.  What does the agile manifesto say about this?

Relationships
  • Helping others do it
  • Individuals and interactions over processes and tools 
  • Customer collaboration over contract negotiation
  • Business people and developers must work together daily throughout the project.
Results
  • Uncovering better ways of developing software by doing it
  • Working software over comprehensive documentation
  • Responding to change over following a plan 
  • Deliver working software frequently, from a couple of weeks to a couple of months, with a preference to the shorter timescale.
  • Working software is the primary measure of progress.
  • Continuous attention to technical excellence and good design enhances agility.
  • Simplicity--the art of maximizing the amount of work not done--is essential.
  • The best architectures, requirements, and designs emerge from self-organizing teams.
  • At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behavior accordingly.
Both Relationships and Results
  • Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.
  • Welcome changing requirements, even late in development. Agile processes harness change for the customer's competitive advantage. 
  • The most efficient and effective method of conveying information to and within a development team is face-to-face conversation.
  • Agile processes promote sustainable development. The sponsors, developers, and users should be able to maintain a constant pace indefinitely.
  • Build projects around motivated individuals. Give them the environment and support they need, and trust them to get the job done.


Now I don't want to be accused of studying the agile manifesto like theology, and I don't intend to say that this is some hidden truth.  In fact, I'm half afraid that after posting this "revelation" I will discover that it is old news and I have added nothing to the conversation.  Further, I recognize that you may not categorize the elements of the agile manifesto in the same way that I have here.

In this instance I refuse to be deterred from exploring this (if only for my own growth).  I have always known that relationships drive results.  It is only recently that I have chosen to analyze the agile movement in this simple way.  I find it reassuring to recognize that when viewed through the prism of these fundamental ingredients to success, the agile manifesto strikes me as concisely relevant on a subject which has produced countless tomes.

Uncovering better ways of doing it (relationships)
In my own quest to uncover better ways of developing software, I have been challenged by time, lack of talent, tools, and technology.  I work around them, compromise, seek assistance, and/or negotiate enough to overcome and produce results.

In contrast, I have seen nothing in my career that thwarts results like bad relationships.  For some reason, I just find these challenges more difficult to overcome.  Perhaps if I spent as much time working on this intangible skill as I do on my technical skills I would be less daunted.  Only when I uncover better ways of relating to people will I truly be capable of helping others do the same.

Thursday, December 16, 2010

Testing Legacy .NET apps with SpecFlow

I've always wanted a decent tool for end-to-end testing of Windows applications.  But now that BDD/ATDD is all the rage, I've become even more jealous that the Ruby crowd (the one to which I aspire) can use Cucumber to write executable requirements (to use Gojko's term).  That's right, to add insult to injury, Ruby developers don't just get a testing tool.  Instead, Cucumber allows the customer to communicate a feature by using specific examples of expected behavior.  Writing Cukes as the first step in playing a story card gives the developer a way to know when they have completed the story and done so properly.  Like unit tests, the fact that you end up with a suite of regression tests (with many end-to-end tests) is just icing on the cake.

SpecFlow is a tool for writing these Cucumber tests in .NET.  As of version 1.3, it installs and uses the "official" Gherkin parser, so your feature syntax can be identical to the Cukes written by those Ruby show-offs.  However, with SpecFlow your steps can be written in a .NET language like C#.

<digression>
Right now, all the Ruby geeks (yes, you Cheezy) are yelling, "WHY WOULD YOU WANT TO DO THAT?"  Now to be clear, it is definitely possible to use Cucumber and Ruby to test a .NET application. In fact, if I were testing a web application that is exactly what I would do.  However, controlling native Windows applications with Ruby produces some testing challenges.  Again, it is possible to interact with Windows applications from Ruby (using the win32 gems, for example), but I contend that SpecFlow is the simplest way to get started doing Cucumber testing for Windows apps, especially for a .NET development team and/or a .NET development shop.  But, as Cheezy constantly points out: If your long-term goal is to have a formal tester role writing your step definitions, then Ruby is a much easier language for those team members to learn and use.  Maybe IronRuby is an answer, but I'm not ready to invest in that yet.
</digression>

For the last 6 months, I've been using SpecFlow not for ATDD, but instead to test legacy .NET applications. The code fits Michael Feathers' definition of legacy code: no tests.  Certainly, I have been able to use the techniques in his book to introduce unit tests to this legacy code base, but using SpecFlow has proven equally helpful.  By writing tests which communicate how the existing features behave, I have been able to "reverse-engineer" the executable requirements.  Importantly, I can do so without any modifications to the code. This is by no means a silver bullet, but having these tests has certainly enabled me to refactor more aggressively.

In most cases, these applications are console applications or services.  More recently, I have written some tests that interact directly with classic WinForms applications.  I will post about the experience of using SpecFlow as a powerful enabler of refactoring legacy .NET code, including code examples which are re-creations of the challenges I've encountered.
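
Until that post, here is a rough sketch of the kind of step binding I mean for a console app. None of this is real client code; the exe path and step wording are made up. The core idea is just launching the process, capturing its output, and asserting on it:

using System.Diagnostics;
using NUnit.Framework;
using TechTalk.SpecFlow;

[Binding]
public class LegacyConsoleAppSteps
{
    // Hypothetical path to the legacy executable under test.
    private const string ExePath = @"C:\legacy\SomeLegacyTool.exe";

    private string _output;
    private int _exitCode;

    [When(@"I run the legacy tool with ""(.*)""")]
    public void WhenIRunTheLegacyToolWith(string arguments)
    {
        var startInfo = new ProcessStartInfo(ExePath, arguments)
        {
            RedirectStandardOutput = true,
            UseShellExecute = false,
            CreateNoWindow = true
        };

        using (var process = Process.Start(startInfo))
        {
            // Read stdout and wait for the process so the exit code is valid.
            _output = process.StandardOutput.ReadToEnd();
            process.WaitForExit();
            _exitCode = process.ExitCode;
        }
    }

    [Then(@"the tool output should contain ""(.*)""")]
    public void ThenTheToolOutputShouldContain(string expected)
    {
        StringAssert.Contains(expected, _output);
    }

    [Then(@"the tool exit code should be (\d+)")]
    public void ThenTheToolExitCodeShouldBe(int expected)
    {
        Assert.AreEqual(expected, _exitCode);
    }
}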

Friday, September 3, 2010

Aruba Steps List

Aruba Given-When-Then
(For my own reference)

Finally found somewhere else: http://cheat.errtheblog.com/s/aruba/

Given /^I am using rvm "([^"]*)"$/

Given /^I am using( an empty)? rvm gemset "([^"]*)"$/

Given /^I am using rvm gemset "([^"]*)" with Gemfile:$/
Given /^a directory named "([^"]*)"$/
Given /^a file named "([^"]*)" with:$/
Given /^an empty file named "([^"]*)"$/

When /^I write to "([^"]*)" with:$/
When /^I append to "([^"]*)" with:$/
When /^I cd to "([^"]*)"$/
When /^I run "(.*)"$/
When /^I successfully run "(.*)"$/

Then /^the output should contain "([^"]*)"$/
Then /^the output should not contain "([^"]*)"$/
Then /^the output should contain:$/
Then /^the output should not contain:$/
Then /^the output should contain exactly "([^"]*)"$/
Then /^the output should contain exactly:$/


# "the output should match" allows regex in the partial_output, if
# you don't need regex, use "the output should contain" instead since
# that way, you don't have to escape regex characters that
# appear naturally in the output
Then /^the output should match \/([^\/]*)\/$/
Then /^the output should match:$/
Then /^the exit status should be (\d+)$/
Then /^the exit status should not be (\d+)$/
Then /^it should (pass|fail) with:$/
Then /^the stderr should contain "([^"]*)"$/
Then /^the stdout should contain "([^"]*)"$/
Then /^the stderr should not contain "([^"]*)"$/
Then /^the stdout should not contain "([^"]*)"$/
Then /^the following files should exist:$/
Then /^the following files should not exist:$/
Then /^the following directories should exist:$/
Then /^the following directories should not exist:$/
Then /^the file "([^"]*)" should contain "([^"]*)"$/
Then /^the file "([^"]*)" should not contain "([^"]*)"$/

Monday, August 16, 2010

Error from misplaced behavior

I was recently faced with a strange problem at a client site and decided to share.  Perhaps the only value this will have is me simply venting.  However, I hope other developers will come across this entry and learn from my experience and we'll all benefit.  Isn't that the point of a blog? Venting, I mean.

So, we are using Visual Studio 2010 on a .NET Framework 2.0 application written in VB.NET.  I'll give you a minute to stop laughing.....

Anyway, an object was generating an exception, but I couldn't catch it in the debugger.  In other words, I had a breakpoint in the constructor, but it was never hit despite an exception being thrown: "Type initializer for 'SomeClass' threw an exception", or something very close to that.  I tried a few breakpoints, cleaned the solution, did a Rebuild All, restarted VS, and then decided something strange was going on.  So I went out to Bing (I'll give you another minute....) to look up how to get Visual Studio to break on exceptions even when they are being caught and handled.  I was pleased to see that my friend Steve Smith had written one of several blog entries describing how to do this.  As I had hoped, this helped me discover the problem, although simply being more observant would have done the same.

The code was something like this:

Public Class SomeClass

    ' This field initializer runs before the constructor body, so the
    ' exception was thrown before the breakpoint below could ever be hit.
    Private someFileLocation As String = SharedClass.SomeMethodThatThrewAnException() & "\folder\somefile.txt"

    Public Sub New()
        LooksLikeTheFirstLineOfCodeToRun() ' Breakpoint here wasn't hit under default VS settings
        ...
    End Sub

End Class

I hadn't seen (written... ahem) this kind of code in a long time.  In fact, as soon as I found what was going on, I thought, "Well, I could have smelled that."  However, I just didn't expect anything like this, and I didn't spend any time looking at the code above the constructor.  Enough said, really.

However, just to be thorough...

Variables should not be initialized in this way.  The exception cannot be caught within this class, which is a code smell.  One should immediately recognize that the behavior of determining "someFileLocation" is not encapsulated.  At the very least, I would expect something like:
Public Sub New()
    InitializeSomeFileLocation()
    ...
End Sub

Single Responsibility Principle and Dependency Inversion
More importantly, this class should not have the responsibility of initializing "someFileLocation".  This object should exist to provide a specific, narrow set of behaviors.  The behavior of creating the file location is a separate, distinct responsibility.  Further, this class is dependent upon the details of how the file location is determined, when it should be isolated from that detail.  The file location should have been passed into the object instead, and then I wouldn't have been forced to read Steve's blog.  :)
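
In C# terms (the real code was VB.NET, and these names are invented), I'd expect the dependency to be pushed out to the caller, something like:

public class SomeClass
{
    private readonly string someFileLocation;

    // The caller decides how the file location is built; this class no
    // longer depends on SharedClass, and any exception thrown while
    // building the path surfaces where it can actually be handled.
    public SomeClass(string someFileLocation)
    {
        this.someFileLocation = someFileLocation;
    }
}

// Wherever the object is created:
// var instance = new SomeClass(SharedClass.SomeMethodThatThrewAnException() + @"\folder\somefile.txt");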


So, I used to see this kind of code and say, "Well, you just don't do stuff like this!"  I would see it as symptomatic of a lazy developer saying, "Well, I need this file location, but constructing it is so simple, I'll just initialize an instance variable, what could go wrong?"  But now I realize that this is simply an obvious case of violating principles; others I have encountered (and created) are more subtle.

So, the next time you see something simple and ugly that obviously needs refactoring, take a few minutes to articulate to someone specifically why it is wrong.  It's good practice for those times when you smell something wrong, but it's not obvious why.

Saturday, May 22, 2010

No value in unit testing simple classes?

I recently read a blog entry by Steve Sanderson regarding selective unit testing.  I really appreciated both the candid treatment he gave the topic and the subsequent comments.  It really made me think about the habit I have developed of using TDD to develop everything (almost).  In fact, it is the "almost" that I strive to remedy.  I feel like I've written stinky code whenever I develop something that couldn't be TDD'd.  Some .NET framework classes are really hard to mock (no interface) or have so many dependencies that I simply can't bring myself to spend that much effort.  But, I digress.

One thing struck me about Steve's comments that I simply had to explore.  He stated that, for "trivial code with few dependencies...it doesn’t matter whether you unit test it or not."  He doesn't stress this position.  In fact, he states that he is perfectly OK with those of us who take the time to write tests for trivial code.

However, I still feel these tests offer significant value to the code base and to the developers who must work with it.  First, writing these tests benefits the developer:

  1. TDD is a practice which focuses the thinking on creating the simplest solution that can work one step at a time.  It helps me think out of the box and sometimes produces an even simpler solution than the one I started to jump to.  Eventually, I can break down harder problems because I have so much practice at it with easy problems.  As Kent Beck says in his TDD book, you should always be capable of moving at a slow pace when necessary.  TDD of a trivial problem is a kata!
  2. Existing tests can serve as a template for a new test to drive a design change or a defect correction.  Call me lazy, but when the previous dev already has gone to the trouble to create all of the test setup, mocking, etc. it can save me time when correcting a defect or extending a behavior.  Yes, these happen even in classes I considered to be trivial.
These tests offer immediate and long term value to the maintenance of the code base:

  1. I am firmly convinced that trivial code often becomes more complicated (perhaps over-complicated if we aren't careful).  For many systems I've worked on, it happens sooner than I think it will.  It is worth the small effort to write the tests to assist in the later refactoring.  Why not do it while the context of the original required behavior is fresh in your mind?
    • Yes, you could add tests after realizing they are needed. But that doesn't produce the same kind of tests and ignores the Red-Green-Refactor design cycle which I contend still has value for trivial problems.
    • Have you ever discovered that your "simple" class actually has too many responsibilities?  It is so much easier to refactor into multiple classes with tests on the behaviors. 
  2. Even if refactoring never happens:  My idea of trivial may not be the next dev's idea of trivial. Tests help that person understand what behavior I intended for the method as well as which edge cases I considered.
    • Yes, the source code expresses the actual behavior, but tests developed as part of TDD record what the developer thought the behavior was supposed to be. 
Finally, every piece of code I write starts off being simple (i.e., trivial) and eventually evolves into something more complicated.  I simply can't see myself waiting until a class reaches some arbitrary level of complexity before deciding to start writing tests.
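
To show the sort of "trivial" test I have in mind, here is a made-up but representative example.  It takes a couple of minutes to write, and it records exactly what I thought the behavior and edge cases were supposed to be:

using NUnit.Framework;

// A deliberately trivial class.
public static class NameFormatter
{
    public static string FullName(string first, string last)
    {
        if (string.IsNullOrEmpty(first))
            return last ?? string.Empty;
        if (string.IsNullOrEmpty(last))
            return first;
        return first + " " + last;
    }
}

[TestFixture]
public class NameFormatterTests
{
    [Test]
    public void Joins_first_and_last_with_a_single_space()
    {
        Assert.AreEqual("Ada Lovelace", NameFormatter.FullName("Ada", "Lovelace"));
    }

    [Test]
    public void Missing_first_name_returns_just_the_last_name()
    {
        Assert.AreEqual("Lovelace", NameFormatter.FullName(null, "Lovelace"));
    }

    [Test]
    public void Missing_both_names_returns_an_empty_string()
    {
        Assert.AreEqual(string.Empty, NameFormatter.FullName(null, null));
    }
}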