Automation of release notes in agile projects
My team works in short iterations at the end of which we should be able to release a new version of the software with added value.
To facilitate the generation of release notes, another team I was observing uses markup in their version control commit messages: they add [R] to a commit message to denote the addition of new features to the repository.
I thought it was a good idea and introduced it as a version control standard in my team.
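A minimal sketch of how such markers could be harvested into a release-note draft, assuming the commit messages are available as plain text lines (the function name and sample log are hypothetical, not our actual tooling):

```python
import re

def release_notes(commit_messages):
    """Return the text of every commit message tagged with [R]."""
    notes = []
    for message in commit_messages:
        # A message counts as a release-note entry if it starts with [R].
        match = re.match(r"\s*\[R\]\s*(.+)", message)
        if match:
            notes.append(match.group(1))
    return notes

log = [
    "[R] Add CSV export for invoices",
    "WIP: refactor parser",
    "[R] Support login via LDAP",
]
print(release_notes(log))
```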
After a couple of iterations, there weren't many [R] markers in the commit messages.
We sometimes found ourselves shying away from adding an [R] to a commit message because we were not always sure whether a feature was done. We also commit often, I'd say compulsively, at various steps of our work: it's easy to commit the last chunk without realizing it's the last. It is also easy to mark a commit with [R] when some files for that chunk of code are accidentally missing from the commit.
After further investigation, I discovered a difference in branching strategy between the two teams:
In my team we branch on an ad hoc basis when we are about to start a risky task, but keep working on trunk for work with limited scope and impact. The other team systematically creates a branch for ANY new piece of work, which means that merging a branch back to trunk (after running the test suite on that branch) defines what a completed feature is, and the [R] marker tends to appear on the commit messages of merges.
Since we are not applying a branch-per-feature policy, we need another way to identify the tasks done in an iteration. What we do have, though, is acceptance-test-driven development.
It starts with a user story, then the associated acceptance tests. We use FIT to write the acceptance test tables and write the fixture code for the automated tests. The FIT infrastructure is hosted in FitNesse, a wiki-on-steroids with an integrated FIT runner.
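For readers who haven't seen one, a FIT table looks roughly like this on a FitNesse page (this is the classic Division example from the FitNesse documentation, not one of our actual stories): the first row names the fixture class, the second row names the inputs and the expected output, and each following row is one test case.

```
|eg.Division|
|numerator|denominator|quotient?|
|10|2|5.0|
|12.6|3|4.2|
```

When the page is run, FitNesse colors each `quotient?` cell green or red depending on whether the fixture code returned the expected value.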
Each user story is an entry in the wiki, and FitNesse lets you create virtual wiki links between entries, which makes it easy to build indexes (updated dynamically or statically).
We already use these facilities to group all completed and validated stories on a page called RunningTestedFeature, which is then executed as a suite as part of our continuous integration process for regression testing.
The interesting bit is that, in a similar way, for each iteration we can also create a page linking to all the stories planned for that iteration, and execute that suite at the end of the iteration. The stories whose acceptance tests pass become the bullet points in the release notes if we decide to release.
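The step from test results to release notes can be sketched as follows; the data structure pairing each story with a pass/fail flag is an illustrative assumption, not our real FitNesse output format:

```python
def release_note_bullets(story_results):
    """Keep only the stories whose acceptance tests all passed."""
    return ["* " + story for story, passed in story_results if passed]

# Hypothetical iteration results: (story name, acceptance tests passed?)
iteration_results = [
    ("Export invoices as CSV", True),
    ("LDAP login", True),
    ("Bulk delete", False),  # tests still failing, so not released
]
for bullet in release_note_bullets(iteration_results):
    print(bullet)
```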
And there is an added bonus: keeping these iteration index pages over time helps with velocity calculations, as it becomes easy to count how many stories were completed per iteration.
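As a sketch of that bonus, assuming we can recover one list of completed stories per iteration from the kept index pages (the story names below are placeholders), velocity is then just an average:

```python
def velocity(iterations):
    """Average number of completed stories per iteration."""
    return sum(len(stories) for stories in iterations) / len(iterations)

# One list of completed stories per past iteration, as read off the
# iteration index pages kept in the wiki (names are illustrative).
history = [
    ["story A", "story B", "story C"],             # iteration 1
    ["story D", "story E"],                        # iteration 2
    ["story F", "story G", "story H", "story I"],  # iteration 3
]
print(velocity(history))  # 9 stories over 3 iterations
```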