This is the seventh of 12 posts about the principles of agile software development. The purpose is to go back to the source of the agile manifesto (http://agilemanifesto.org/principles.html) and discuss the implementation of the 12 principles in real-life software engineering. The goals of agility are to deliver software of higher quality, faster, with higher acceptance by end-users, while following changing business requirements to gain competitive advantage.
The question is: does this work in practice, or is it just a nice marketing and sales story?
Principle 7: Working software is the primary measure of progress.
How do you measure progress in agile projects? The required functionality is not fixed, and the planning of the construction and delivery of these requirements is done by the team, at a very late stage. This is something traditional project managers have a hard time coping with. They think it is impossible to control a project with an unclear outcome and a planning that is based upon a work backlog and the duration of a sprint (iteration).
The fundamental measure of progress is measuring things that are finished. Software (in our case) is finished when it is successfully tested and delivered. Software progress is not “80% done coding”. Software is finished when it is tested and accepted by the (key) end-user. By adding the end-user to the project team, he/she will be motivated to test and accept the delivered artefact quickly.
Working software vs. according to specs
Please note that there is a difference between "working software" and "according to specifications". Software can be perceived as working by the end-users while not being according to specs. In many projects, developers spend a lot of time on details that do not add value for the end-user, only because they are stated in the specifications (which were often written months ago). Developer and end-user, working together on a day-to-day basis, jointly agree on the functionality and the point when it is finished. They define the point where the "working software" is delivered!
But when is the developer finished? Every developer has a compulsion to write the most perfect, re-usable, standardised piece of software that implements all the new cutting-edge frameworks and complies with the latest architectural standards. But progress has to be made. This implies that the developer has to work pragmatically and stop when the minimal requirement-goals are met.
When do you stop coding?
The trick here is that you have to know what "working" means. There must be acceptance criteria, a thorough test-script and a dataset to test with. Developers, together with the end-users (assisted by testers), define the test-script and the test-set. The end-user specifies the tests and they jointly create the test-set. In this case "working software" can be measured: the software works when the test-set is correctly processed by the application. The first time this point is reached, the developer is done coding!
New features, bugs and improvements first have an effect on the test-set. The additional test, used to test the new feature, is added to the test-set and the developer can stop coding when this additional test is processed successfully.
This puts a healthy brake on the imagination of the creative developer who is inclined to invent new functionality, additional checks and nice features. He knows when he is finished. The code is done when the test is successful, not when every check or validation he can imagine is implemented.
The feature is finished when the tests are successful. This is a binary moment and avoids the 90%-finished syndrome. Now the answer is simple: the feature is either working or it is not!
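To make this concrete, here is a minimal sketch of such an agreed test-set in Python's standard `unittest` module. The feature (an order discount) and its acceptance criteria are invented for illustration; the point is that "done" is the moment this suite passes, nothing more.

```python
import unittest

# Hypothetical feature, agreed between developer and end-user:
# orders of 100 or more get a 10% discount, smaller orders get none.
def order_discount(amount):
    """Return the discount for an order, as fixed in the acceptance criteria."""
    return round(amount * 0.10, 2) if amount >= 100 else 0.0

class OrderDiscountAcceptanceTest(unittest.TestCase):
    """The agreed test-set: when these pass, the feature is 'working'."""

    def test_no_discount_below_threshold(self):
        self.assertEqual(order_discount(99.99), 0.0)

    def test_discount_at_threshold(self):
        self.assertEqual(order_discount(100), 10.0)

    def test_discount_above_threshold(self):
        self.assertEqual(order_discount(250), 25.0)

# Run the suite and report the binary verdict: working or not.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(OrderDiscountAcceptanceTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("feature done" if result.wasSuccessful() else "still coding")
```

A new feature or bug fix would first add a test method here; coding stops the moment the extended suite goes green again.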
Continuous build and regression testing
Another advantage of this way of working is test automation. The test-scripts and test-sets can be used to feed test automation software, so you can execute the tests automatically. A build automation tool (Continuum, Hudson) makes it possible to re-test your whole application every time a developer changes the codebase. This way you have regression testing implemented across your whole system.
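The regression idea itself fits in a few lines. In the sketch below the "suite" is just a list of plain assertion functions (the names and checks are placeholders, not a real build-server API); a build tool such as Hudson essentially does the same per commit, only against the full application.

```python
# Placeholder checks standing in for the real agreed test-set.
def check_login():
    assert 1 + 1 == 2

def check_checkout():
    assert "cart".upper() == "CART"

def check_invoice_total():
    assert sum([10, 20]) == 30

REGRESSION_SUITE = [check_login, check_checkout, check_invoice_total]

def run_regression(suite):
    """Run every test on every change; return the names that passed and failed."""
    passed, failed = [], []
    for test in suite:
        try:
            test()
            passed.append(test.__name__)
        except AssertionError:
            failed.append(test.__name__)
    return passed, failed

passed, failed = run_regression(REGRESSION_SUITE)
print(f"{len(passed)} passed, {len(failed)} failed")
```

Because every old test is re-run on every change, a new feature that silently breaks an existing one is caught immediately instead of at the end of the project.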
How to measure progress?
The only way to measure progress is to measure the amount of work to-be-done. For every sprint (iteration) the team agrees upon a certain backlog of work: the amount of work the team thinks it can deliver in the sprint (e.g. a month). The amount of work to-be-done is the backlog, and it can be measured at any time during the sprint.
When you plot the amount of work to-be-done on a day-to-day basis you get a good overview of the progress. This is called a burn down chart. Below you will find a sample of a burn down chart.
You can see in the graph above that progress halted around the 15th and the 27th of May. Both delays were eliminated by the end of the sprint.
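The data behind such a chart is nothing more than one number per day. The sketch below uses made-up figures for a 10-day sprint; the two flat stretches play the role of the stalled days in the sample chart, and the "ideal" line is the straight burn-down from full backlog to zero.

```python
# Remaining work per sprint day (illustrative numbers, not real project data).
sprint_days = list(range(1, 11))                          # a 10-day sprint
remaining   = [100, 92, 85, 85, 70, 60, 60, 45, 20, 0]    # backlog left per day

# Ideal straight-line burn down: from the full backlog to zero on the last day.
ideal_per_day = remaining[0] / (len(sprint_days) - 1)

for day, left in zip(sprint_days, remaining):
    ideal = remaining[0] - ideal_per_day * (day - 1)
    bar = "#" * (left // 5)                               # crude text chart
    print(f"day {day:2d}  remaining {left:3d}  (ideal {ideal:5.1f})  {bar}")
```

Days 3-4 and 6-7 show no burn-down at all (a halt in progress), yet the team still reaches zero by the end of the sprint, just like in the sample chart.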
Changes in progress in different sprints
Now we have a means to measure progress within a sprint, and we can compare different sprint-runs with each other. We would expect the team to improve its performance during the project: team members become more aligned with each other and the whole development machine gets up to speed. The team's "velocity" becomes the means to measure improvement in achieving the expected progress (or even exceeding it).
At the beginning of a sprint the team agrees upon the amount of work it can process during the sprint. At the end of the sprint the actual work done is measured. During the evaluation of every sprint, the things that went well are kept and the things that hampered performance are eliminated. The team strives for a higher velocity in the next sprint.
| Sprint 1: 20 days | Sprint 2: 20 days |
| --- | --- |
| Work planned: 100 | Work planned: 90 (reduced due to the lower velocity last sprint) |
| Velocity = 4 (80/20) | Evaluation: improved test-data works! |
| Evaluation: testing was difficult (and cost a lot of time) due to poor test data. Improve test-data next sprint. | |
The velocity is the measure of the productivity of the team and helps the team to improve the estimation of the next sprint. The beauty of this method is the continuous evaluation. As opposed to the traditional hours-per-function-point figure, set once a year, the team can adjust its performance every sprint by evaluating the data of the last sprint. Progress in the team's collaboration and productivity can easily be measured in this way.
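The bookkeeping from the table above can be sketched in a few lines. The numbers come straight from the example (80 units done in a 20-day sprint); using the measured velocity as the starting point for the next sprint's plan is the idea, not a hard formula.

```python
def velocity(work_done, sprint_days):
    """Velocity = work actually done divided by sprint length."""
    return work_done / sprint_days

# Sprint 1 from the example: 100 planned, 80 done in 20 days.
v1 = velocity(80, 20)
print(v1)  # 4.0

# A measured-velocity baseline for planning sprint 2 (also 20 days).
# In the example the team planned 90: slightly above this baseline,
# betting that the improved test-data would pay off.
baseline = v1 * 20
print(baseline)  # 80.0
```

Because velocity is re-measured every sprint, the estimate self-corrects: an optimistic plan in one sprint lowers the measured velocity and pulls the next plan back toward reality.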
The only way to measure progress is "working software". This is the only point at which the end-user is able to measure the progress of the development team. Working software is defined as tested software that delivers value to the end-user. Good test-scripts and test-data are crucial in supporting this process, and these test-scripts can also be used for (regression) test automation.
During the sprint (or even the project) you can measure progress by plotting the work to-be-done in a burn down chart. The improvement of the team’s performance can be measured via the velocity.
So managing an agile project is not like driving a car blindfolded at night. There are good tools to measure and report progress. And if you are successful, you can report more progress than projects executed with traditional methods. Good luck!
Other posts about the AGILE Principles (soon to come):
- Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.
- Welcome changing requirements, even late in development. Agile processes harness change for the customer’s competitive advantage.
- Deliver working software frequently, from a couple of weeks to a couple of months, with a preference to the shorter timescale.
- Business people and developers must work together daily throughout the project.
- Build projects around motivated individuals. Give them the environment and support they need, and trust them to get the job done.
- The most efficient and effective method of conveying information to and within a development team is face-to-face conversation.
- Working software is the primary measure of progress.
- Agile processes promote sustainable development. The sponsors, developers, and users should be able to maintain a constant pace indefinitely.
- Continuous attention to technical excellence and good design enhances agility.
- Simplicity–the art of maximizing the amount of work not done–is essential.
- The best architectures, requirements, and designs emerge from self-organizing teams.
- At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behavior accordingly.