Archive for the ‘Agile Performance Management’ Category
Standish Group Chaos Reports Revisited
The Standish Group “Chaos” reports have been mentioned in various posts in this blog and elsewhere. The following figure from the 2002 study is quite representative of the data provided in the Standish annual surveys of the state of software projects:
The January/February 2010 issue of IEEE Software features an article entitled The Rise and Fall of the Chaos Report Figures. The authors – J. Laurenz Eveleens and Chris Verhoef of the VU University, Amsterdam – give the following summary of their findings:
In 1994, Standish published the Chaos report that showed a shocking 16 percent project success. This and renewed figures by Standish are often used to indicate that project management of application software development is in trouble. However, Standish’s definitions have four major problems. First, they’re misleading because they’re based solely on estimation accuracy of cost, time, and functionality. Second, their estimation accuracy measure is one-sided, leading to unrealistic success rates. Third, steering on their definitions perverts good estimation practice. Fourth, the resulting figures are meaningless because they average numbers with an unknown bias, numbers that are introduced by different underlying estimation processes. The authors of this article applied Standish’s definitions to their own extensive data consisting of 5,457 forecasts of 1,211 real-world projects, totaling hundreds of millions of Euros. The Standish figures didn’t reflect the reality of the case studies at all.
I will leave it to the reader to draw his or her own conclusions with respect to the differences between the Standish Group and the authors. I would, however, quote Jim Highsmith‘s deep insight on the value system within whose context we measure performance. The following excerpt is from Agile Project Management: Creating Innovative Products:
If we are ultimately to gain the full range of benefits of agile methods, if we are ultimately to grow truly agile, innovative organizations, then, as these stories show, we will have to alter our performance management systems…. We have to be as innovative with our measurement systems as we are with our development methodology.
See pp. 335-358 of Jim’s book for details on transforming performance management systems. His bottom line is deceptively simple:
The Standish data are NOT a good indicator of poor software development performance. However, they ARE an indicator of systemic failure of our planning and measurement processes.
Jim is referring to the standard definition of project “success” – on time, on budget, all specified features.
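To make the critique concrete, here is a minimal sketch in Python of a Standish-style classification driven purely by one-sided estimation accuracy. The threshold logic, field names and numbers are my own illustrative assumptions, not Standish’s actual methodology; the point is simply that a team which pads its estimates is always “successful” under such a definition, which is exactly the perverse incentive Eveleens and Verhoef describe.

```python
from dataclasses import dataclass

@dataclass
class Project:
    name: str
    est_cost: float            # estimated cost
    actual_cost: float         # actual cost
    est_months: float          # estimated duration
    actual_months: float       # actual duration
    features_delivered: float  # fraction of specified features, 0.0 - 1.0

def standish_style_classification(p: Project) -> str:
    """One-sided check: a project is 'successful' only if it never exceeds
    its estimates and delivers all specified features. Coming in far *under*
    an inflated estimate still counts as success."""
    if (p.actual_cost <= p.est_cost
            and p.actual_months <= p.est_months
            and p.features_delivered >= 1.0):
        return "successful"
    if p.features_delivered > 0.0:
        return "challenged"  # finished, but over budget/time or with fewer features
    return "failed"

# A wildly padded estimate "succeeds"; an honest, slightly optimistic one does not.
padded = Project("padded", est_cost=10_000_000, actual_cost=4_000_000,
                 est_months=24, actual_months=11, features_delivered=1.0)
honest = Project("honest", est_cost=4_000_000, actual_cost=4_400_000,
                 est_months=11, actual_months=12, features_delivered=1.0)

for p in (padded, honest):
    print(p.name, "->", standish_style_classification(p))
```

Steering on such a definition rewards padding the estimate rather than estimating well, which is why the resulting “success” figures say little about actual development performance.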
Later this month I will be working with a client to carry out the performance management ideas articulated by Jim. Jim indicated he has a customer engagement in February where he expects to learn about interesting ways in which the client is using the Agile Triangle (which is conceptually quite related to the fundamental question of what to measure). Client confidentiality permitting, I am confident we will soon be able to brief readers of The Agile Executive on our progress.
Cutter’s Technical Debt Assessment and Valuation Service
The Cutter Consortium has announced the availability of the Technical Debt Assessment and Valuation Service. The service combines static code analytics with dynamic program analytics to give the client “x-rays” of the software being examined at any desired granularity – from the whole project portfolio to a single instruction. It breaks down technical debt into the areas of coverage, complexity, duplication, violations and comments. Clients get an aggregate dollar figure for “paying back” debt that they can then plug into their financial models to objectively analyze their critical software assets. Based on these metrics, they can make the best decisions about their ongoing strategy for the software development effort under scrutiny.
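Cutter has not published the service’s internal formulas, so the sketch below is purely illustrative: the per-item remediation efforts, the hourly rate and the metric names are assumptions of mine, not Cutter’s. It merely shows the general shape of such a roll-up, converting per-category counts from code analysis into an aggregate “pay back” dollar figure that can then be plugged into a financial model.

```python
from typing import Dict

# Illustrative only: the categories mirror those named in the press release
# (coverage, complexity, duplication, violations, comments), but the unit
# costs and conversion rules below are hypothetical assumptions, not Cutter's.

HOURLY_RATE = 100.0  # assumed blended cost of an engineering hour, in dollars

# Assumed remediation effort, in hours, per unit of each debt category.
EFFORT_PER_ITEM = {
    "uncovered_lines": 0.1,     # write tests for untested lines (coverage)
    "complex_methods": 2.0,     # refactor methods above a complexity threshold
    "duplicated_blocks": 1.5,   # consolidate duplicated code blocks
    "violations": 0.5,          # fix static-analysis rule violations
    "undocumented_apis": 0.25,  # add missing comments/documentation
}

def technical_debt_dollars(metrics: Dict[str, int]) -> float:
    """Roll per-category counts up into one aggregate 'pay back' figure."""
    hours = sum(EFFORT_PER_ITEM[k] * count for k, count in metrics.items())
    return hours * HOURLY_RATE

# Example: counts as they might come out of static/dynamic code analysis.
portfolio_metrics = {
    "uncovered_lines": 120_000,
    "complex_methods": 450,
    "duplicated_blocks": 800,
    "violations": 5_200,
    "undocumented_apis": 1_900,
}

print(f"Estimated technical debt: ${technical_debt_dollars(portfolio_metrics):,.0f}")
```

The value of the aggregate figure is not its precision but its currency: once debt is expressed in dollars, it can be weighed against cost and value in the same financial model as any other investment decision.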
This new service is an important addition to the enlightened software governance framework that Jim Highsmith, Michael Mah and I have been thinking about and contributing to for some time now (see Beyond Scope, Schedule and Cost: Measuring Agile Performance and Quantifying the Start Afresh Option). The heart of both the technical debt service and the enlightened governance framework is captured by the following words from the press release:
By boiling down technical debt to dollars and tying it to cost and value, the service enables a metrics-driven governance framework for the use of five major constituencies, as follows:
It should finally be pointed out that the technical debt assessment service and the governance framework it enables are applicable to any software method. They can be used to:
Forthcoming Cutter Executive Reports, Executive Updates and Email Advisors on the technical debt service are restricted to Cutter clients. As appropriate, I will publish the latest and greatest news on the subject in the Cutter Blog (which is an open forum I highly recommend).
Acknowledgements: I would like to wholeheartedly thank the following colleagues for inspiring, enlightening and supporting me during the preparation of the service:
Written by israelgat
May 5, 2010 at 4:40 am
Posted in Agile Performance Management, Companies, Events, Performance Measurement, Software Costs
Tagged with Anne Mullaney, CEO, Chris Sterling, Cindy Swain, CIO, Comments, Complexity, Coverage, CTO, Cutter Consortium, Duplication, Governance Framework, Industry Norms, Israel Gat, IT Operations, Jennifer Flaxman, Jim Highsmith, John Heintz, Jonathon Golden, Kara Letourneau, Karen Coburn, Ken Collier, Kim Leonard, M&A, Michael Mah, NPV, Paying Back, Software Method, Technical Debt, Valuation, Venture Capitalist, Violation