Posts Tagged ‘The Agile Triangle’
Schedule Constraints in the Devops Triangle
Last week’s post “The Devops Triangle” demonstrated the extension of Jim Highsmith‘s Agile Triangle to devops. The extension relied on adding compliance to the three traditional constraints of software development: scope, schedule, cost. A graphical representation of this extension is given in Figure 1.
Figure 1: Compliance as the Fourth Constraint in Devops Projects
This blog post examines how time/schedule should be governed in the devops context. It does so by building on the concluding observation in the previous post:
The Devops Triangle and the corresponding Tradeoff Matrix demonstrate how governance a la Agile can be extended to devops projects as far as compliance goes. The proposed governance framework however is incomplete in the following sense: schedule in devops projects can be a much more granular and stringent constraint than schedule in “dev only” projects.
For the schedule constraint in devops, I propose a schedule set. It consists of four components:
- Lead Time or Engineering Time
- Time to change
- Time to deploy
- Time to roll back
Lead Time/Engineering Time: These are customary metrics used in Kanban software development, as demonstrated in Figure 3.
Figure 3: The Engineering Time Metric Used by the BBC (David Joyce in the LSSC10 Conference)
Time to change: The amount of time it takes for the various stakeholders (e.g., dev, test, ops, customer support) to review the code to be deployed, approve its deployment and assign a time window for the deployment.
Time to deploy: The amount of time from (metaphorically speaking) pushing the Deploy “button” to completion of deployment.
Time to roll back: The amount of time it takes to undo a deployment. (However rigorous the engineering practices and IT processes might be, the time to roll back a deployment cannot be ignored; it is a critical risk parameter.)
A graphical representation of these four schedule metrics together with the Devops Triangle is given in the figure below:
Figure 4: The Devops Triangle with a Schedule Set
Using hours as the common unit of measure, a typical schedule set could be {100, 48, 3, 2}. In this hypothetical example, it takes a little over 4 days to carry out the development of the code increment, 2 days to get approval for the change, 3 hours to deploy the code, and 2 hours to roll back.
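To make the schedule set a little more tangible, here is a minimal Python sketch of my own (not from the original post); the field names, the target figures and the violation check are assumptions chosen purely for illustration:

```python
from dataclasses import dataclass

@dataclass
class ScheduleSet:
    """The four devops schedule components, all in hours (illustrative names)."""
    engineering_time: float   # lead time to develop the code increment
    time_to_change: float     # review, approval and deployment-window assignment
    time_to_deploy: float     # pushing the Deploy "button" to completion
    time_to_roll_back: float  # undoing the deployment

    def violations(self, targets: "ScheduleSet") -> list[str]:
        """Return the components that exceed their governance targets."""
        return [
            name
            for name in ("engineering_time", "time_to_change",
                         "time_to_deploy", "time_to_roll_back")
            if getattr(self, name) > getattr(targets, name)
        ]

# The hypothetical schedule set {100, 48, 3, 2} from the text:
current = ScheduleSet(100, 48, 3, 2)
targets = ScheduleSet(120, 24, 4, 4)   # assumed targets, for illustration only
print(current.violations(targets))     # -> ['time_to_change']
```

In this sketch the approval step is the one breaching its (assumed) target, which is typically where value stream mapping would direct attention first.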
Whatever your specific schedule numbers might be, it is highly recommended that you apply value stream mapping (see Figure 5 below) to your schedule set. Based on the findings of the value stream mapping, apply statistical process control methods like those illustrated in Figure 3 to continuously improve both the means and the variances of the four schedule components (a minimal sketch follows Figure 5).
Figure 5: An Example of Value Stream Mapping (Source: Wikipedia entry on the subject)
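As a rough illustration of the statistical process control step, here is a minimal sketch assuming simple 3-sigma limits on a single schedule component; the sample data is hypothetical, and a production-grade control chart (such as the one in Figure 3) would use moving ranges and rational subgrouping:

```python
import statistics

def control_limits(samples: list[float]) -> tuple[float, float, float]:
    """Mean and simple 3-sigma control limits for one schedule component."""
    mean = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    return mean, mean - 3 * sigma, mean + 3 * sigma

# Hypothetical weekly observations of "time to change", in hours:
time_to_change = [48, 52, 45, 60, 47, 51, 49]
mean, lcl, ucl = control_limits(time_to_change)
out_of_control = [x for x in time_to_change if not (lcl <= x <= ucl)]
print(f"mean={mean:.1f}h, limits=({lcl:.1f}, {ucl:.1f}), outliers={out_of_control}")
```

The point of the exercise is not the particular limits but the discipline: each of the four schedule components gets its own baseline, and improvement work targets both the mean and the spread.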
The Devops Triangle
The Agile Triangle was introduced by Jim Highsmith as an antidote to the Iron Triangle. Instead of balancing development between cost, schedule and scope, the Agile Triangle strives to strike a balance between value, quality and constraints:
Figure 1 – The Agile Triangle (based on Figure 1-3 in Agile Project Management: Creating Innovative Products.)
Consider the Agile Triangle in the context of devops. Value, quality and constraints apply to IT operations as meaningfully as they apply to software development. IT can go beyond cost, schedule and scope to focus on value and quality just as the Agile software development team does. Between development and operations the specific tasks to be carried out change, but the principles embodied in the triangle remain invariant.
In addition to cost, schedule and scope, devops projects must cope with another constraint: compliance. For example, a bank that implements a ‘follow the sun’ strategy with respect to trading must finish reconciling transactions that took place in London before the start of trading on Wall Street. From the bank’s point of view, its IT department needs to be mindful of four constraints: compliance, cost, schedule and scope. This view is represented in Figure 2 below.
Figure 2 – The Devops Triangle
Balancing the four constraints – compliance, cost, schedule, and scope – is not a trivial task. However, just like the Agile Triangle, the Tradeoff Matrix used in Agile software development applies to IT. In its software development variant, the Tradeoff Matrix is an effective tool for deciding between conflicting constraints, as follows:
Table 1 – Tradeoff Matrix (based on Table 6-1 in Agile Project Management: Creating Innovative Products.)
For devops, the matrix is extended to include a compliance row and a Reluctantly Accept column as follows:
Table 2 – Tradeoff Matrix for Devops
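To give a feel for how the extended matrix might be used, here is a minimal sketch of my own; the decision levels mirror the column headings, while the specific assignment is invented for the ‘follow the sun’ bank example and is not taken from Table 2:

```python
from enum import Enum

class Tradeoff(Enum):
    """Decision levels in the Tradeoff Matrix; Reluctantly Accept is the
    column added for devops."""
    FIXED = "Fixed"
    FLEXIBLE = "Flexible"
    ACCEPT = "Accept"
    RELUCTANTLY_ACCEPT = "Reluctantly Accept"

# An illustrative assignment: reconciliation compliance cannot move,
# so the other constraints have to give to varying degrees.
devops_tradeoffs = {
    "compliance": Tradeoff.FIXED,      # row added for devops
    "schedule": Tradeoff.FLEXIBLE,
    "scope": Tradeoff.ACCEPT,
    "cost": Tradeoff.RELUCTANTLY_ACCEPT,
}
```

The value of writing the matrix down, in whatever form, is that the tradeoff decision is made explicitly and visibly rather than being renegotiated in the heat of a deployment.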
The Devops Triangle and the corresponding Tradeoff Matrix demonstrate how governance a la Agile can be extended to devops projects as far as compliance goes. The proposed governance framework however is incomplete in the following sense: schedule in devops projects can be a much more granular and stringent constraint than schedule in “dev only” projects. The subject of schedule constraints in devops projects will be addressed in a forthcoming post.
How Many Metrics do You Need to Effectively Govern the Software Process?
A Simple Metrics-Driven Software Governance Framework Based on Jim Highsmith’s Agile Triangle Framework
In my recent Cutter Blog post entitled Three Governance Metrics I recommended using just three metrics:
- Value
- Cost
- Technical debt
The heart of this recommendation is that all three can be expressed in dollar terms, as depicted in the figure above. An apples-to-apples comparison is made through the common denominator: $$. For example, something is likely to be wrong, either technically, methodologically or governance-wise, if the technical debt figure exceeds the cost figure for a prolonged period of time. One can characterize such a situation as accruing debt faster than building equity.
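As a back-of-the-envelope sketch of that apples-to-apples comparison (my own illustration, with hypothetical quarterly figures):

```python
def governance_snapshot(value_usd: float, cost_usd: float,
                        technical_debt_usd: float) -> dict:
    """Compare the three dollar-denominated metrics on a like-for-like basis."""
    return {
        "value_usd": value_usd,
        "cost_usd": cost_usd,
        "technical_debt_usd": technical_debt_usd,
        "debt_to_cost_ratio": technical_debt_usd / cost_usd,
        # The warning sign from the text: accruing debt faster than building equity.
        "warning": technical_debt_usd > cost_usd,
    }

# Hypothetical quarterly figures, in dollars:
print(governance_snapshot(value_usd=2_500_000,
                          cost_usd=900_000,
                          technical_debt_usd=1_100_000))
```

Because everything is in dollars, the comparison needs no conversion factors and no auxiliary framework to interpret it.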
I am often asked about adding metrics to this simple governance framework. For example, should not productivity be included in the framework?
‘Less is more’ is my usual response to such questions. IMHO value, cost and technical debt address the most important high level governance considerations:
- Value –> Why are we doing the project?
- Cost –> Can we afford the project?
- Technical debt –> Is the execution risk acceptable?
Please pay special attention to the unit of measure of any metric you might add to this simple governance framework. As long as the metric is a dollar-based metric, the cohesion of the governance framework can be maintained. However, metrics which are not expressed in dollars will probably superimpose other frameworks on top of the simple governance framework. For example, you introduce a programming framework if you add a productivity metric which is measured in function points per man month. Sponsors who govern using value, cost, technical debt and productivity will need to mentally alternate between the simple governance framework and the programming framework whenever they try to combine the productivity metric with any of the other three metrics.
A Core Formula for Agile B2C Startups
Colleague Chris Sterling drew my attention to a Pivotal Labs talk by Nathaniel Talbott on Experiment-Driven Development (EDD). It is a forward-looking think piece, focused on development helping the business make decisions based on actual A/B testing data. Basically, EDD is to the business what TDD is to development.
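For readers who want a feel for what “deciding based on actual A/B testing data” can look like, here is a hedged sketch of a two-variant comparison using a simple two-proportion z-test; the feature, sample sizes and conversion counts are entirely made up and are not from the talk:

```python
from math import sqrt

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Z statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Made-up experiment: signup flow A (control) vs. B (the candidate feature)
z = two_proportion_z(conv_a=480, n_a=10_000, conv_b=540, n_b=10_000)
print(f"z = {z:.2f}; |z| > 1.96 suggests a real difference at ~95% confidence")
```

The statistics can be as simple or as sophisticated as the startup needs; the essential point is that the feature decision is driven by measured customer behavior.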
Between this talk and a recent discussion with Columbia’s Yechiam Yemini on his Principle of Innovation and Entrepreneurship course, a core “formula” for Agile B2C startups emerges:
- Identify a business process P
- Create a minimum viable Internet service S to support P
- Apply EDD to S on just about any feature decision of significance
This core formula can be easily refined and extended. For example:
- Criteria for choosing P could/should be established
- Other kinds of testing (in addition to or instead of A/B testing) could be done
- A customer development layer could be added to the formula
- Many others…
By following this formula, a startup can implement the Agile Triangle depicted below in a meaningful manner. Value is validated: it is determined based on real customer feedback rather than through conjectures, speculations or ego trips.
Figure 1 – The Agile Triangle (based on Figure 1-3 in Jim Highsmith‘s Agile Project Management: Creating Innovative Products)
The quip “The voice of the people is the voice of God” has long been a tenet of musicians. The “formula” described above enables the Agile B2C startup to capture the voice of the people and thoughtfully act on it to accomplish business results.
Use the Agile Triangle Instead of the Balanced Scorecard
As the name implies, the Balanced Scorecard strives to strike a balance between various performance measures. When Financial, Customer, Business Processes and Learning and Growth measures are presented together, as in Figure 1 below, the Balanced Scorecard allows managers to view the company from several perspectives at once.
Figure 1 – The Balanced Scorecard (source: Trump University)
Likewise, the Agile Triangle depicted in Figure 2 presents in a single “dashboard” the three dimensions critical to Agile performance measurement: Value, Quality and Constraints. Just as in the Balanced Scorecard, it is easy to see imbalances between the three, to respond to them and to restore balance. For example, the tendency to produce more and more lines of code is held in check through the quality metrics.
Figure 2 – The Agile Triangle (based on Figure 1-3 in Jim Highsmith‘s Agile Project Management: Creating Innovative Products.)
My recommendation to clients who do Agile as a strategic initiative is to drop the Balanced Scorecard and use the Agile Triangle instead. There is precious little, if any, to be gained by using the two in parallel. As a matter of fact, one could easily interfere with the other.
The Learning and Growth dimension of the Balanced Scorecard, which does not explicitly show in the Agile Triangle, is, of course, important. As part of an Agile initiative I would expect Agile proficiency to be closely observed. However, I would not include it explicitly in a system based on the Agile Triangle. Agile proficiency is not an end in itself. If the outputs and outcomes we measure through the Agile Triangle are unsatisfactory over a prolonged period of time, a close examination of the way Agile is practiced is called for.
Standish Group Chaos Reports Revisited
The Standish Group “Chaos” reports have been mentioned in various posts in this blog and elsewhere. The following figure from the 2002 study is quite representative of the data provided in the Standish annual surveys of the state of software projects:
The January/February 2010 issue of IEEE Software features an article entitled The Rise and Fall of the Chaos Report Figures. The authors – J. Laurenz Eveleens and Chris Verhoef of the VU University, Amsterdam – give the following summary of their findings:
In 1994, Standish published the Chaos report that showed a shocking 16 percent project success. This and renewed figures by Standish are often used to indicate that project management of application software development is in trouble. However, Standish’s definitions have four major problems. First, they’re misleading because they’re based solely on estimation accuracy of cost, time, and functionality. Second, their estimation accuracy measure is one-sided, leading to unrealistic success rates. Third, steering on their definitions perverts good estimation practice. Fourth, the resulting figures are meaningless because they average numbers with an unknown bias, numbers that are introduced by different underlying estimation processes. The authors of this article applied Standish’s definitions to their own extensive data consisting of 5,457 forecasts of 1,211 real-world projects, totaling hundreds of millions of Euros. The Standish figures didn’t reflect the reality of the case studies at all.
I will leave it to the reader to draw his/her conclusion with respect to the differences between the Standish Group and the authors. I would, however, quote Jim Highsmith‘s deep insight on the value system within which we measure performance. The following excerpt is from Agile Project Management: Creating Innovative Products:
If we are ultimately to gain the full range of benefits of agile methods, if we are ultimately to grow truly agile, innovative organizations, then, as these stories show, we will have to alter our performance management systems…. We have to be as innovative with our measurement systems as we are with our development methodology.
See pp. 335-358 of Jim’s book for details on transforming performance management systems. His bottom line is deceptively simple:
The Standish data are NOT a good indicator of poor software development performance. However, they ARE an indicator of systemic failure of our planning and measurement processes.
Jim is referring to the standard definition of project “success” – on time, on budget, all specified features.
Later this month I will be working with a client to carry out the performance management ideas articulated by Jim. Jim indicated he has a customer engagement in February where he expects to learn about interesting ways in which the client is using the Agile Triangle (which is conceptually quite related to the fundamental question of what to measure). Client confidentiality permitting, I am confident we will soon be able to brief readers of The Agile Executive on our progress.