The Agile Executive

Making Agile Work

The Changing Nature of Innovation: Part I — New Forms of Experimentation

with 4 comments

Colleague Christian Sarkar drew my attention to two recent Harvard Business Review (HBR) articles that shed light on the way(s) innovation is being approached nowadays. To the best of my knowledge, neither of the two articles was written by an author associated with the Agile movement. Both, if you ask me, would have resonated big time with the authors of the Agile Manifesto.

The February 2009 HBR article How to Design Smart Business Experiments focuses on data-driven decisions as distinct from decisions taken based on “intuition”:

Every day, managers in your organization take steps to implement new ideas without having any real evidence to back them up. They fiddle with offerings, try out distribution approaches, and alter how work gets done, usually acting on little more than gut feel or seeming common sense—”I’ll bet this” or “I think that.” Even more disturbing, some wrap their decisions in the language of science, creating an illusion of evidence. Their so-called experiments aren’t worthy of the name, because they lack investigative rigor. It’s likely that the resulting guesses will be wrong and, worst of all, that very little will have been learned in the process.

It doesn’t have to be this way. Thanks to new, broadly available software and given some straightforward investments to build capabilities, managers can now base consequential decisions on scientifically valid experiments. Of course, the scientific method is not new, nor is its application in business. The R&D centers of firms ranging from biscuit bakers to drug makers have always relied on it, as have direct-mail marketers tracking response rates to different permutations of their pitches. To apply it outside such settings, however, has until recently been a major undertaking. Any foray into the randomized testing of management ideas—that is, the random assignment of subjects to test and control groups—meant employing or engaging a PhD in statistics or perhaps a “design of experiments” expert (sometimes seen in advanced TQM programs). Now, a quantitatively trained MBA can oversee the process, assisted by software that will help determine what kind of samples are necessary, which sites to use for testing and controls, and whether any changes resulting from experiments are statistically significant.
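The sample-size calculation the article says such software now handles can be sketched with nothing more than Python's standard library. The function name and the illustrative response rates below are my own assumptions, not anything from the HBR piece; this is the textbook normal-approximation formula for comparing two proportions, not any specific product's method.

```python
from math import ceil
from statistics import NormalDist

def required_sample_size(p_control, p_variant, alpha=0.05, power=0.80):
    """Per-group sample size needed to detect the difference between two
    response rates with a two-sided z-test (normal approximation)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)            # ~0.84 for power = 0.80
    variance = p_control * (1 - p_control) + p_variant * (1 - p_variant)
    effect = abs(p_variant - p_control)
    return ceil((z_alpha + z_beta) ** 2 * variance / effect ** 2)

# Detecting a lift from a 10% to a 12% response rate takes roughly
# 3,800 subjects in each of the test and control groups.
n = required_sample_size(0.10, 0.12)
```

The point the article makes falls out of the arithmetic: the calculation itself is routine once encoded, which is exactly why it no longer takes a PhD statistician to oversee it.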

On the heels of this essay on how one could attain and utilize experimentally validated data, the October 2009 HBR article How GE is Disrupting Itself discusses what is already happening in the form of Reverse Innovation:

  • The model that GE and other industrial manufacturers have followed for decades – developing high-end products at home and adapting them for other markets around the world – won’t suffice as growth slows in rich nations.
  • To tap opportunities in emerging markets and pioneer value segments in wealthy countries, companies must learn reverse innovation: developing products in countries like China and India and then distributing them globally.
  • While multinationals need both approaches, there are deep conflicts between the two. But those conflicts can be overcome.
  • If GE doesn’t master reverse innovation, the emerging giants could destroy the company.

It does not really matter whether you are a “shoestring and a prayer” start-up spending $500 on A/B testing through Web 2.0 technology or a Fortune 500 company investing $1B in the development and introduction of a new car in rural India in order to “pioneer value segments in wealthy countries.” Either way, your experimentation is affordable in the context of the end-result you have in mind.

Fast forward to Agile methods. Chunking work into two-week segments makes experimentation affordable – you cancel an unsuccessful iteration as needed and move on to the next one. Furthermore, you can make the go/no-go decision with respect to an iteration based on statistically significant “real time” user response. This closed-loop operational nimbleness and affordability, in conjunction with a mindset that treats a “failure” of an iteration as a valuable lesson to learn from, facilitate experimentation. Innovation simply follows.
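A minimal sketch of such a go/no-go check, again using only Python's standard library: a two-proportion z-test comparing the new iteration against the previous one. The function name and the conversion numbers are illustrative assumptions, not data from any real experiment.

```python
from math import sqrt
from statistics import NormalDist

def go_no_go(conv_a, n_a, conv_b, n_b, alpha=0.05):
    """Two-proportion z-test: is variant B's response rate significantly
    different from control A's?  Returns (z, p_value, significant)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value, p_value < alpha

# 4,000 users saw each variant; the new iteration converted 6.5% vs 5.0%.
z, p, go = go_no_go(200, 4000, 260, 4000)
```

If `go` comes back False, the iteration is cancelled and the lesson is banked; if True, the feature ships. Either way the decision rests on user response rather than gut feel.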

4 Responses


  1. In essence, the world is speeding up (…particularly as noted by you for software development). However, the human ability to digest, analyze and apply that increase in data is questionable, as noted by HBR’s observations re: the typical decision-making process of managers. If anything, I believe business decision making is getting sloppier instead of more factually oriented. With increased government oversight into business proceedings, increased regulation and compliance requirements, and the tendency to emphasize short-term gains over long-term objectives, the longer-term outlook for effective business decisions is disquieting. In our effort to speed up, it seems we are becoming more short-sighted …and hence riskier in business dealings.

    Bill Keyworth

    November 28, 2009 at 2:28 pm

    • I consider greed part of human nature (myself included). What I am mystified by is the accentuation of short term greed. We have a ton of rigorous research by Carlota Perez on the folly of pursuing immediate returns. It seems like we are unable to learn from history…

      November 29, 2009 at 8:19 pm

  2. The interesting dichotomy in the greed discussion is the apparent short term emphasis that is driving faulty long term business decisions at the precise time that we most need objective sanity, experienced perspective and analytic discipline. ClimateGate is yet another illustration of short term myopia at the expense of longer term economics. Are we expecting IT to buck this global trend towards the end (business profits) to justify the means (short term IT investment mentality)?

    Bill Keyworth

    November 30, 2009 at 1:14 pm

    • The rules of physics apply. To be more precise, the rules of software engineering apply. As observed by Boehm, defects fixed during the Operations phase can be two orders of magnitude more expensive to fix than during the Requirements phase, and one order of magnitude more expensive to fix than during the Code phase. Ignoring these findings is like the quip about economics: “You might ignore economics, but it would not ignore you…”

      November 30, 2009 at 11:48 pm
