Adapt – Why success always starts with failure
By Tim Harford
(Summarised by Paul Arnold – Trainer & Facilitator – firstname.lastname@example.org)
Content = **** Readability = ** Clarity & Structure = **
THE BOOK IN A NUTSHELL
In the increasingly complex world we live in, constant adaptation (accepting failure as part of the learning curve) is the most effective way of solving complex problems. The book applies the Darwinian principles of evolution – variation and survival of the fittest – to how markets develop. Our attempts at long-term market planning should therefore be replaced by constant testing of variations (and quickly adapting accordingly).
1) We live in an increasingly complex world (which is difficult to control).
-In NY or London you can buy over 10 billion distinct products. There are over 100,000 distinct products for sale in a Wal-Mart. Starbucks claims to sell 87,000 different beverages.
-It is beyond the ability of any one person to control even small things.
-In complex systems, there are a vast multitude of variables (all inter-related) that can create millions of different outcomes that are in a constant state of flux.
-This makes it increasingly difficult to successfully make detailed plans for the future (tomorrow ≠ yesterday). Instead, we need to build in speed and flexibility of response.
-In the late 1920s, Peter Palchinsky, a Russian engineer, was sent to study the proposed Lenin Dam and the steel mills of Magnitogorsk, both of which he criticised. His principles for the development of complex projects were:
1) Try new things (and expect that some will fail)
2) Test them on a scale where failure is survivable
3) Seek feedback and adapt accordingly
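Palchinsky's three principles amount to a simple adaptive search loop. The sketch below is my own illustration (not from the book), using a toy numeric objective and hypothetical parameter names, of how "try variations, keep failures cheap, adapt to feedback" looks as a procedure:

```python
import random

def adapt(start, score, variations=10, rounds=50, step=0.5):
    """Trial-and-error search in the spirit of Palchinsky's principles."""
    best, best_score = start, score(start)
    for _ in range(rounds):
        # 1) Try new things: generate small variations on the current best.
        trials = [best + random.uniform(-step, step) for _ in range(variations)]
        # 2) Keep failure survivable: each trial is small and cheap, so a
        #    bad variant costs almost nothing.
        # 3) Seek feedback and adapt: keep a variant only if it scores better.
        for t in trials:
            s = score(t)
            if s > best_score:
                best, best_score = t, s
    return best

# Toy objective (hypothetical): a "market" that rewards being near x = 3.
peak = adapt(0.0, lambda x: -(x - 3) ** 2)
```

The point of the sketch is that no step requires a grand plan or an expert forecast – only cheap variation and honest feedback.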
-In such complex markets we come to over-rely on experts, living with the false fantasy that there is a simple solution to complex problems, or that a ‘white knight’ can somehow solve them all (part of Barack Obama’s problems have stemmed from misplaced expectations of what one man can do).
-Research by Tetlock into ‘experts’ has revealed that experts perform only slightly better than average at predictions. A deeper level of knowledge in an area did not make them better at predicting (but did make them more confident and persuasive in their predictions). Thus our over-reliance on experts to simplify and make sense of the complex world may be misplaced.
-Ironically, complexity calls not for grand, expert-designed strategies but for a simple process of trying out lots of different variants.
-There is a dangerous myth of control – not only that one can control everything but also that one must control everything. Complex systems cannot be controlled – instead one needs speed and flexibility of response to meet a dynamic environment.
-One issue with centralised control is that it often leads to irrelevant operational strategies, for a number of reasons:
1) The information is untimely (it takes too long to get up to head office, too long to interpret and too long to send back out again – by which time the local situation has often moved on).
2) The information is too macro – taking local information up to macro level often loses the detail, so the resulting strategies are not relevant.
-Instead we need to divorce objective-setting (head office) from strategy (localised). Timpson, the shoe repairers, have empowered front-line staff by devolving all major decisions to shop level (including stock, display and customer warranties up to £500).
-Another problem is that all the data surrounding those in head office can create an illusion of total knowledge. This again leads to over-confidence in one’s decision-making, and can lead to the suppression of dissenters and to ignoring or discounting information that does not fit one’s model of reality (cf. Robert McNamara, the failed architect of the Vietnam War, who relied heavily on computerised data-gathering). We therefore need to encourage dissenters in an organisation, as they bring with them a variety of perspectives and solutions.
-Charles Perrow, emeritus professor of sociology at Yale University, studied many major man-made disasters (e.g. the Piper Alpha oil rig disaster of 1988, in which 167 people died). He concluded that two key factors underpin most major disasters:
1) They occur in highly complex systems (where many things can go wrong) and
2) The systems are tightly coupled (a bit like a domino effect, where an event in one part of the system triggers other parts, making its impact spread rapidly and hence be very difficult to stop). The economic meltdown of 2008 obeyed both principles.
-The natural reaction is to build in as many control systems as possible. The issue is that in complex systems these tend not to help. As previously mentioned, trying to control complex systems leads to unexpected consequences, often making matters worse. In the Piper Alpha disaster, for example, control systems to stop people working long shifts, and a safety device to prevent divers being sucked into the seawater pumps, compounded the problems. Safety devices thus add to the complexity of a system (and create more opportunities for things to go wrong).
-Another side effect of such control systems is that they lull you into a false sense of security, which can lead to increased risk-taking (e.g. the economic crisis was triggered by sub-prime mortgages, where spreading the risk encouraged lenders to take on ever greater numbers of such mortgages).
2) Trial & Error and Variation are key to thriving and surviving
-In business (as in our personal life), trial and error is one of the most effective and common ways that we learn and grow.
-Indeed, Darwin’s principles of evolution are built on the premise that different varieties are created, some of which survive because they are better suited to their environment (and when the environment changes, different varieties thrive).
-Variety drives speed of market evolution as one company builds on the advances made by others.
-Open technology allowed the iPhone to become the most important high-end phone on the market, as it sparked a wide variety of app development. Likewise, ‘crowdsourcing’ is driving creative exploration. Whilst most variants will fail, the cost of those failures is dwarfed by the benefits that a successful variation brings.
-Certain countries have focused on key sectors and then encouraged lots of local activity in that area, as internal market competition has driven their expertise and world competitiveness – e.g. Chile’s focus on salmon farming and Taiwan’s on orchids.
-Other companies have encouraged internal variation via the 20% rule (where employees can spend 20% of their time working on anything they feel would be useful). Over half of Google’s new products have come from these 20% projects, including Google News, Google Suggest, AdSense and the social networking site Orkut. These projects have also produced many failures, but Google sees that as a critical part of new product evolution (it expects an 80% failure rate).
-The trouble is that innovation is becoming more complex, more costly and more time-consuming. This means many companies are instead focusing their R&D on just a few project areas, which requires a decision-filtering process that usually includes ROI and risk assessment. This filters out variety and the riskier projects.
-The National Institutes of Health (NIH) commissions many US scientific projects. Due to government funding, its expert-led, results-oriented task force ends up backing risk-averse projects. In comparison, the Howard Hughes Medical Institute urges scientists to take risks (and backs people rather than specific projects, allowing them to redirect their focus as new information comes to light). When the two institutions are compared, HHMI outperforms the NIH on almost every criterion. So by taking risks and accepting failure, it achieves greater success than a more risk-averse strategy: the bigger wins cover the increased risk of failure.
-InnoCentive is an open innovation network that encourages parallel development of many solutions to a defined problem. The Bill & Melinda Gates Foundation has offered a $1.5bn reward for the development of effective vaccines for pneumonia, meningitis and bronchitis, because there is not a strong enough market-based reward for the major drug companies to focus their energies on this area.
3) Fear of failure
A critical part of trial and error is the fact that we will often fail. However, we learn from these failures, adapt and slowly get better.
When one looks at highly successful people, one often finds they have spent over 10 years trying (and failing) their way to excellence.
-Failure has been a constant throughout business. Just two years after Peters & Waterman released their seminal book ‘In Search of Excellence’, almost a third of the cited exemplars were in serious financial trouble. Only 1% of the 2,000 US car manufacturers have survived. Today 10% of American companies fold each year, and 90% of all NPD projects fail. The IT industry, the most successful of the past 40 years, has been built on failure after failure. Failure is thus part of market evolution.
-The issue is that many corporate cultures refuse to accept failure as a part of growth (cf. Enron). Their financial objectives and generally risk-averse management style do not embrace the concept of failure. Hence the growth of ‘skunk works’ (a term first used at Lockheed), where a small unit operates outside the normal culture of the organisation.
-One of the problems with failure (as shown by the behavioural economists Kahneman and Tversky) is that fear of loss can trigger poor decision-making (“a person who has not made peace with his losses is likely to accept gambles that would be unacceptable to him otherwise”).
4) Resisting adapting
-Feedback is key to development. Without it, one would not know how to improve (yet in management, feedback is often poorly timed – how could a golfer improve his swing if he only got feedback 12 months later?).
-The issue is, even with feedback, people tend to resist change (both consciously and unconsciously).
-As technologies change, the old dominant companies find it hard to adapt. They tend to discount the importance of the new technology based on consumer feedback and market projections. But the small visionary start-ups who produce the embryonic versions quickly learn from each other, and when the market is ready they capitalise on it, leaving the major manufacturers desperately trying to catch up. Often the incumbents fail because their deep expertise in one field is no longer applicable in the new one.
-As complexity and the speed of change continue to grow, organisations (and we as individuals) need to ‘read the signs’ more quickly and adapt swiftly in order to survive.
I found this book a difficult (but rewarding) read. It is repetitive, disjointed in parts, and its logic flow is hard to follow. It does use some engaging stories (which for brevity I have had to cut out) and makes a very salient point about trial and error as a successful route to growth.
Ironically, Harford’s emphasis on complexity somewhat undermines his own thesis. He tends to be too black and white about many of his examples, when the reality is often much more complex and less clear-cut. Likewise, is he not arguing against himself in implying that simply adopting a trial-and-error model can right many of the ills of the past?