Originally published in the Naval Review, August 2017
In March 2017 Melissa Dahl penned an article for NYMag.com entitled 'Everything Always Takes Longer Than You Think'[i]. She highlights a body of work, notably by Daniel Kahneman and Amos Tversky, that reveals a cognitive bias known as the Planning Fallacy. Essentially, this bias leads us to be overly optimistic about how long tasks will take to complete. In fact, as Oliver Burkeman wrote in the Guardian in 2008[ii], the effect persists even when you are aware that the task you're embarking on will take longer than you think and adjust your estimate accordingly (a phenomenon known as Hofstadter's Law). Burkeman proposes that, counter-intuitively, we should do less detailed planning (since the effort is wasted in any case) and just get on with it. In the extreme he suggests that one should eschew planning altogether. This is probably fine for relatively simple, or at least non-complex, tasks such as sorting out our car insurance or writing a newspaper article, but it becomes inadequate when trying to manage a complex programme of work. But he is right insofar as the reductionist approach to managing complexity repeatedly fails to forecast completion dates accurately. The same applies to costs, by the way, which, in some circumstances, can be traded against time (or quality), as captured by that nifty internet meme: 'Good, fast, cheap. Pick any two.'
We can all think of plenty of examples: the Millennium Dome, Wembley Stadium, the London Olympics (on time and brilliant, but significantly over the original cost estimate[iii]), various large-scale IT programmes, the Sydney Opera House, the Scottish Parliament and, yes, a whole bunch of high-profile Defence acquisition programmes[iv].

Now I am specifically not blaming the managers of these programmes or pointing the finger at anyone. I am a systems thinker when it comes to the analysis of such problems and try hard to take a 'Black Box Thinking'[v] approach, as described by Matthew Syed in his superb book of that name. Few people in any profession go to work each day to do a bad job, and most are keen to do a great job. Where common themes emerge, as they do here, we must look not at the individuals but at the inherent nature of the systems in which they operate if we want to make things better. That is not to say that we should shy away from the brutal hard facts of failure,[vi] and we must look hard at what could be done better in both managing realistic expectations and optimising programme delivery. Satisfaction is to be found where the two coincide! I shall stick to an analysis of the former.

There are a number of factors at play here. Firstly, managing the work is not forecasting. Burkeman's exhortation not to plan is at odds with the famous, and oft-quoted, wisdom of the successful industrialist Sir John Harvey-Jones:
“Planning is an unnatural process; it is much more fun to do something. The nicest thing about not planning is that failure comes as a complete surprise, rather than being preceded by a period of worry and depression.”[vii]
Can we reconcile these two views? We can, if we realise that reliably forecasting a programme's outturn is a subtly different thing from managing the work necessary to deliver it. A reductionist approach is absolutely necessary to break down a large, complex task into manageable work packages and jobs that can be subcontracted out or allocated to different divisions of the workforce, down to individual workers. These tasks need to be coordinated, resourced and their interdependencies managed. Planning will help to achieve this, but the more complex the programme, the more fragile that plan will be on contact with reality. That is not to say that the generation of the plan is a futile activity; in fact (as in war) it is wholly necessary.
But the plan cannot be an indicator of the outcome, for several reasons. Firstly, the plan can contain only what is known, and what is known to be unknown. That is to say, we might understand that a certain task needs to be performed and have a good understanding of the time and cost of that activity. To borrow from Rumsfeldian logic, let us call this a known-known. A known-unknown would be something that we knew needed to be done but (perhaps because we were doing it for the first time) were unsure about the resource (time, cost, etc.) necessary to achieve it. Assuming that we can satisfactorily quantify this uncertainty, it can be classified as a risk and managed accordingly. The trouble, of course, is that the interaction between risks is a matter of complexity (more of which in a moment). What the plan completely fails to account for are the unknown-knowns and the unknown-unknowns. Unknown-knowns are those things that we don't know need to be done, and which will emerge over time, but, once they emerge, we have a reasonable idea of the resource (time and money) they will consume; the need to re-work some element of the programme, for instance, is something we have done before and can cost with reasonable confidence. Finally, there are the unknown-unknowns: those things that we did not expect to have to do and have little idea about the scope of. In both cases, it is not just that these things are unknown; in a complex programme they are, in fact, unknowable. This is not, therefore, incompetence or ineptitude on the part of those running the programme but rather an epistemological limit on the certainty with which the programme can be planned. Where there is a flaw, it is overconfidence in the reliability and completeness of the plan itself.
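To make the taxonomy concrete, here is a minimal sketch in Python, with invented task names and figures, of why a plan that can see only the known-knowns and known-unknowns yields a floor on the outturn rather than a forecast of it:

```python
# A toy 'plan' holding only what a plan can hold. All task names and
# durations are invented for illustration.

plan = {
    "known-knowns":   [("install combat system", 12.0)],      # weeks, well understood
    "known-unknowns": [("integrate new radar", (8.0, 20.0))], # weeks, quantified as a risk range
}
# Unknown-knowns and unknown-unknowns cannot appear here by definition,
# so any total derived from the plan is a lower bound on the outturn.

baseline = sum(weeks for _, weeks in plan["known-knowns"])
risk_low = sum(lo for _, (lo, hi) in plan["known-unknowns"])
risk_high = sum(hi for _, (lo, hi) in plan["known-unknowns"])

print(f"Planned duration: {baseline + risk_low:.0f} to {baseline + risk_high:.0f} weeks (a floor, not a forecast)")
```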
The unpredictability of complex systems (including such things as complex programmes) is dealt with by the incomparable Nassim Nicholas Taleb in his various books. In the prologue to Antifragile[viii], he observes that:
“Complex systems are full of interdependencies – hard to detect – and nonlinear responses … Man-made complex systems tend to develop cascades and runaway chains of reactions that decrease, even eliminate, predictability and cause outsized events.”
He goes on, in a later chapter, to apply this specifically to the business of programme management, observing that there is an “obvious asymmetry” in projects, especially ones that are IT-heavy, where uncertainty drives the programme in only one direction, i.e. towards increased time and cost: “So, on a timeline going from left to right, errors add to the right end, not the left end of it[ix].”
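Taleb's asymmetry is easy to reproduce with a toy Monte Carlo simulation. The sketch below assumes, purely for illustration, a programme of 20 tasks each planned at 10 weeks, with each task's actual duration drawn from a lognormal distribution whose median equals the plan, so every task is as likely to beat its estimate as to miss it, but can overrun by far more than it can underrun:

```python
# A minimal Monte Carlo sketch of right-skewed programme outturn.
# Task count, planned durations and the distribution are illustrative assumptions.
import random
import statistics
from math import log

random.seed(42)
N_TASKS, N_RUNS, PLANNED = 20, 10_000, 10.0  # 20 tasks x 10 weeks = 200-week plan

totals = []
for _ in range(N_RUNS):
    # Each task: lognormal with median equal to its 10-week plan, so it
    # underruns a little or overruns a lot; the errors pile up on the right.
    totals.append(sum(random.lognormvariate(log(PLANNED), 0.5) for _ in range(N_TASKS)))

plan_total = N_TASKS * PLANNED
print(f"Planned total : {plan_total:.0f} weeks")
print(f"Median outturn: {statistics.median(totals):.0f} weeks")
print(f"Runs on time  : {100 * sum(t <= plan_total for t in totals) / N_RUNS:.0f}%")
```

Even though each individual task beats its estimate half of the time, the long right tail means that, under these assumptions, the programme as a whole finishes late in roughly five runs out of six; the errors have all added to the right end of the timeline.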
Obviously, it is much easier to forecast how long something will take if you're not doing it for the first time. You will have a much clearer understanding of the length of time it will take to build the tenth of a class of ships, for example, as opposed to the first, which is in essence a prototype, simply because there is a great deal less uncertainty.
There is also the challenge of resolution. As an analogy, if we measure the coastline of an island using a relatively small-scale map, we are likely to underestimate the actual distance we would need to walk to circumnavigate that island. Why? Because when measuring on the map we smooth out the meanderings of the coastal path, so that when we measure by walking, at the actual resolution of our stride length, we discover that more steps are required than we estimated from the map. The more complicated the coastline, the greater our error will be. And so it is with trying to predict the outcome of a complex programme from the plan: it will always take more steps than you estimated to navigate the intricacies of reality that were smoothed out in the planning phase.
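The coastline effect is easy to demonstrate numerically. The sketch below uses an invented 'coastline', a sine curve standing in for a wiggly coastal path, and measures it with successively shorter strides; the measured length grows as the stride shrinks, because finer steps pick up detail that coarser measurement smoothed away:

```python
# Measuring a hypothetical wiggly 'coastline' y = 0.05*sin(60x) over [0, 1]
# at different stride lengths. The curve is invented for illustration.
from math import sin, dist

def measured_length(stride: float) -> float:
    xs = [i * stride for i in range(int(1 / stride) + 1)]
    points = [(x, 0.05 * sin(60 * x)) for x in xs]
    # Sum straight-line hops between consecutive sample points.
    return sum(dist(a, b) for a, b in zip(points, points[1:]))

for stride in (0.25, 0.05, 0.01, 0.002):
    print(f"stride {stride:>6}: measured length {measured_length(stride):.2f}")
```

The shorter the stride, the longer the answer; the map, like the plan, is not wrong, it simply cannot contain detail below its own resolution.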
So, if there are immutable limits on our ability to predict, in the face of uncertainty, from a reductionist analysis of our programme of interest, are there better methods of forecasting?

Philip Tetlock has spent a long career researching the art and science of prediction and has written about it in his excellent book Superforecasting (co-authored with Dan Gardner). His research shows that people who are good at forecasting tend to use a common set of techniques:
- Unpack the question into components.
- Distinguish as sharply as you can between the known and the unknown and leave no assumption unscrutinised.
- Adopt the outside view and put the problem into a comparative perspective that downplays its uniqueness and treats it as a special case of a wider class of phenomena.
- Then adopt the inside view that plays up the uniqueness of the problem.
- Also, explore the similarities and differences between your views and those of others – and pay special attention to prediction markets and other methods of extracting wisdom from crowds.
- Synthesise all these different views into a single vision.
- Finally, express your judgement as precisely as you can, using a finely grained scale of probability[x].
Where many programme forecasts seem to come unstuck is in the inability to take the 'outside view'. This means looking at other comparable programmes and seeing how similar or different your particular programme is to them. Say you were initiating a programme to develop and manufacture a brand-new model of airliner. A good outside view would be to look at previous such programmes (across the industry) and see how far out their original estimates of time and cost were. If they were all, say, underestimated by between 50% and 100% at the start of the programme, it would be sensible to add this margin to your own estimate from inside the programme. The staging of Olympic Games is another intriguing example. Games overrun their cost estimates with 100% consistency, by a real-terms average of 179%[xi]. So why don't Olympic Games organisers simply multiply their original estimate by a factor of about 2.8 (to allow for that average 179% overrun) when submitting their bids? Well the answer, surely, is that they want to win the bid! And by "they" I mean everyone involved, including those that provide the money, which is, of course, predominantly the Government, whose money comes from the taxpayer (who also wants the bid to be won, but might not have realised that they are collectively underwriting the cost).

Actually, for all public-sector programmes, and many large corporations, there is a theme here which Taleb describes as the 'Agency Problem'[xii]. This means that the person making the decision is not the 'owner' and thus isn't actually on the hook to pay the cost of any hidden risk or expense that might become apparent at a later date. It is much easier to write cheques that other people will have to cash. That is not to say, of course, that winning the bid to host the Olympic Games or embarking on a major change or acquisition programme isn't the right thing to do, but we have a choice about how optimistic or realistic we wish to be about the costs involved. And a true forecast of costs, derived from Tetlock's advice, might seem unaffordable. Such unaffordability will inevitably be challenged, especially when the reductionist method can be used to gloss over the inherent uncertainty, smooth out the meandering path and ignore the heuristics derived from relevant past experience. The programme thus gets the green light, but there is an inevitable and unpleasant surprise around the corner.
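Mechanically, the outside view is trivial to apply; the obstacles are incentives, not arithmetic. A minimal sketch, using invented reference-class figures (see Flyvbjerg and Stewart[xi] for the real Olympic data):

```python
# Reference-class adjustment of an inside-view estimate. All figures
# here are hypothetical; only the method is the point.
import statistics

inside_view_estimate = 9.3  # bottom-up cost estimate, in £bn (invented)

# Overruns observed on comparable past programmes, as a fraction of the
# original estimate (1.79 would mean a 179% overrun).
reference_overruns = [0.50, 0.90, 1.01, 1.20, 1.79, 2.40]

uplift = 1 + statistics.median(reference_overruns)
print(f"Inside view : £{inside_view_estimate:.1f}bn")
print(f"Outside view: £{inside_view_estimate * uplift:.1f}bn "
      f"(median historical overrun {statistics.median(reference_overruns):.0%})")
```

A two-line calculation more than doubles the bid; small wonder, then, that it rarely survives the approvals process.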
Once such surprises manifest, it is convenient to blame the contractors or the teams running the programme, as VAdm Bob Cooling did recently in The News[xiii], Portsmouth's local newspaper. Admiral Cooling identified specific problems in time, cost and performance in Defence acquisition programmes, and noted the similarities between currently running programmes and past ones. What the Admiral didn't describe is that it is the failure of forecasting, as a result of the inherent and unaccounted-for uncertainty in complex programmes, that lies at the heart of these delays. That they take longer or cost more than expected is largely a result of unrealistic expectations rather than of ineptitude or fecklessness on the part of those delivering the programme.
So, what is to be done? Well, assuming that you genuinely want to stop embarking on projects which are outside your budget (be it time or cost), an incentive structure needs to be created that avoids the Agency Problem: one in which decision makers have genuine 'skin in the game' and stand to lose if the decisions they make turn out to have hidden costs that were not accounted for at the time (in the form of uncertainty or risk). If you're taking significant personal risk, you will allow for a greater margin of error! The problem here is that the organisation is, effectively, contracting out the risk to the individual, who would face sanctions if it were to manifest. Such a system has been put in place for financial institutions where, following the 2007/8 crisis:
“Under section 36 of the UK Financial Services (Banking Reform) Act 2013, it is a criminal offence for a senior manager in a financial institution to make a decision that causes that institution, or any other financial institution which is a member of the same group, to fail[xiv].”
This offence can be punished with up to seven years in prison or an unlimited fine. It would be rational for individuals charged with such responsibility to be paid to carry it, and in financial institutions they are. For Government and other industry sectors, contracting out the risk of cost or programme overruns to individuals or companies would be significantly more expensive (at least up front) than 'The Crown' or shareholders carrying the risk. But some system of delayed incentivisation could, potentially, be devised such that those who approve a programme's time/cost/performance parameters are held accountable for the final outturn. For programmes that run over decades this may be completely unworkable and, in any case, I detect little appetite for it in Government Departments, so we may simply have to live with the consequences.
In conclusion, then: I set out to explain 'why everything takes longer than you think'. There are cognitive biases at play here, notably the Planning Fallacy, but it is the inherently uncertain nature of complex programmes that makes them so hard to predict. And given that only certain kinds of information about a programme can be captured and assessed for risk, the effect as other information is revealed will (virtually) always be to increase the time and/or cost of the programme, or to decrease the quality of the product. Methods do exist, however, that would make our predictions more accurate, but the unwelcome increase in time and cost exposed by such forecasts, and the fact that they are not underpinned by 'known' information, means that they are largely unused in circumstances where the Agency Problem is prevalent. Changing the incentive structure so that decision makers at a programme's initiation (and at key milestones along the way) are held tangibly accountable for the programme's outturn would improve such decision making, but at an upfront cost: why would anyone rationally accept having such skin in the game for free? It seems to me, therefore, that 'things taking longer than you think' is just something that we will need to learn to live with; a problem to be managed, not a puzzle to be solved. But at least you now know why.
 Please read it, it is genuinely brilliant.
“[Failure] can be thought of as the gap between what we hoped would happen, and what actually did happen.”[vi]
Actually, the difference between a known-known and a known-unknown is a continuum rather than a binary scale, but it works for the purposes of illustrating the point – high certainty vs low certainty.
 Fooled by Randomness, The Black Swan and Antifragile.
 London 2012 was better than average with only a 101% cost overrun in real terms!
Which, as we saw above, are tradeable against each other.
 Especially complex programmes.
[i] Dahl, M. (2017). Everything Always Takes Longer Than You Think. [online] Science of Us. Available at: http://nymag.com/scienceofus/2017/03/why-everything-always-takes-longer-than-you-think.html?mid=full-rss-scienceofus [Accessed 24 Apr. 2017].
[ii] Burkeman, O. (2008). Oliver Burkeman on why everything takes longer than you think. [online] The Guardian. Available at: https://www.theguardian.com/lifeandstyle/2008/aug/02/healthandwellbeing.psychology [Accessed 24 Apr. 2017].
[iii] New Statesman. (2012). London Olympics exceed initial budget by £6.52bn. [online] Available at: http://www.newstatesman.com/economics-blog/2012/10/london-olympics-exceed-initial-budget-652bn [Accessed 24 Apr. 2017].
[iv] Harding, T. (2017). Nimrod destruction cost taxpayer £3.4bn as MoD ignored ‘cost implications’, MPs say. [online] Telegraph.co.uk. Available at: http://www.telegraph.co.uk/news/uknews/defence/9072073/Nimrod-destruction-cost-taxpayer-3.4bn-as-MoD-ignored-cost-implications-MPs-say.html [Accessed 24 Apr. 2017].
[v] Syed, M. 2015. Black Box Thinking, London: John Murray (Publishers).
[vi] Ibid., p.55.
[viii] Taleb, N.N. 2013. Antifragile: Things That Gain From Disorder, London: Penguin Books. p.7
[ix] Ibid., p.285.
[x] Tetlock, P. and Gardner, D. 2016. Superforecasting: The Art and Science of Prediction, London: Random House Books. p.153.
[xi] Flyvbjerg, B. and Stewart, A. (2012). Olympic Proportions: Cost and Cost Overrun at the Olympics 1960-2012. [ebook] Oxford: Saïd Business School, University of Oxford, p.3. Available at: http://eureka.sbs.ox.ac.uk/4943/1/SSRN-id2382612_(2).pdf [Accessed 24 Apr. 2017].
[xii] Taleb, op. cit., p.430.
[xiii] Portsmouth.co.uk. (2017). Royal Navy admiral blasts delays in delivering top military projects. [online] Available at: http://www.portsmouth.co.uk/our-region/portsmouth/royal-navy-admiral-blasts-delays-in-delivering-top-military-projects-1-7919271 [Accessed 24 Apr. 2017].
[xiv] Hogan Lovells. (2017). Criminal liability for bank directors? A look at the United Kingdom and South Africa. [online] Available at: https://www.hoganlovells.com/en/publications/criminal-liability-for-bank-directors-a-look-at-the-united-kingdom-and-south-africa [Accessed 24 Apr. 2017].