The Week

Why we fail to prepare for disasters


Catastrophes such as the coronavirus are all too predictable. What makes us do nothing in the face of danger, asks Tim Harford

You can’t say that nobody saw it coming. For years, people had warned that New Orleans was vulnerable. The Houston Chronicle reported that 250,000 people would be stranded if a major hurricane struck, with the low-lying city left 20ft underwater. The levees were known to be inadequate. In 2004, National Geographic vividly described a scenario in which 50,000 people drowned. The Red Cross feared a similar death toll. Even Fema, the Federal Emergency Management Agency, was alert: in 2001, it had stated that a hurricane hitting New Orleans was one of the three likeliest catastrophes facing the USA.

Now the disaster scenario was becoming a reality. A 140mph hurricane was heading right for the city. More than a million residents were warned to evacuate. USA Today warned of “a modern Atlantis”, explaining that the hurricane “could overwhelm New Orleans with up to 20ft of filthy, chemical-polluted water”. The city’s mayor, Ray Nagin, begged people to get away, but more than 100,000 people had no cars and no way of leaving. The roads out were jammed. Thousands of visitors were stranded; the airport had been closed. There were no emergency shelters. Nagin mooted using a local stadium, the Louisiana Superdome, as a temporary refuge – but he was warned that it wasn’t equipped to be a shelter.

But then, the storm turned aside. It was September 2004, and New Orleans had been spared. Hurricane Ivan had provided the city, and the nation, with a vivid warning. It had demonstrated the need to prepare, urgently and on a dozen different fronts, for the next hurricane. “In early 2005, emergency officials were under no illusions about the risks New Orleans faced,” explain Howard Kunreuther and Robert Meyer in their book The Ostrich Paradox.

But they did not act swiftly or decisively enough. Eleven months later, Hurricane Katrina drowned the city – and many hundreds of its residents. As predicted, citizens were unable or unwilling to leave; levees were breached in more than 50 places; the Superdome proved an inadequate shelter.

Surely, with such a clear warning, New Orleans should have been better prepared for Hurricane Katrina? It’s easily said. But as the new coronavirus sweeps the globe, killing thousands more people every day, we are now realising that New Orleans is not the only place that did not prepare for a predictable catastrophe. In 2003, the Harvard Business Review published “Predictable Surprises: The Disasters You Should Have Seen Coming”. The authors, Max Bazerman and Michael Watkins, argued that while the world is an unpredictable place, unpredictability is often not the problem. The problem is that when faced with clear risks, we still fail to act. Back in 2002, Watkins ran a pandemic response exercise at Harvard which is eerily prescient: it posits a shortage of masks; a scramble for social distance; leaders succumbing to the illness. We’ve been thinking about pandemics for a long time.

Other warnings have been more prominent. In 2015, Bill Gates gave a TED talk called “The next outbreak? We’re not ready”; 2.5 million people had watched it by late 2019. In 2018, the science journalist Ed Yong wrote a piece in The Atlantic titled “The Next Plague Is Coming. Is America Ready?” The World Health Organisation and the World Bank convened the Global Preparedness Monitoring Board, which published a report in 2019 warning of “a cycle of panic and neglect” and calling for better preparation for “managing the fallout of a high-impact respiratory pathogen”. It noted that a pandemic akin to the flu in 1918 “would cost the modern economy $3trn”.

“Sars in 2003, H5N1 in 2006, Ebola in 2013, Mers in 2015. Each outbreak sparked alarm, followed by a collective shrug of the shoulders”

Alongside these authoritative warnings were the near misses, the direct parallels to Hurricane Ivan: Sars in 2003; two dangerous influenza epidemics, H5N1 in 2006 and H1N1 in 2009; Ebola in 2013; and Mers in 2015. Each deadly outbreak sparked brief and justifiable alarm, followed by a collective shrug of the shoulders. On most fronts, we were caught unprepared. Why?

Wilful blindness is not confined to those in power. The rest of us struggled to grasp what was happening as quickly as we should. I include myself. In mid-February, I interviewed an epidemiologist, Dr Nathalie MacDermott of King’s College London, who said it would likely prove impossible to contain the new coronavirus, in which case it might well infect more than half the world’s population. Her best guess of the fatality rate at the time was a little under 1%. I nodded, believed her, did the maths in my head – 50 million dead – and went about my business. I did not sell my shares. I did not buy masks. I didn’t even stock up on spaghetti. The step between recognising the problem and taking action was simply too great.

Psychologists describe this inaction in the face of danger as normalcy bias or negative panic. In the face of catastrophe, from the destruction of Pompeii in AD79 to the 9/11 attacks, people have often been slow to recognise the danger and confused about how to respond. So they do nothing, until it is too late. Part of the problem may simply be that we get our cues from others. In a famous experiment conducted in the late 1960s, the psychologists Bibb Latané and John Darley pumped smoke into a room in which their subjects were filling in a questionnaire. When the subject was sitting alone, he or she tended to note the smoke and calmly leave to report it. When subjects were in a group of three, they were much less likely to react: each person remained passive, reassured by the passivity of the others.

As the coronavirus spread, social cues influenced our behaviour in a similar way. Harrowing reports from China made little impact, even when it became clear that the virus had gone global. We could see the metaphorical smoke pouring out of the ventilation shaft, and yet we could also see our fellow citizens acting as though nothing was wrong: no stockpiling, no self-distancing, no Wuhan-shake greetings. Then, when the social cues finally came, we all changed our behaviour at once. At that moment, not a roll of toilet paper was to be found.

Normalcy bias and the herd instinct are not the only cognitive shortcuts that lead us astray. Another is optimism bias. Psychologists have known for half a century that people tend to be unreasonably optimistic about their chances of being the victim of a crime, a car accident or a disease. Robert Meyer’s research, set out in The Ostrich Paradox, shows this effect in action as Hurricane Sandy loomed in 2012. Coastal residents were well aware of the risks of the storm; they expected even more damage than meteorologists did. But they were relaxed, confident that it would be other people who suffered. Such egotistical optimism is particularly pernicious in the case of an infectious disease. A world full of people with the same instinct is a world full of disease vectors.

A fourth problem is what we might call exponential myopia. We find exponential growth counterintuitive to the point of being baffling – we tend to think of it as a shorthand for “fast”. An epidemic that doubles in size every three days will turn one case into a thousand within a month – and into a million within two months if the growth does not slow. No wonder we find ourselves overtaken by events.
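The doubling arithmetic above can be checked in a few lines of Python (a sketch; the three-day doubling time is the article’s illustrative figure, not epidemiological data, and the function name is invented for this example):

```python
def cases_after(days, doubling_time=3, initial_cases=1):
    """Cases after `days` of uninterrupted exponential growth,
    counting only completed doubling periods."""
    return initial_cases * 2 ** (days // doubling_time)

# "a thousand within a month": 10 doublings in 30 days
print(cases_after(30))   # 1024
# "a million within two months": 20 doublings in 60 days
print(cases_after(60))   # 1048576
```

Ten doublings multiply a quantity by roughly a thousand, and twenty by roughly a million – which is why growth that looks modest for weeks can overwhelm a health system in days.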

Finally, there’s our seemingly limitless capacity for wishful thinking. In a complex world, we are surrounded by contradictory clues and differing opinions. We can and do seize upon whatever happens to support the conclusions we wish to reach – whether it’s that the virus is being spread by 5G networks, is a hoax dreamed up by political opponents, or is no worse than the flu. Both Robert Meyer and Michael Watkins made an observation that surprised me: previous near misses such as Sars or Hurricane Ivan don’t necessarily help citizens prepare. It is all too easy for us to draw the wrong lesson, which is that the authorities have it under control. We were fine before and we’ll be fine this time.

The true failure, however, surely lies with our leaders. We are humble folk, minding our own business; their business should be safeguarding our welfare, advised by experts. You or I could hardly be expected to read the Global Preparedness Monitoring Board report, and if we did, it is not clear what action we could take. Surely every government should have someone who is paying attention to such things? But the same mental failings that blind us to risks can do the same to our leaders. While politicians have access to the best advice, they may not feel obliged to take experts seriously. Powerful people, after all, feel sheltered from everyday concerns.

This sense of distance between the powerful and the problem shaped the awful response to Hurricane Katrina. Leaked emails show the reaction of Michael Brown, then the director of Fema. One subordinate wrote: “Sir, I know that you know the situation is past critical. Here [are] some things you might not know. Hotels are kicking people out, thousands gathering in the streets with no food or water… dying patients at the DMAT tent being medivac. Estimates are many will die within hours…” Brown’s response, in its entirety, was: “Thanks for update. Anything specific I need to do or tweak?” That’s a sense of distance and personal impunity distilled to its purest form.

We should acknowledge that even foreseeable problems can be inherently hard to prepare for. A pandemic, for example, is predictable only in broad outline. The specifics are unknowable. “What disease? When? Where?” says Margaret Heffernan, author of Uncharted. “It’s inherently unpredictable.” The UK, for example, ran a pandemic planning exercise in 2016, dubbed “Exercise Cygnus”. That forethought is admirable, but also highlights the problem: Cygnus postulated a flu pandemic. Some of the implications are the same: we should stockpile personal protective equipment. Some, though, are quite different.

In any case, those implications seem to have been ignored. “We learnt what would help, but did not necessarily implement those lessons,” wrote Professor Ian Boyd, a scientific adviser to the UK government at the time, in Nature in March. In many sectors of government, the policy “medicine” prescribed was deemed so strong that it was “spat out”. Being fully prepared would have required diverting enormous sums from the everyday requirements of a medical system that was already struggling to cope. The quest for efficiency – in the NHS and elsewhere – leaves us vulnerable. The financial crisis taught us that banks needed much bigger buffers, but few carried the lesson over to other institutions, like hospitals.

“It is all too easy to draw the wrong lessons from near misses. We were fine before and we’ll be fine this time”

“On a good day, having 100% of your intensive care beds in use looks efficient. The day a pandemic strikes is the day you realise the folly of efficiency. You’ve got to have a margin,” says Heffernan. Preparedness is possible but it means spending money on research that may never pay off, or on emergency capacity that may never be used. Four years ago, philanthropists, governments and foundations created the Coalition for Epidemic Preparedness Innovations. Cepi’s mission is to develop systems that could create vaccines more quickly. While the world chafes at the idea that a Covid-19 vaccine might take months to deploy, such a timeline is unthinkably fast by historical standards. If a timely vaccine does arrive, that will be thanks to the likes of Cepi.

Perhaps this pandemic, like the financial crisis, is a challenge that should make us think laterally, applying the lessons we learn to other dangers, from bioterrorism to climate change. Or perhaps the threat really is a perfectly predictable surprise: another virus, just like this one, but worse. Imagine an illness as contagious as measles and as virulent as Ebola, a disease that disproportionately kills children rather than the elderly. What if we’re thinking about this the wrong way? What if instead of seeing Sars as the warning for Covid-19, we should see Covid-19 itself as the warning? Next time, will we be better prepared?

A longer version of this article appeared in the Financial Times © Financial Times Limited 2020

New Orleans’ near-miss 11 months earlier should have prepared it for Hurricane Katrina

2013’s Ebola outbreak: a pandemic warning shot
