There has been criticism recently of the decision made by emergency services in response to the tragic eruption on Whakaari/White Island in New Zealand. Some local pilots returned to the island to attempt a rescue, while emergency services did not because of the risk involved. While the local pilots showed undoubted bravery, was the criticism of emergency services warranted?
Consider a more mundane situation – imagine you have a large order to fulfil against a fixed deadline. There are two key roles in your warehouse to meet the order requirements – stock pickers and forklift truck drivers. The stock pickers take the selected products from their storage locations and put them onto pallets in the appropriate combination for the customer shipments. The forklift truck drivers then take the pallets and load them onto the delivery trucks. Fairly straightforward, and repeated in thousands of warehouses. Because of the time pressure, the forklift and stock picking supervisors both add an extra person to the task.
Question – is this safe or not?
The answer, of course, is we don’t know and can’t know. There is not enough information. So let’s move forward with two alternative scenarios (or, for any Terry Pratchett fans, bifurcate down different legs in the trousers of time).
Scenario 1 – the increased resource allows the packing and movement to be completed more quickly and the shipment is made ahead of the deadline (but only just). This is deemed a Success, and a review identifies the allocation of additional people as crucial to meeting the deadline, for which the supervisors are praised.
Scenario 2 – the extra forklift makes things a bit congested and, in trying to avoid a collision, one driver steers too sharply and the truck tips slightly, dropping a box onto the extra stock picker, who was standing over the edge of the painted walkway while letting the other stock picker past. This is deemed a Failure, and a review identifies the allocation of additional people as a causal factor, for which the supervisors are reprimanded.
Aside from being a typical example of safety management hypocrisy, why is this relevant? In quantum mechanics (stay with me here), a particle can exist in more than one state until it is observed¹. The observation forces it into one state. Schrödinger’s cat is a thought experiment in which a cat is sealed in a box with a lethal trap triggered by a radioactive atom’s decay. When the box is opened, the observation forces the atom into one state or the other, and the cat can be seen to be either dead or alive. But, and this is the difficult-to-get-your-head-around part, while the box is closed and not observed the atom exists in both the decayed and undecayed states and so, by extension, the cat is both alive and dead, until we open the box.
By analogy, completion of the shipping job can be viewed as opening the box to see the state of the work. Our warehouse can be viewed as both safe and unsafe, both a success and a failure, until the job is completed and the outcome finally observed. I call this Schrödinger’s safety.
We only know whether it was safe when we’ve finished (when we open the box); we cannot know at the time the decision is made, because every complex decision is both good and bad.
This is, of course, just a fancy way of warning against hindsight bias, but it does help to explain how successes and failures can arise from the same source, and why focussing purely on failures is, at best, incomplete and, at worst, damagingly misleading. Perhaps it can also stop people from dissecting decisions made in complex environments and unreasonably criticising those who had to make them at the time, with no way to look inside the box.
¹ This is over-simplified; there are different theories about the roles of the observer and the particle, and about how states ‘collapse’ or otherwise change. The cat is intended to show the difficulty of trying to apply quantum thinking at a macro level. Disclaimer – my physics has largely lain still and gathered dust for over 25 years.