Slices and holes

Last year was the 40th anniversary of the Union Carbide Bhopal disaster, which caused the deaths of an estimated 16 000 people over the weeks that followed and injuries to half a million more. Like many major disasters, this was the result of many failures, both large and small.
As part of last year’s February OECD Chemicals and Biotechnology Committee, Professor Fiona McLeod gave an enlightening presentation on the causes of the Bhopal disaster and its impacts and ongoing consequences. It is clear that a number of factors determined the extent and impact of the disaster – engineering failures, the degradation of equipment that was at the end of its life, lowered maintenance standards because the factory was soon to be closed, staff dismissing instrument readings, political unrest leading to delays in processing chemicals, the fact that people were living right up against the factory, poor community information about what to do in a disaster, and so on. While it was water entering the pesticide tank that created the leak, the disaster was caused by a whole range of factors coming together.
Similarly, the Chernobyl nuclear disaster was the result of a range of issues that came together to cause catastrophe: a last-minute change to the timing of a procedure, problems with the reactor configuration, choices made by the control room, the use of bitumen on the roof contrary to safety regulations, a lack of information for first responders… Again we see a situation in which multiple factors turn an accident into an enormous disaster. Yes, a nuclear reactor meltdown is always going to be catastrophic, but why it happens, and how much worse than the minimum it becomes, depend on a whole range of smaller factors.
The significant mortality on board the Awassi Express, highlighted in a 60 Minutes episode, was similarly the result of a range of factors and failures. There was a high stocking density (we later reduced the allowable level for sheep shipments), wool length was likely longer than it should have been, there were issues with the ship’s ventilation, and then the ship was stuck outside port in extremely hot weather. The scale of the mortality was the result of a combination of factors which made what could have been bad much, much worse.
The phenomenon which led to these incidents being as bad as they were is commonly known in risk analysis as the “Swiss Cheese” model, first articulated by James T. Reason. The idea is, essentially, that layers of risk management can be seen as being like slices of Swiss cheese – mostly effective but with some holes. By using a number of “slices”, the “holes” effectively get covered, resulting in protection and lower risk. When major disasters occur, it is because some of these slices have been removed and the holes have lined up, leaving a clear path for a disaster to make its way through the centre. There is a lot more to Reason’s model, but this is the essential idea.
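To put rough numbers on that intuition, here is a minimal sketch – my own illustration, not drawn from Reason’s work or from any regulatory framework – that treats each slice as an independent layer of defence with some probability of having a hole in the hazard’s path, and estimates how often every layer fails at once. The probabilities and the function name are invented purely for illustration.

```python
import random

def chance_holes_line_up(hole_probs, trials=100_000, seed=42):
    """Simulate independent 'slices' of defence.

    hole_probs: for each slice, the (illustrative) probability that its
    hole sits in the path of a given hazard. Returns the estimated
    probability that every slice fails at once - i.e. the holes line up
    and the hazard passes straight through.
    """
    rng = random.Random(seed)
    disasters = 0
    for _ in range(trials):
        if all(rng.random() < p for p in hole_probs):
            disasters += 1
    return disasters / trials

# Three slices, each effective 90% of the time: the hazard gets through
# all of them only about 0.1 * 0.1 * 0.1 = 0.001 of the time.
print(chance_holes_line_up([0.1, 0.1, 0.1]))

# Remove one slice and the risk jumps by an order of magnitude.
print(chance_holes_line_up([0.1, 0.1]))
```

Of course, the independence assumption in this sketch is exactly what breaks down in the cases above: a plant being run down, or a ship delayed in hot weather, degrades several slices at the same time, which is why the holes can line up far more often than simple multiplication would suggest.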
What does this mean for us as regulators? First, I think it is critical that we understand how each part of our regulations – or slices – contributes to the prevention of harm. What are the gaps – the holes – that each leaves? How do we ensure that the holes can’t line up in that disastrous way again? In the case of live sheep exports, the view was taken that one hole that could not easily be covered was the heat of summer in the Middle East, and so the slice created to cover it was a ban on exports during the Northern Summer period. Covering the hole where the greatest risk lies made the other issues, while still important, less consequential. Managing these kinds of regulatory systems is about understanding and dealing with risk. If we, as regulators, don’t understand the risk we are presented with, we will struggle to make sensible decisions about where discretion is possible. If regulatory policy makers don’t understand it, they won’t know how to craft the cheese. Regulatory systems are not perfect, but we need to ensure our cheese has as few holes as possible.