On July 6, Canadians woke up to the news of the derailment and fire at Lac-Mégantic, Que. — a tragic accident on such a scale that it was featured in news media around the world.
The facts are not fully known at this stage, but it looks as if the proximate cause was a runaway unmanned train that derailed at the curve in Lac-Mégantic’s town centre. Discussion has encompassed related issues such as single-person operation, train braking systems, speed limits, tank car design and the relative risk of rail versus pipeline.
As with the explosion at the West Fertilizer Company in Texas this past April, what goes through my mind in a case like this is not so much the details of the immediate cause as the broader issues: How well did the organizations in control understand the risks they were managing? How sound were the systems they had in place to control those risks?
The first step — understanding the risk — focuses on the engineering aspects of hazard control, where the emphasis is on identifying hazards and the technical basis of possible failure. Two Chemical Institute of Canada publications are useful. One is the Process Safety Management Guide, which gives an introduction and overview of the scope of what is involved for those new to the field. The other is the Process Safety Management Standard, which provides a more comprehensive description, allowing those who are more advanced to audit their performance.
The second step — the soundness of the management systems — includes not only how the systems are designed to work but also how they work in actual practice, which may be quite different. The focus here includes sociological aspects such as individual and organizational behaviour. It includes, for example, the idea that people at all levels in an organization are driven by a variety of motives as they attempt to balance conflicting individual and group priorities. It also considers that they are not perfect, but will make errors from time to time, some of which may be noticeable while others — often management decisions — set the stage for something to go wrong only if a certain set of conditions occurs at some time in the future. The goal is to anticipate how the systems can fail and to incorporate ways of detecting failure that trigger corrective action before loss of control leads to disaster.
I know that some will be thinking of these broader implications as they digest the lessons of Lac-Mégantic and actively look for vulnerabilities in their existing systems. However, based on past experience, I suspect that there will be some who think, “Lac-Mégantic was a rail incident and we’re not running a railway so it doesn’t apply to us.” There will probably be others whose main concern will be to assure management that, in their case, everything is under control and the existing systems are fine. Yet this is where such concepts as James Reason’s “Swiss cheese” model and “normalization of deviance” come into play (everyone involved in safety should be familiar with these).
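For readers who want to see why these two concepts matter in numbers, the short Python sketch below works through the arithmetic of the Swiss cheese model. Everything in it is hypothetical: the four layers, their failure rates and the assumption that they fail independently are made-up illustrations, not findings about Lac-Mégantic or any real operation. The point is simply to show how normalization of deviance, each barrier quietly degrading a little, multiplies the chance that the holes eventually line up.

    # Illustrative sketch only: made-up numbers, not data from any investigation.
    # It shows the arithmetic behind James Reason's "Swiss cheese" model:
    # several imperfect defensive layers, and the chance that their holes
    # line up on a given operating day.

    def p_all_layers_fail(hole_probabilities):
        """Probability that every defensive layer fails at the same time,
        assuming (hypothetically) that the layers fail independently."""
        p = 1.0
        for hole in hole_probabilities:
            p *= hole
        return p

    def p_at_least_one_alignment(p_align_per_day, days):
        """Probability of at least one full alignment over a period of operation."""
        return 1.0 - (1.0 - p_align_per_day) ** days

    # Four hypothetical layers (say, handbrakes, air brakes, inspection,
    # procedures), each failing on 1 day in 100.
    disciplined = [0.01] * 4
    p_day = p_all_layers_fail(disciplined)
    print(f"Disciplined operation: {p_day:.2e} per day, "
          f"{p_at_least_one_alignment(p_day, 3650):.4%} over 10 years")

    # "Normalization of deviance": each layer quietly drifts to failing 1 day in 20.
    normalized = [0.05] * 4
    p_day = p_all_layers_fail(normalized)
    print(f"After drift:           {p_day:.2e} per day, "
          f"{p_at_least_one_alignment(p_day, 3650):.4%} over 10 years")

With these made-up numbers, letting each of four barriers slip from failing one day in 100 to one day in 20 raises the chance of a full alignment over a decade from a few thousandths of a percent to a couple of percent, an increase of several hundred times. That multiplication of quiet, individually tolerated slippage is the quantitative punchline behind both concepts.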
The scale of the tragedy at Lac-Mégantic — 47 lives lost and destruction of the heart of the town — will be felt by those in the community for many years to come. Regardless of the outcome of the investigation, many lessons will emerge. Some will be readily apparent but others are likely to need serious dialogue between the parties — shippers, carriers, communities, regulators and others — as society feels its way toward a balanced response. The wake-up call from Lac-Mégantic should drive the key decision-makers to greater understanding and control of their operations rather than hunkering down and waiting for a return to business as usual.
Graham Creedy, FCIC, is a professional engineer in Ontario and a member of the CIC Process Safety Management Division. He teaches management of safety and health risk part-time at the University of Ottawa.