Here is an abstract from the CIM conference in May 2011. Beware: this is a baffling abstract, and it is hardly clear what the point is, but somehow I get the impression there is a new idea in there. If only the authors were forced to prepare full papers instead of being let off the hook with some pusillanimous abstract. This abstract proves the stupidity of lazy organizations and presenters in wasting their time and ours with half-baked abstracts. The CIM, and hence the Canadian mining industry, should be ashamed of themselves for cutting down trees to print paper volumes of such trivial stuff.
Nevertheless, here is the abstract. If only we could get the full paper and judge its merits properly. (I edit in what is probably a vain attempt to improve readability and comprehension.)
Organizations are like the Titanic: our delusions let us enter dangerous waters at full speed, ignoring obvious signs of impending catastrophe. This paper analyzes seven delusions in disaster-prone organizations such as BP Texas City, Bhopal, and Piper Alpha, and possibly yours. The delusions are: risk control; compliance; consistency; human error; predictability; statistical trend; and invulnerability.
The seven deadly delusions are not readily visible in an organization, mostly because there are no internal mechanisms to identify them. The problem is that role players such as managers, supervisors, and employees all have a vested interest in positive outcomes, positive trends, and good news about safe work performance.
The massive focus on positive safety in today’s business world, and the lack of self-critical analysis and understanding of the complex dynamics of safety, all contribute to the problem. Add to this mix the peddling of simplistic safety solutions, such as the notion that a human being can be reduced to a simple ABC (antecedent-behaviour-consequence) model, as espoused by behavioural safety psychology, and you have an explosive concoction.
See what I mean: this rambling blather does seem to contain the seeds of an idea. But perhaps we will never see the final paper, and so will never be able to judge it objectively or decide on the basis of insight.
Yet this very week I was pulled into a risk assessment session with a client. It was an embarrassing mess. The facilitator did not understand the new system imposed by head office. The participants had even less of a clue as to what was expected of them. So we spent an hour or two blathering about the risk that we would not get the permits required to move forward.
I tried in vain to persuade those present that upper management was more interested in this question: are the risks of the project we have just approved so great that maybe we should not have approved it, or that at the least we should adopt a different path forward from the one our consultants recommend?
Nobody could grasp my point. Maybe I was inarticulate. Maybe they did not want to face the issue. Either way, we did not do a risk assessment of the project, however you phrase it. But who cares? There is a filled-in matrix, a table, and all the fancy charts the process demands. Nobody will question the outcome. It is not in their best interests to do so.
PS. The abstract lists C. Pitzer of SAFEmap International as the author. Maybe you can seek him out and ask him to send both of us his full paper.