Last month’s column used the BP oil spill as an example of the risk associated with a “good enough” standard. Revolutionizing your corporate culture to embrace a “never fail” standard may seem unnecessarily expensive, but the BP spill tragically reminds us that a tolerance for failures, even small ones, can lead to disaster.

Robert Bea, an engineering professor at the University of California, Berkeley, and co-leader of a scientific team that investigated levee failures in the wake of Hurricane Katrina (five years ago this month), has documented events leading up to the April 20 explosion on the Deepwater Horizon rig. Here are a few of his findings:

• Although “kicks” of natural gas are common, several kicks had stalled work in the weeks before the explosion, and one intense kick had forced a shutdown for fear of an explosion.
• Shortly before the explosion, after a debate among the engineers, the decision was made to replace the heavier drilling mud, which acted as a defense against kicks, with lighter seawater.
• A year earlier, BP had labeled intermittent problems with pockets of natural gas that forced their way up the drill pipes a “negligible risk,” even though the government warned the company to “exercise caution,” as this was a “real concern.”
• Employees repeatedly warned supervisors of impending problems but were overridden by executives far from the rig. Ironically, on the day of the explosion, engineers and officials were on the rig celebrating seven years of perfect safety performance; a number of them were injured or killed.

The investigation reveals a pattern of diffuse responsibility. The rig owner, Transocean, was involved in the decision process, and the contractor, Halliburton, used faster-curing cement for barriers intended to keep gas out of the well. Blowout-preventer valves failed, so the well could not be shut off. The Minerals Management Service, the federal agency charged with oversight, had approved the drilling plans.

Bea draws a parallel between the failure of decision-makers involved with both the oil spill and the Katrina levee failure to consider “residual risk,” defined as the things planners don’t believe will fail. He accuses engineers in both cases of “imagineering,” by considering only part of the potential risk scenario. (For more details of Bea’s findings, see the May 10, 2010, article by David Hammer and Mark Schleifstein in the New Orleans Times-Picayune.)

How does this relate to managing an electrical contracting business? It confirms the danger inherent in accepting errors. A pattern of mistakes may lead to a catastrophic loss of life, resources, reputation and possibly your business. Research by behavioral economists shows that people wildly over- or underestimate numbers and margins of error when calculating risks, and social scientists find that experts exhibit the same tendency.

In his book, “Catastrophe: Risk and Response,” Richard A. Posner argues that the human brain pays more attention to more recent events and to evidence that seems to confirm the likelihood of a catastrophe. Our failure to understand risk causes us to underprepare for disaster, so we may decide to purchase inadequate insurance coverage or rebuild a destroyed home in the same floodplain.

You can improve your analysis of events that may cause significant loss of company resources or reputation. Matt Evans, a CPA in Arlington, Va., offers a series of free Excel spreadsheets. Click on No. 30, Risk Analysis, to explore his format for calculating factors related to “outages,” incidents that result in significant lost time or a shutdown of your business.
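The basic arithmetic behind any such risk worksheet, scoring each potential outage by its likelihood and its cost, can be sketched in a few lines. This is only an illustration of the general expected-loss calculation, not a reproduction of Evans’s spreadsheet; the risk names and numbers below are hypothetical.

```python
# Hypothetical risk register: (risk name, annual probability, estimated cost if it occurs).
# These entries are illustrative only, not drawn from any real contractor's data.
risks = [
    ("Major jobsite injury", 0.05, 250_000),
    ("Minor lost-time strain", 0.60, 8_000),
    ("Equipment theft", 0.20, 30_000),
]

def expected_loss(probability, cost):
    """Expected annual loss: likelihood of the outage times its cost."""
    return probability * cost

# Rank risks by expected annual loss, highest first.
ranked = sorted(risks, key=lambda r: expected_loss(r[1], r[2]), reverse=True)
for name, p, cost in ranked:
    print(f"{name}: expected annual loss ${expected_loss(p, cost):,.0f}")
```

Note how the arithmetic echoes the column’s point about small failures: a frequent minor strain (0.60 × $8,000 = $4,800 a year) lands in the same order of magnitude as a rare catastrophe, so a pattern of “little” incidents deserves a line on the worksheet, too.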

What else can you do? You can pay attention to the little things. For example, your insurance agent will tell you that several minor injuries often have a greater impact on your workers’ compensation premiums than one major incident. In other words, a company that tolerates a pattern of injuries at any level is likely to have a major problem down the road. After finding a pattern of minor muscle and back strains, one major contracting company implemented a few minutes of stretching for all employees on the job site. The reduction in minor lost-time injuries made the investment in labor time worthwhile.
Listen to your employees. Decisions at the lower levels of supervision are the most frequent cause of disasters. Train supervisors to be more aware of potential risks, and insist that they listen to workers who raise questions about safety. Avoid the temptation to overrule employees solely on the basis of cost.

Beware of unusual situations. Catastrophes result when rare events or unexpected interactions occur together. Watch for conditions that make you uneasy, especially if they are beyond the control of your company. Most importantly, don’t become complacent, especially near the end of a successful project. Overconfidence sank the Titanic, and it can do the same to your company, unless you pay attention. Remember, only the tip of the iceberg is visible.

NORBERG-JOHNSON is a former subcontractor and past president of two national construction associations. She may be reached at