>> Saturday, June 5, 2010
Aron asked: Why on earth would BP, or anyone, choose not to implement a safety feature if it is reasonable to put it in?
Ah, Aron, why indeed. Except people do. Time after time after time after time. BP did so egregiously, obviously, and it serves as an example. But there are untold numbers of examples daily. Ground tests are skipped for new designs ("passed by similarity"), flight tests are cut, second and third tier equipment is purchased for critical functions because we can't wait or because they're too expensive. We have little beeps in our cars to let us know we've left the keys in or the lights on, but we don't have anything to tell us we've left a child in the backseat (though it would be child's play to do) - as if a dead battery outranks a dead child.
Why do we do this?
Well, there are a number of reasons.
(a) Money now vs. risk later. This year's budget is tight and we gamble that the piper will never have to be paid. Statistically speaking, catastrophic failures are rare, so you gamble this won't be one of them, even if the steps to prevent it are inexpensive and simple. That drop of prevention, while nothing compared to the profits BP (and the others involved) make, is one more item on the bottom line that someone, somewhere is willing to forgo because they think the risk is minimal. Note that "minimal risk" is usually focused on likelihood - what are the chances something bad will happen - rather than on the extent of the horror, a realistic understanding of what could happen if you have a catastrophic failure. There are some good things about probabilistic risk assessment (quantitative estimates of risk) - it's great for discovering which risks require mitigation - but too often it becomes a justification for doing nothing even when the fix for a problem is readily available and/or inexpensive.
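The likelihood-versus-consequence point can be illustrated with a toy calculation (the failure modes, probabilities, and dollar figures below are all invented for illustration, not drawn from any real risk assessment):

```python
# Toy illustration: ranking risks by expected loss (probability x cost)
# alone can bury low-probability catastrophes. All numbers are made up.

risks = [
    # (name, assumed probability per year, assumed consequence in $)
    ("valve sticks",            0.10,      1_000_000),
    ("pump wears out",          0.05,      2_000_000),
    ("blowout preventer fails", 0.000001,  20_000_000_000),
]

for name, p, cost in risks:
    expected = p * cost  # expected (average) annual loss
    print(f"{name:>24}: expected loss ${expected:>10,.0f}, "
          f"worst case ${cost:>14,}")
```

By expected loss, the catastrophic failure ranks last here, even though its worst case dwarfs everything else combined - which is exactly how a likelihood-focused assessment can rationalize skipping a cheap safeguard.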
(b) Doing the bare minimum. This is why self-regulation has a horrible track record (see the recent financial fiascos). Too often people believe they are responsible only for the barest minimum of the requirements, no more. Even if doing the right thing takes only an inconsequential amount of extra effort, it's not in their contract. If no one requires it, they won't do it - especially if no one's looking. Instead, they'll try to do as little as possible (sometimes not even the minimum) and fight against standards and regulations that require things to be done properly, because what they save is money in their pockets unless something bad happens.
(c) Somebody else's problem. Passing the buck. No one has responsibility for the whole thing. The people paying for setup or safety equipment aren't the ones who will have to do the cleanup or damage control. Most people are focused on their own bottom line, doing the bare minimum to pass muster and passing it on, hoping it fails on someone else's watch. The people juggling a test and verification budget might not be the people responsible for the flight or the implementation.
(d) It can't happen to me syndrome. The same thing that keeps people from buying "baby in the car seat" alarms. We're too diligent, too experienced, too expert to fail. We're using "proven hardware," or we're so expert that we give ourselves credit we haven't demonstrated in testing and forgo safeguards or oversight designed to catch unexpected failures. Failures that fail to manifest quickly (even though they can change the world, as this one appears to be doing) are seen as "noncredible" or "impossible." Frequently, safety precautions (inspections, tests, fail-safe equipment, etc.) that were once deemed required come to be seen as a needless luxury, so much unnecessary flotsam that can be disposed of without a qualm.
(e) Someone else will pick up the tab if the worst happens. This does happen. Oil companies actually have caps on what they'll pay for this kind of disaster (though it seems BP is going to ignore its cap and keep paying), plus insurance and government intervention, yadda yadda yadda. What it really is, though, is an excuse to sidestep responsibility. And it ignores the public relations impact on your company, your business, your government endeavor. I've pointed out to NASA that failure means more than the billions lost in equipment and personnel - the PR impact is incalculable. I have to say the personnel alone should be enough, but I'm like that.
So, what does all that mean? Not a damn thing. No excuses are enough. Not for BP, not for NASA, not for the financial hotshots, not for any of the other organizations when they have the capacity to do things right and choose not to. When the results are unacceptable, no failure should be tolerated. If you can do something to prevent it, even if it's overkill, you should. If you have a solution for a problem, a risk that can be eliminated, you should make it so.
For want of a nail... I can't begin to estimate the costs, the heartbreak, the embarrassment accumulated through the ages by people who could readily have prevented tragedy, only to fail spectacularly because they didn't do the best they could.
It shouldn't be acceptable.
In my opinion.