Thank you everyone for sharing your experiences and thoughts.
IRstuff
You touch on two topics that I feel are very critical: 1) the operator being able to override the shutdown system, and 2) maintenance bypasses / operator overrides.
These two topics, in the subset of safety incidents that I have seen (I say subset because, of course, we did not look at every incident in history), have caused more than 50% of the accidents in my industry (petroleum, oil, and gas).
During an upset, the operator is busy and stressed - not the best climate for making detailed, complex decisions. Often a sub-optimal decision is made, sometimes leading to an accident.
Putting things in bypass, applying overrides, forcing values, etc. is also a common practice. Unfortunately, this sometimes leads to a hazardous condition, because the information that is suppressed is sometimes needed by others, or other users are simply not informed of the bypass or forced value.
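To make the point concrete, here is a minimal sketch (purely illustrative - not any particular vendor's system, and the names are my own) of a bypass register in which every force must carry an owner, a reason, and an expiry time, and active bypasses are annunciated to everyone, not just the person who applied them:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Bypass:
    tag: str             # instrument or interlock being forced, e.g. "TT-101 high trip"
    forced_value: float  # value presented to the logic while the bypass is active
    owner: str           # who applied the bypass
    reason: str          # why it was applied
    expires: datetime    # no open-ended bypasses

class BypassRegister:
    """Illustrative only: every force is recorded, visible to all users, and time-limited."""

    def __init__(self):
        self._active: dict[str, Bypass] = {}

    def apply(self, bypass: Bypass) -> None:
        self._active[bypass.tag] = bypass
        self._annunciate(f"BYPASS APPLIED: {bypass.tag} by {bypass.owner} "
                         f"({bypass.reason}), expires {bypass.expires}")

    def expire_stale(self, now: datetime) -> None:
        for tag, bp in list(self._active.items()):
            if now >= bp.expires:
                del self._active[tag]
                self._annunciate(f"BYPASS EXPIRED: {tag} - trip function restored")

    def active_summary(self) -> list[str]:
        # shown on every operator console, not just the one that applied the force
        return [f"{b.tag}: forced to {b.forced_value} by {b.owner}"
                for b in self._active.values()]

    def _annunciate(self, message: str) -> None:
        print(message)  # stand-in for an alarm/event-log broadcast
```

The design intent is simply that a bypass is never a private, open-ended action: it always expires, and everyone can see it.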
I no longer work in a plant and, unfortunately, do not have access to data to substantiate these thoughts.
Has anyone else come across similar data or experiences?
GTstartup
You are advocating something that, for my industry, is quite new: that the safety system (computer) can and should override the human operator (or lock them out). Many of the safety experts I worked with on my last job advocate this approach. In times of stress the operator often does not make the correct decision, so the safety system makes it for him.
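As a thought experiment, the philosophy might look something like the sketch below (illustrative only, not the actual logic of any plant I worked on): operator judgement is honoured in the pre-trip band, but once the hard trip limit is exceeded the safety function acts regardless of any override request.

```python
def shutdown_demand(value: float, alarm_limit: float, trip_limit: float,
                    operator_override: bool) -> tuple[bool, str]:
    """Illustrative sketch: returns (trip, message)."""
    if value >= trip_limit:
        # Hard trip: executes regardless of any operator override or lockout request.
        return True, "TRIP: hard limit exceeded, operator override ignored"
    if value >= alarm_limit:
        if operator_override:
            return False, "ALARM: pre-trip band, operator override accepted"
        return False, "ALARM: pre-trip band, operator action required"
    return False, "NORMAL"
```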
The last power plant project that I worked on (two gas-fired GE generators and a Dresser-Rand steam generator) followed this approach, I believe. I didn't realise that this was prevalent in the industry.
waross
Bad data is the bane of all control/safety systems. How we deal with it is more of what I am looking for.
In your example, your control system/PLC gives a timed window to address a failing measurement (in this case, a high stack temperature warning for several weeks).
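My reconstruction of that timed-window behaviour, as a rough sketch rather than your actual PLC code, would be something like this: the warning starts a clock, and if the condition is neither cleared nor repaired within the allowed window, the system escalates on its own.

```python
import time

def timed_window_monitor(read_stack_temp, warning_limit: float,
                         window_seconds: float, escalate) -> None:
    """Illustrative sketch: warn first, then escalate if the warning is
    neither cleared nor addressed within the allowed window."""
    warning_started = None
    while True:
        temp = read_stack_temp()
        if temp >= warning_limit:
            if warning_started is None:
                warning_started = time.monotonic()
                print(f"WARNING: stack temperature {temp} above {warning_limit}")
            elif time.monotonic() - warning_started >= window_seconds:
                escalate()          # e.g. trip the burner or force a maintenance action
                warning_started = None
        else:
            warning_started = None  # condition cleared, reset the clock
        time.sleep(1.0)
```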
In some of the accidents, the problem was that operations could not discern between a failing instrument giving misleading readings and a true high temperature. The result was that the reading was ignored, leading to a failure/accident.
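One common mitigation (not what those plants had - this is purely illustrative) is redundant transmitters with 2-out-of-3 voting plus a deviation alarm, so a single drifting instrument is flagged as a likely failure instead of silently triggering or masking a trip:

```python
def vote_2oo3(readings: list[float], trip_limit: float,
              deviation_limit: float) -> tuple[bool, bool]:
    """Illustrative 2-out-of-3 voting: trip only when at least two of three
    transmitters agree the limit is exceeded; raise a discrepancy alarm when
    any transmitter deviates from the median by more than deviation_limit
    (a likely failing instrument rather than a true excursion)."""
    assert len(readings) == 3
    median = sorted(readings)[1]
    votes_to_trip = sum(r >= trip_limit for r in readings)
    trip = votes_to_trip >= 2
    discrepancy = any(abs(r - median) > deviation_limit for r in readings)
    return trip, discrepancy

# Example: one transmitter drifts to 650 C while the other two read about 420 C.
# No trip (only one vote), but a discrepancy alarm flags the likely failing instrument.
print(vote_2oo3([650.0, 418.0, 422.0], trip_limit=600.0, deviation_limit=25.0))
```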
One of the key takeaways for me from working with the safety experts is this: people must have confidence in the safety system, otherwise it is useless. Operations will bypass a safety system they do not trust, potentially creating a greater problem than if the safety system had been removed altogether.
All your posts are appreciated.
This is a complex topic, and a passionate one for many in the industry.
I certainly welcome and invite more comments, thoughts, questions...
"Do not worry about your problems with mathematics, I assure you mine are far greater."
Albert Einstein
Have you read FAQ731-376 to make the best use of Eng-Tips Forums?