
I've had some free time to write about something that has interested me for a while. I wrote it as a FAQ, since some of the links I found while researching these questions may be of wider interest.

Some questions I wanted to ask:
How do different industries investigate accidents when they do occur?
How do engineers in different industries learn lessons from accidents?
Does legal action after an accident lead to improved engineering?
Are investigations held by governments, courts, or industry more effective?

The FAQ is written as a work-in-progress. Suggestions/criticism are very welcome.
The only topics I have written about are ones I have seen or considered in my professional career, except for civil engineering, which is not my field at all.
The main reason I brought up failures in civil engineering is my ignorance of any consolidated accident databases to refer to... Where is the big picture?
If you ask me of typical causes of aviation accidents, I know exactly where to look, what to look for, how to refine the data...
If you ask me of typical causes of building collapses... blank... civil/structural guys+gals... help?




The results of investigations into structural failures are often kept confidential by the legal process, thus impeding the furtherance of engineering knowledge, which is a shame. There are exceptions, but to my knowledge there is no well-organised database, despite efforts in that direction.


Many high-profile civil/structural disasters (both natural and human-caused) are covered in the weekly magazine "Engineering News-Record" (ENR). As news, investigations, and other developments occur, the magazine publishes updates. This can go on for months. The magazine is well respected in the engineering and construction communities; it has been published since 1917.

From time to time, ENR publishes books which are compendiums of their magazine articles. These present an interesting outline of how and when information on a disaster became available. I have one of their books, "Construction Disasters" (1984). Subjects include the Tacoma Narrows Bridge (1940), the Kemper Arena roof failure (1979), and the Hyatt walkway collapse (1981).

www.SlideRuleEra.net
www.VacuumTubeEra.net


For the US, the NTSB maintains a comprehensive database of transportation accident/failure reports, not just in aviation but in other modes of transport as well. The I-35W bridge collapse in Minnesota comes to mind.


US Chemical Safety Board.

Personally, I find that legal action seems to inhibit learning rather than encourage it. Look at the CSB or the Marine/Air Accident Investigation Branch reports, for example: they seek root causes and preventative actions for whole industries rather than legal accountability for a specific incident.

I rather like parallel investigations - one with all the legal safeguards of rules of evidence and the letter of the law for liability, a separate one to look at how to stop it happening again.



Most codes, standards, regulations, etc. have their genesis in accidents, incidents, or similar events. So although the process may be slow and less than perfect, the engineering community as a whole gets the benefit of the lessons learned (often the hard way) from the investigation of accidents and failures by regulators, law enforcement, coroners, media, and lawyers.

The aviation industry seems to do this relatively well, with quite clear outcomes from accident investigations often being implemented on the ground reasonably quickly, but other industries are often slower and harder to change.

"Any water can be made potable if you filter it through enough money"


Thank you everyone for your suggestions!

I have added many of your ideas to the FAQ.
SlideRuleEra: your reply is posted almost verbatim.
I think I'll be reading the ENR a fair bit in the near future, too.



As far as investigations and learning, I like the five whys approach. Like a kid asking why the sky is blue, keep asking why until you get to the root cause(s). Many of the investigations I haven't been involved in left me with more questions than answers. For example:
Conclusion: Insufficient training was a contributing factor.
Unanswered question: Why was training insufficient? Budget? Resources? Lack of support from upper management?
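The "five whys" drill-down described above can be sketched as a simple chain of cause lookups. This is only an illustration of the questioning technique; the findings and causes in the dictionary below are invented for the example.

```python
# Minimal "five whys" sketch: starting from a finding, keep asking
# "why?" (following recorded cause links) until no deeper cause is
# known or the depth limit is reached. All data here is hypothetical.

def five_whys(finding, causes, max_depth=5):
    """Follow 'why?' links from a finding toward a root cause."""
    chain = [finding]
    for _ in range(max_depth):
        deeper = causes.get(chain[-1])
        if deeper is None:  # no recorded deeper cause: stop digging
            break
        chain.append(deeper)
    return chain

# Invented example mirroring the "insufficient training" conclusion:
causes = {
    "crew error": "insufficient training",
    "insufficient training": "training budget cut",
    "training budget cut": "schedule pressure from management",
}

print(five_whys("crew error", causes))
# The point: an investigation that stops at "insufficient training"
# has stopped two "whys" short of the root cause.
```
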


In your unanswered question you say, "Why was training insufficient? Budget? Resources? Lack of support from upper management?"
Have you also considered pressure from upper management to get a job done?
An example of that would be the Challenger accident.

You are judged not by what you know, but by what you can do.


The small-child-style digging until you get to the root causes is really important. Equally, I think it is also vital to look sideways and see what can be learnt from things that weren't directly causal. For instance:

What was the bigger accident that didn't happen this time? Do we actually need to do something to prevent that? The Cullen report into the 1999 Ladbroke Grove Rail collision identified eight previous closely-related near-misses - each of which had been properly investigated in accordance with the rules and one of which Lord Cullen referred to as a "dress rehearsal" for the eventual disaster.

What things worked well and prevented the accident being worse? Are these things already widespread throughout the industry, or is this a good example for promoting their adoption? A long history of MAIB reports - many showing how quickly fishermen who have gone over the side without lifejackets on drown and a few showing how those wearing the things come back - led to an initiative where UK commercial fishermen were given free lifejackets.

What bits of pure luck prevented the accident being worse? Ought we to make sure they happen on purpose next time? The presence of some US Air Force medical personnel on one of the trains involved in the Harrow and Wealdstone accident in London in 1952, and the on-site triage and stabilisation work carried out by the colleagues they subsequently called in to help, didn't pass unnoticed and played a big role in shaping emergency medical response in the UK in the years that followed.

What bits were actually a complete shambles, but just pulled together far enough to get the job done? (This one is especially sensitive where it touches on rescue efforts, since it has to be critical of those who are often also the heroes of the occasion.) Recognition of the utter lack of coordination among rescuers at Harrow and Wealdstone was a spur to proper emergency planning, the fruits of which may be seen in the Cullen Report - even if there were more lessons to learn (by the time of Cullen, the management of incident response was a standard part of the investigation, something notably absent from the formal Harrow report).

Has anything similar happened before? Are we comfortable with that? Is something changing? Have we underestimated the scale of an issue? One of the advantages of having permanently established investigation teams is their ability to investigate developing trends, whether at sea or ashore.

How hard was it to do the investigation? Did we know who was meant to be doing the investigating? Does that mean that there's a hole in the regulatory system - a place where nobody is responsible for protecting the safety of workers, the public or the environment? Was it hard to work out who to talk to? Does this mean that there's a load of orphan equipment/structures/software in use with nobody taking care of them?

Did everybody cooperate freely? If not, why not? Have we got the "Just Culture" bit right yet? I'm starting to see reports which list all the identified stakeholders, stating explicitly which of them did - and didn't - cooperate fully with the investigation.



This has long been known as the Socratic Method:


Socrates engaged in questioning of his students in an unending search for truth. He sought
to get to the foundations of his students' and colleagues' views by asking continual questions
until a contradiction was exposed, thus proving the fallacy of the initial assumption. This became
known as the Socratic Method, and may be Socrates' most enduring contribution to philosophy.

It doesn't have to be about philosophical ideas. It can be used to penetrate into underlying causes and motivations, and has been used in this way for a very long time.
At some point, philosophy became a "useless" subject in all university education, except for philosophy and law.




Whenever I hear the phrase "Cullen Report" the one which comes to mind is the Public Inquiry into the Piper Alpha disaster, perhaps because of the industries I've worked in. The official report into Piper Alpha is well worth a read - over one hundred separate recommendations came out of the inquiry, and I believe every one of them was adopted. The film "Fire in the Night" is also very thought-provoking.


I'd forgotten how many times he drew the short straw. The school shooting (fortunately a very exceptional occurrence in the UK) at Dunblane gave rise to another Cullen Report.



Yeah, I only found that out myself earlier today - I can remember the Dunblane incident clearly, but didn't know Lord Cullen led the inquiry. Thanks for the link.


SparWeb, shout if you think this is drifting too far off the intent of the thread, but....

One particularly tricky area comes where there's some reason to suspect negligent or reckless behaviour on the part of people who died in an accident, but insufficient evidence to prove it. How far do you blacken the name of those who are in no position to defend their actions? At opposite ends of the scale here are two cases:

In the UK, military accident reports have much more of a judicial element about them than most of the investigation regimes we've discussed so far: there's a clear expectation that if there's blame to be found, the Board will call it out, and there are defined rules of evidence governing how that may be done. Once the investigation is complete and the report written, there are two layers of review where officers more senior than the original Board of Inquiry each comment on the Board's findings, having had the opportunity to review both the report and the entire bundle of supporting evidence. In practice, at each of these two levels the whole lot is tossed in the direction of somebody in the Outer Office who has an interesting few days trying to work out, from the pile of statements and evidence, whether there's a complete picture to be had and whether that matches the conclusions drawn by the Board, before drafting a review statement for the Senior Officer. What is eventually published is the original report, with both reviewers' comments appended to the back.

The 1994 Crash of Chinook ZD576 into the Mull of Kintyre was one where the aircraft was flown into rising terrain by a crew who would normally have been perfectly capable of flying round it. There was no evidence to show why this should have happened and, after the investigators had written an inconclusive report, the Senior Reviewing Officers introduced a finding of Gross Negligence by the aircrew - one that turned out to be highly controversial.

Looking at a more recent accident where all the aircrew died, the 2013 crash of police helicopter G-SPAO onto a pub in Glasgow was investigated by the AAIB. They identified that both engines had flamed out over a built-up area after the Supply Tank had been allowed to become empty of fuel. There was fuel left in the Main Tank, but both Transfer Pumps had inexplicably been switched off. Although there was ample evidence of a number of warnings having operated, there was none of any effective emergency procedures having been carried out (although both Prime Pump switches, just to the left of the Transfer Pump switches, were found to be anomalously switched on). At that point, after noting that individual Transfer Pumps sometimes needed to be switched off temporarily if the helicopter spent a long time orbiting a target, and that the Police Observers could - but would normally not - be asked to operate helicopter equipment, the report doesn't attempt to speculate about what might have happened in the cockpit. In complete contrast to the earlier Chinook report, the possibility that management of the fuel system had been abdicated to, and subsequently mishandled by, the Observer (which was my immediate reading-between-the-lines when I saw the report) isn't raised at all. Inevitably, this approach didn't find favour either with a public that likes a clear-cut and complete explanation after a tragedy.



You are bang-on the topic, since the way that findings are made and how they're reported are essential to the lessons that others can learn. For those within the process, such as the executive review following the Chinook accident, or the investigators themselves, there may be opportunities to provide the "minority" opinion, but in other cases there may not - and conflicting opinions among the investigators can lead to inconclusive reports such as the police chopper crash. I've never seen a space for such a minority or contradictory opinion in a civilian accident report, but it seems to me the Senior Officer review in the UK military process you described would permit that.

The G-SPAO AAIB report is a good read, but it is missing information that I want to see. Yes, another case of allowing an inconclusive finding to remain inconclusive. Notably absent from the report is a photograph of the switch panel that holds all of the subject switches. For the reader to evaluate the likelihood that the two Prime switches were flipped rather than the Transfer switches, a photograph would provide essential information, rather than the ambiguous statement "...The fuel transfer pump switches are, however, in close proximity to the prime pump switches, so unintentional selection of the inappropriate switches was possible..." A quick Google search pulls up a photo of the overhead switch panel. I don't think anyone would mix up the two pairs of switches while looking straight at them and paying attention, even in a darkened cockpit, but what about a distracted second officer, whose attention is actually on the FLIR camera...?

I also have mixed feelings when these reports quote statistics about the rarity of such events. Clearly the manufacturer of the helicopter wanted that data to be included, and the investigators deemed it relevant enough to include, but it doesn't help anyone figure out why THIS flight had the switches wrong. The finding of pilot error is based on his failure to attempt an autorotation to arrest the descent of the helicopter, and this is a sufficiently grave error without regard to the switch status.

Something else to bear in mind: the actions of emergency responders at the scene of an accident. Depending on their training and experience, and the rarity of events like aircraft accidents in urban settings, these people may or may not take action to secure the scene, prevent subsequent fire, etc. Battery cables may be cut, fuel valves shut, and extracting victims from the wreck disturbs many other things. I would not put it past a constable who arrives at the scene to have some knowledge of his city's police helicopter, and some idea where the fuel switches are, to shut them OFF before chopping the battery cables. It is post-accident changes to the wreckage such as these that make investigators distrustful of things like switch positions, so they seek other ways to determine the status of any relevant system independent of the position of the switch found in the wreck. In Canada, the air regulations state that, following an accident, the wreckage shall not be disturbed beyond what is necessary to rescue victims and secure the site from further damage. There is probably similar law in other countries.

Back to your first point, ZF: the negligence factor. There are a lot of old stories, such as the one about the pilot who has just landed with the gear still up... the first thing he does is move the switch to the "down" position... and other silly things like that. One power that these investigation boards do NOT have, and this is critical to getting survivors to cooperate, is any judicial or punitive power. How can you expect anyone to tell the truth if they think they could be held at fault, legally? Unfortunately, cooperating with the TSB (in Canada) does not give you any immunity from the courts later.

Thank you again for the stimulating ideas!



Scotty, that's a great link, and I'll include it in the FAQ.
There's a thorough report on the Buncefield accident in one of the free PDF issues. Lots of lessons to learn there!



Glad it is of use. I have searched and searched for a copy of the Piper Alpha report but frustratingly it seems to be hard copy only. It's a damning critique of an entire industry as it was at the time, and it was the biggest game-changer in UK safety legislation since the Robens Report which resulted in the Health & Safety at Work Act 1974.


Not sure this is entirely on topic - different codes on the same physical thing in different industries.

In Germany, there's a rule for biogas plants that condensate pits may not have ladders. Condensate pits for sewage gas (same thing, different source) have no such rule. Condensate can exude gas (CO2, H2S) which can kill or incapacitate a person quickly.

This is less stupid than it sounds: most biogas plants are on farms, operated by farmers, who theoretically know all about the dangers of pits and shafts (silos and manure pits are also dangerous), but accidents still happen - possibly because the attitude of many farmers to safety is a bit cavalier. So the Berufsgenossenschaft (the body responsible for work-related health insurance and some safety regulations) mandates no stairs/ladders in condensate shafts.

Wastewater treatment is mostly a municipal affair; the plants (mostly) have enough staff, the staff is better trained and equipped for work in dangerous atmospheres, and maybe the mentality is different.

(Safe procedure is basically to have the person in the shaft with air supply and in harness, chained to a tripod so a second person can pull the first out.)
