A Review

The author’s concern regarding aviation safety centers on the poor communication skills exhibited by failed cockpit crews. Helmreich cites several accidents in which crew members sensed problems but failed to communicate their concerns urgently or clearly. A professor of psychology at the University of Texas at Austin since 1966, he has taught interpersonal communications. Helmreich’s study concluded that crews tend to mitigate potential problems, but that certain situations can lead to disaster:

1. Focus on the wrong problem.
2. Failed communications.
3. Technology and its misuse in the cockpit.

The root cause of these incidents is the failure of redundant systems. For example, a copilot who always cedes to the judgment of the pilot hurts the team because no redundancy check remains. The author reviews the errors that led to these disasters and proposes solutions to the problems.

Focus on the Wrong Problem

The article cites the 1978 United Airlines crash in which the pilot focused on the landing gear warning light and not the low fuel indicator.
Although the copilot warned of the fuel problem, he was not emphatic enough. The jet crashed not because of a faulty warning light but because the two pilots failed to act as an effective team; together, they stayed fixated on the imagined landing gear problem (Helmreich 1997).

Failed Communications

The second citation is the 1982 Air Florida crash in Washington, DC. Again the copilot was uncomfortable with the aircraft’s performance, in this case the airspeed indications, and again he failed to state his concern adequately. The jetliner crashed into the Potomac River. This crash pointed to the importance of collaboration over strict chain of command (Helmreich 1997).

Technology and Its Misuse in the Cockpit

One suggested technological solution is smart cockpit computers. When these devices were tried in flight simulators, the pilot would often concentrate on programming the computer rather than flying the aircraft in crowded airspace. The better solution was to turn off the computer and fly “stick and rudder” so that attention stayed on the airspace and other traffic (Helmreich 1997).

Conclusions

Crew Resource Management (CRM) resolves these issues to some extent. CRM dissects the cultural (societal and company), psychological, and social human factors in communication and decision making. A more collaborative style of crew management follows from this training, and all crew members can speak out on safety issues. One study showed that pilots at an Asian-based airline considered the chain of command to take precedence over safety. The net result is training teams rather than leaders and followers.
This same technique applies to emergency room teams, surgical teams, and first responders. The main concern in using this paradigm is communicating the facts of a situation to form a clearer picture.

References

Helmreich, Robert L. (1997, May). Managing Human Error in Aviation. Scientific American, 62-67.