
Investigative Bias | Air Transport Safety Articles by Shem Malmquist


The concept of bias on the part of the accident investigator is not something often considered; however, I think it is a much larger issue than it is usually given credit for.  It is virtually impossible to separate our biases from what we do.  Accident investigation is supposed to be a scientific process, but the truth is that few investigators have really been trained to use a true scientific process.  The prevailing view is that it is easier to train a pilot, engineer, or human factors expert to investigate than the other way round.

Even those who are trained and do attempt to follow a scientific process are not often successful.  Why is that?  Well, as important as it is to the scientific process, most people, including scientific researchers, are not elated to discover that they are wrong about their beliefs, and that includes some of the most esteemed researchers in history.  Richard Feynman realized that he had fallen into such a trap at one point and, once aware, broke out of it.  Lawrence Krauss stated that “…the most exciting thing about being a scientist is not knowing and being wrong. Because that means there is a lot left to learn.” (Krauss, 2011).  Is Krauss really capable of doing that?  If so, he is quite unique, even among scientists.  Talking the talk is a lot easier than walking the walk.

We are all subject to these biases, and it is far from clear that any activity can be isolated from them.  Your entire approach to reality is framed by bias, which is really a form of mental shortcut that allows us to make faster decisions.  Those decisions generally served us well on the savannah, but they do not always serve us well in modern life.  In fact, many current political debates have been inaccurately framed as a consequence of confirmation bias.  As outlined by Tversky and Kahneman (1974), confirmation bias leads people to exclude information that does not conform to their own mental models.  Quantitative research is not immune either.  Jeng (2006) described this as a factor even in physics, a field that is about as objective as one can get short of pure mathematics.

These biases extend to statistical analysis.  In recent years it has become more apparent that there are some real issues in the ways we analyze results, in that our statistical methodology itself may be flawed (Gill, 2008).  Traditional methods tend to reject results that show high variance even when there is an actual effect (Fenton & Neil, 2012, pp. 311-315).  They also reward low variance even when an effect may not actually be present.  This creates a confound in determining the outcome that is not generally considered in the research.  For example, is the lack of statistical support for simulator motion in training a result of motion really not being a factor, or is it a result of high variance in the data, which prevents rejection of the null hypothesis (Bürki-Cohen, Go, Chung, & Schroeder, 2004)?  Without further analysis, it is impossible to say, as it is equally possible the result is a consequence of flaws in the motion algorithms themselves or of simulator limitations.  Or perhaps there really is no statistically significant effect? (See also http://bayesianrisk.com/.)
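The variance confound described above can be illustrated with a small Monte Carlo sketch.  The numbers here are purely hypothetical, chosen only to show the mechanism: a real effect buried in noisy data is frequently missed by a conventional significance test, while the same effect in low-variance data is detected almost every time.

```python
import random
import statistics

random.seed(42)

def rejection_rate(effect, sd, n=20, trials=2000, crit=2.0):
    """Fraction of simulated experiments in which a simple two-sample
    t-style test rejects the null at roughly the 5% level (|t| > ~2)."""
    rejections = 0
    for _ in range(trials):
        control = [random.gauss(0.0, sd) for _ in range(n)]
        treated = [random.gauss(effect, sd) for _ in range(n)]
        # Pooled standard error of the difference in means.
        se = ((statistics.variance(control) + statistics.variance(treated)) / n) ** 0.5
        t = (statistics.mean(treated) - statistics.mean(control)) / se
        if abs(t) > crit:
            rejections += 1
    return rejections / trials

# Same true effect (0.5 units) in both cases; only the noise differs.
noisy = rejection_rate(effect=0.5, sd=2.0)   # high variance: effect often missed
clean = rejection_rate(effect=0.5, sd=0.5)   # low variance: effect almost always found
print(f"high-variance detection rate: {noisy:.2f}")
print(f"low-variance detection rate:  {clean:.2f}")
```

A failure to reject the null in the noisy case says nothing about whether the effect exists; it may only reflect the variance of the measurements, which is exactly the ambiguity in the simulator-motion studies.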

These factors have profound implications as to how we do our job.  What can be done to mitigate these issues?  That is the subject of a future article.

References:

Bürki-Cohen, J., Go, T.H., Chung, W.W., and Schroeder, J.A. (2004). Simulator Platform Motion Requirements for Recurrent Airline Pilot Training and Evaluation, Final Report. Cambridge:  Human Factors Research and Engineering Division, John A. Volpe National Transportation Center.

Fenton, N., & Neil, M. (2012). Risk assessment and decision analysis with Bayesian networks. Boca Raton: CRC Press.

Gill, J. (2008). Bayesian methods: A social and behavioral sciences approach (2nd ed.). Boca Raton: CRC Press.

Jeng, M. (2006). A selected history of expectation bias in physics. American Journal of Physics, 74, 578.

Krauss, L. (2011).  Cosmic Connections. Retrieved from http://www.youtube.com/watch?v=FjAqcV_w3mc

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124-1131.

===========================================


About Shem Malmquist FRAeS

B-777 Captain. Air Safety and Accident Investigator. Previous experience includes Flight Operations management, Assistant Chief Pilot, Line Check Airman, ALPA Aircraft Technical and Engineering Chairman, Aircraft Performance and Designs Committee MEC Chair, Charting and Instrument Procedures Committee, Group Leader of the Commercial Aviation Safety Team (CAST) Joint Safety Implementation Team on Loss of Control, Human Factors and Automation, and CAST-JSIT Aircraft State Awareness. Fellow of the Royal Aeronautical Society; full Member of ISASI, AIAA, IEEE, HFES, FSF, AFA, and the Resilience Engineering Association. I am available for consulting, speaking, or providing training seminars to your organization. Please contact me at https://malmquistsafety.com/ for inquiries.
