Referencing around eighty books, 'Human Error' is itself cited today by more than a hundred works by authors such as A. Hale, John R. Wilson and E. J. Lovesay. The book is divided into three parts. The first part, comprising the first two chapters, introduces the fundamental ideas and research traditions and gives an account of the background studies pertaining to errors. The middle part, chapters three to five, elucidates the error mechanisms and the common causes behind a wide range of errors.
The final section, comprising the remaining chapters, throws light on the consequences of errors, ranging from their detection and the unintended contributions made to them to the curative steps taken against them. As is clear from its title, the first chapter of the book elucidates the nature of errors, draws a classification of the various types of errors and describes the strategies that lead to their detection. According to Reason, two basic processes, similarity-matching and frequency-bias, underlie the characteristic forms that errors take. The second chapter traces the significant studies done in the area of human error, which Reason classifies into the natural-science approach and the engineering approach according to their research practice.
These are the studies that form the foundation of Reason's Human Error.
Performance Levels and Error Types: In the third chapter, the author introduces the Generic Error-Modelling System (GEMS), which facilitates the identification of three error types, namely skill-based slips and lapses, rule-based mistakes and knowledge-based mistakes. The model is a combination of ideas on cognitive theory from Rasmussen, Rouse, Anderson, and Holland, Holyoak, Nisbett and Thagard.
Cognitive Underspecification and Error Forms: "When cognitive operations are underspecified, they tend to default to contextually appropriate, high-frequency responses." The fourth chapter centres on the topic of cognitive underspecification. The author also explains the two factors, similarity and frequency, that give errors their characteristic forms.
A Design for a Fallible Machine: The fifth chapter of the book proposes the concept of a fallible machine, an information-processing machine that would work correctly most of the time but would also produce errors at times. The purpose behind such a machine is to create a model that resembles the behaviour of humans and their tendency towards making errors.
The Detection of Errors: This chapter pertains to the concepts of error detection and correction. While discussing correction, the author explains two types of error-correction mechanisms, namely the low-level and the high-level correction mechanism, of which the former is held to work better than the latter.
Latent Errors and Systems Disasters: In this chapter the author distinguishes between types of errors in terms of their contribution to system accidents.
Here the two types of errors are Active Errors and Latent Errors.