Human error is often cited as a cause of accidents when all other factors have been eliminated. This does not mean that human error cannot be investigated scientifically; in fact, there is today considerable research interest in human error. The aim of this article is to describe human errors and their relationship with occupational accidents.
The aim of this chapter is to define what is meant by “human error” and to compare the traditional and modern views of human error.
It is very difficult to provide a satisfactory definition of human errors, as they are often the result of a complicated sequence of events and are therefore an elusive phenomenon to analyse. However, Reason has defined “human error” in the following way: "Error will be taken as a generic term to encompass all those occasions in which a planned sequence of mental or physical activities fails to achieve its intended outcome, and when these failures cannot be attributed to the intervention of some chance agency." On the other hand, it has been said that to err (i.e. to make mistakes) is human. Human error is an element that cannot be totally eliminated, but if the typical errors are identified, most of them can also be prevented.
According to the traditional viewpoint, human error is a cause of failure and accident. According to a newer philosophical approach, human error is a symptom of failure, reflecting deeper problems existing in a system. Examining human error provides information that delves beneath the simplistic label of 'human error'. Human error is an attribution after the fact, and it is systematically related to people, tools, tasks, and the operating environment.
Although there is no unanimous definition of human error, the general thinking has shifted from attributing guilt to an individual towards a much broader, contextual approach.
One classification distinguishes 'action errors' (action not as planned), which can be further categorised as 'slips' or 'lapses', from 'thinking errors' (action as planned), classified as 'mistakes'. The inadvertent nature of such errors sets them apart from deliberate actions (known as 'violations'), in which an individual wilfully and knowingly adopts an incorrect course of action.
The aim of this chapter is to describe how to identify human error. First, the "Swiss Cheese" model will be presented. Subsequently different methods which can be used to identify the causes for human errors will be examined.
Accidents are rare
In the well-known "Swiss cheese" model, Reason suggested that a system has several successive defences preventing accidents. In an ideal world, each defensive layer would be intact. In reality, however, they are more like slices of Swiss cheese, with many holes. These holes are continually opening, closing, and shifting their location. An accident happens when the holes in many layers momentarily line up to permit a trajectory of accident opportunity. The main message of the "Swiss cheese" model is that the chance of a hazard finding the holes lined up in all of the defences at any one time is very small, which is why accidents are rather rare.
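The arithmetic intuition behind the model can be sketched in a few lines of Python. This is an illustration, not something from the source: the layer count and hole probabilities are invented, and real defensive layers are rarely fully independent. Assuming independence, the probability that every defence has an open hole at the same moment is the product of the individual probabilities, which shrinks rapidly as layers are added.

```python
from math import prod

def accident_probability(hole_probabilities):
    """Probability that all defensive layers fail at the same moment,
    assuming each layer's 'hole' opens independently of the others."""
    return prod(hole_probabilities)

# Five hypothetical layers, each with a 10% chance of an open hole:
p = accident_probability([0.1] * 5)
print(p)  # about 1e-05, far smaller than any single layer's 10%
```

Even with fairly leaky individual defences, the joint failure probability is orders of magnitude below any single layer's, which is the model's explanation for why accidents are rare.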
In a Swedish study, ten professional accident investigators were interviewed. They listed eight different meanings for human factors and concluded that there is no single professional definition of the human factor. The study concluded that the meanings of human factor 1) always evolve in the dynamic process of producing and understanding language, 2) are context-dependent, and 3) emerge through talk, as one type of discourse . The same observations also apply to the concept of human error.
The aim of the Cognitive Failures Questionnaire (CFQ) is to measure self-reported failures in perception, memory, and motor function . The scale was administered to 240 electrical workers in the United States Army. The CFQ predicted both car accidents and work accidents. When foremen were asked to assess the workplace safety performance of 158 workers, the assessments of foremen and employees corresponded closely (r = .79).
Based on the Cognitive Failures Questionnaire, Wallace and Chen  developed the Workplace Cognitive Failure Scale with 22 items such as "Cannot remember whether or not you have turned off work equipment?" Using this scale, the researchers showed that general cognitive failure predicted unsafe behaviours and micro-accidents among American workers. Later, with a smaller sample, the same scale predicted supervisor safety ratings, injuries and missed work days.
The process of cognitive failure was also studied in British consumers. Typically, shoppers forgot to buy an item and therefore had to return to the shop. The second most common mistake was forgetting the shopping list at home. Older consumers reported fewer errors than their younger counterparts , perhaps because age confers experience in handling shopping and devising practical methods for avoiding past mistakes.
The same researchers  also examined the tip-of-the-tongue phenomenon by analysing diaries that volunteers kept for four weeks. The volunteers recorded 75 tip-of-the-tongue experiences, an average of 2.5 per diarist. There were no gender differences in experiencing the tip-of-the-tongue state. In one out of three cases, the elusive word concerned a person familiar to the speaker.
These studies revealed different methods for measuring cognitive failures, even in everyday situations. They also indicated that cognitive failures and processes were related to injuries and human errors.
The aim of this chapter is to examine factors that have an effect on human errors. The analysis is based on Rasmussen's SRK (Skill – Rule – Knowledge) model:
- Skill-based behaviour represents automatic sensorimotor performance without conscious control. Work performance is based on subroutines which are subject to higher-level control.
- Rule-based behaviour happens in a familiar work situation, where a consciously controlled stored rule is applied. Performance is goal-oriented, but structured by feed-forward control through a stored rule.
- Knowledge-based behaviour happens in unfamiliar situations, where a goal is explicitly formulated, based on an analysis of the environment and the overall aims of the person. The means must be found and selected according to the requirements of the situation.
In a study of British drivers, errors were defined as the failure of planned actions to achieve their intended consequences. Women drivers were more prone to harmless lapses, whereas male drivers reported more violations. The number of violations declined with age, but the number of errors did not decrease.
In a Serbian electric power company, human errors were analysed using Absolute Probability Judgement, which is based on the assumption that people can directly assess the likelihood of a human error. The human errors with the highest estimated probability were failure to use prescribed tools and absence of job authorization. In an analysis of 500 reported pipework incidents at a British chemical plant, 41% of the immediate causes of incidents were of human origin and 31% were operating errors.
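As a hedged sketch of how Absolute Probability Judgement estimates can be pooled: several assessors each state a probability directly, and one common way of combining them in human reliability analysis is the geometric mean. The error description and all figures below are invented for illustration; the source does not give the Serbian study's raw numbers.

```python
from math import prod

def pooled_estimate(estimates):
    """Geometric mean of individual expert probability estimates,
    a common pooling rule for direct probability judgements."""
    return prod(estimates) ** (1 / len(estimates))

# Four hypothetical assessors judge the probability of
# "failure to use prescribed tools" on a given job:
experts = [0.02, 0.05, 0.03, 0.04]
print(round(pooled_estimate(experts), 4))  # prints 0.0331
```

The geometric mean is often preferred over the arithmetic mean here because probability judgements tend to spread over orders of magnitude, and it damps the influence of a single outlying estimate.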
Hospitals are another work environment where human errors can have fatal consequences. In the cardiology ward of a Japanese hospital, 181 accidental and incidental events were reported during a six-month period. A total of 40 of the reported events were classified as skill-based errors, 52 as rule-based errors, and seven as knowledge-based errors. Twelve of the errors were life-threatening. Adverse drug events accounted for about 25% of human errors in hospitals. Most of the errors were made by doctors and nurses; in fact, only 3-5% were due to equipment.
Air traffic is a safety-critical industry in which the effect of human error has to be examined thoroughly. The majority of commercial aviation accidents in the United States were due to pilot error; of these, over half were skill-based errors, over one third were decision errors, fewer than one in ten were perceptual errors, and the final group comprised violations of regulations.
Aircraft mechanics in Australia reported 666 human errors, of which 65% were skill-based errors, 32% rule-based errors, and 3% knowledge-based errors. Based on the incident reports, the researchers assessed that the reporting of skill-based errors was more reliable than the reporting of rule- and knowledge-based errors. Subsequently, they examined a larger data set and found that only skill-based errors were related to occupational accidents. In addition, they reported that memory lapses, rule violations and knowledge-based mistakes were the most commonly identified human errors made by aircraft mechanics.
Skill-based errors were the most common unsafe acts encountered in Australian mines. Inadvertent or missed operations were the most common types of skill-based errors. These errors were typically the result of a breakdown in visual monitoring or the inadvertent activation of a control.
Rasmussen's SRK model can help to identify the reasons for human error in more detail than the traditional, general concept of "human error".
In a Japanese train company, drivers who made errors were required to participate in a mandatory training class. In order to avoid this “penalty” – a loss of face – the drivers did not report any mistakes. This practice led to over 100 fatalities in commuter train accidents. Thus, this organizational measure criminalizing drivers who had made a human error (by forcing them to participate in a training class) resulted in even more fatalities.
A similar effect is to be expected in relation to the Zero Accidents Vision adopted by some employers. While the will to prevent every accident is commendable, excessive pressure, conscious or not, can induce employees and/or middle management not to report certain accidents in order to avoid direct or indirect sanctions. This can leave the causes of accidents untreated, which can later result in more serious consequences.
The REVIEW method consisted of 16 measures of organizational health, such as staff attitudes, departmental communication and training. For example, carelessness and inadequate training can increase the risk of human error. The method helped to identify latent failures made by top management and line management that lead to human error and accidents. The checklist was sent to Australian train drivers, and three problem factors were found: staff attitudes, maintenance and operating equipment.
To summarize, some organizational factors can influence employees' behaviour in ways that make errors more likely. Penalizing “human error” usually leads to hiding or denying that mistakes ever happened.
In everyday life, it is generally believed that human errors can cause injuries. This is confirmed by empirical studies.
It is generally accepted that 80-90% of accidents are due to human error. For example, approximately 70% of aircraft accidents have been attributed to human error. In a Finnish study, human errors were involved in 84% of serious accidents and in 94% of fatal accidents.
Of the fatal occupational accidents that occurred in Australia, two out of three were due to skill-based errors, one fifth to rule-based errors and the remaining fifth to knowledge-based errors. Equipment work practices were fairly clearly related to rule-based errors, personal protective equipment to skill-based errors, and unsafe management procedures to knowledge-based errors. In fatal accidents on British construction sites, skill-based and knowledge-based errors each caused nine fatalities, whereas only three fatalities were due to rule-based errors.
In a recent Mexican study, safety experts documented 70 human factors causing hand injuries. These factors were classified as personal factors, human error, unsafe conditions, and organizational factors. The most frequent types classified as human error were improper handling of heavy objects, attempts to save time in carrying out operations, and failure to respect safety rules and procedures. The study did not contribute significantly to knowledge of human error, but it did highlight the current interest in "human error".
It is usually thought that errors are invariably negative and always to be avoided. The opposite approach is to provide training that allows for errors. When German typists were taught to use computers, the subjects in the error-allowing training group wrote fewer words and spent more time correcting them than the subjects in the error-avoidance training programme. However, the typists in the error-allowing group coped with a difficult task better than the control group.
These studies reveal that human error makes a significant contribution to occupational injuries. Thus prevention of human error is also one way to prevent occupational injuries.
In the prevention of human errors, only a few practical, everyday means available to individual workers have been studied: 1) drinking coffee helps to maintain vigilance, and 2) stress can increase the probability of errors, so reducing stress is another way to prevent accidents.
A Cochrane systematic review based on 17 studies showed that the intake of caffeine can prevent human errors. Caffeine improves concept formation, reasoning, memory, orientation, attention and perception. Drinking coffee after a nap significantly decreased human errors among shift workers. On the other hand, the greatest reduction in human errors was achieved when accident information was presented in a way (for example, using Rasmussen's SRK model) that corresponded to employees' way of thinking.
A study of British Royal Navy personnel showed that highly stressed employees were more likely to suffer an accident in the workplace because they had a propensity for cognitive failures. Since stress is a major source of human error, reducing stress is one way to reduce human errors. Hurried working increases stress and accidents; thus, slowing down the pace of work in the workplace is another way to reduce human errors.
Human error in the workplace is a common phenomenon that can cause disturbances and accidents at work. Although there is no guaranteed method for preventing human errors, avoiding stress and remaining focused by drinking coffee are the most often used practical, everyday methods available to everyone.
As defined at the beginning, human errors are typically the result of long chains of events, and preventing human error in workplaces requires different types of preventive action: skills and safety awareness at the individual level regarding the risk factors for human errors; safety awareness and leadership provided by organizations (managers and supervisors recognizing the risk factors for human errors); and appropriate technical resources (safe design and solutions not requiring active human engagement, such as handrails and light curtains) that are both available on the market and reasonably priced, so that companies can afford the investment.
 Dekker, S., 'The criminalization of human error in aviation and healthcare: A review', Safety Science, Vol. 49, 2011, pp. 121-127.
 Rasmussen, J., Information processing and human-machine interaction. North-Holland, New York, 1986.
 Reason, J., Human error. Cambridge University Press, Cambridge, 1990.
 Dekker, S. W. A., 'Reconstructing human contributions to accidents: the new view on error and performance', Journal of Safety Research, Vol. 33, 2002, pp. 371-385.
 Woods, D. D., Dekker, S., Cook, R., Johannesen, L. & Sarter, N., Behind human error. Ashgate, Farnham, UK, 2010.
 HSE: Human failure types. Available at: http://www.hse.gov.uk/humanfactors/topics/types.pdf
 Reason, J., 'Human error: models and management', British Medical Journal, Vol. 320, 2000, pp. 768-770.
 Korolija, N. & Lundberg, J., 'Speaking of human factors: Emergent meanings in interviews with professional accident investigators', Safety Science, Vol. 48, 2010, pp. 157-165.
 Broadbent, D. E., Cooper, P. F., FitzGerald, P. & Parkes, K. R., 'The Cognitive Failures Questionnaire (CFQ) and its correlates', British Journal of Clinical Psychology, Vol. 21, 1982, pp. 1-16.
 Wallace, J. C. & Vodanovich, S. J., 'Can accidents and industrial mishaps be predicted? Further investigation into the relationship between cognitive failure and reports of accidents', Journal of Business and Psychology, Vol. 17, 2003, pp. 503-514.
 Wallace, J. C. & Chen, G., 'Development and validation of a work-specific measure of cognitive failure: Implications for occupational safety', Journal of Occupational and Organizational Psychology, Vol. 78, 2005, pp. 615-632.
 Reason, J. & Lucas, D., 'Absent-mindedness in shops: Its incidence, correlates and consequences', British Journal of Clinical Psychology, Vol. 23, 1984, pp. 121-131.
 Reason, J. & Lucas, D., 'Using cognitive diaries to investigate naturally occurring memory blocks', In: J. E. Harris & P. E. Morris (Eds.), Everyday memory actions and absent-mindedness. Academic Press, London, 1984. pp. 53-70.
 Reason, J., Manstead, A., Stradling, S., Baxter, J. & Campbell, K., 'Errors and violations on the roads: a real distinction?', Ergonomics, Vol. 33, 1990, pp. 1315-1332.
 Stojiljkovic, E., Grozdanovic, M. & Stojiljkovic, P., 'Human error assessment in electric power company of Serbia', Work, Vol. 41, 2012, pp. 3207-3212.
 Hurst, N. W., Bellamy, L. J., Geyer, T. A. W. & Astley, J. A., 'A classification scheme for pipework failures to include human and sociotechnical errors and their contribution to pipework failure frequencies', Journal of Hazardous Materials, Vol. 26, 1991, pp. 159-186.
 Narumi, J., Miyazawa, S., Miyata, H., Suzuki, A., Kohsaka, S. & Kosugi, H., 'Analysis of human error in nursing care', Accident Analysis and Prevention, Vol. 31, 1999, pp. 625-629.
 Spencer, F. C., 'Human error in hospitals and industrial accidents: Current concepts', Journal of the American College of Surgeons, Vol. 191, 2000, pp. 410-418.
 Gaba, D. M., 'Human error in anesthetic mishaps', International Anesthesiology Clinics, Vol. 27, 1989, pp. 137-147.
 Shappell, S., Detwiler, C., Holcomb, K., Hackworth, C., Boquet, A. & Wiegmann, D. A., 'Human error and commercial aviation accidents: An analysis using the Human Factors Analysis and Classification System', Human Factors, Vol. 49, 2007, pp. 227-242.
 Hobbs, A. & Williamson, A., 'Skills, rules and knowledge in aircraft maintenance: errors in context', Ergonomics, Vol. 45, 2002, pp. 290-308.
 Hobbs, A. & Williamson, A., 'Unsafe acts and unsafe outcomes in aircraft maintenance', Ergonomics, Vol. 45, 2002, pp. 866-882.
 Hobbs, A. & Williamson, A., 'Associations between errors and contributing factors in aircraft maintenance', Human Factors, Vol. 45, 2003, pp. 186-201.
 Patterson, J. M. & Shappell, S. A., 'Operator error and system deficiencies: Analysis of 508 mining incidents and accidents from Queensland, Australia using HFACS', Accident Analysis & Prevention, Vol. 42, 2010, pp. 1379-1385.
 Chikudate, N., 'If human errors are assumed as crimes in a safety culture: A lifeworld analysis of a rail crash', Human Relations, Vol. 62, 2009, pp. 1267-1287.
 Reason, J., 'A systems approach to organizational error', Ergonomics, Vol. 38, 1995, pp. 1708-1721.
 Edkins, G. D. & Pollock, C. M., 'Pro-active safety management: Application and evaluation within a rail context', Safety Science, Vol. 24, 1996, pp. 83-93.
 Hale, A. R. & Glendon, A. I., Individual behaviour in the control of danger. Elsevier, Amsterdam, 1987.
 Feggetter, A. J., 'A method for investigating human factor aspects of aircraft accidents and incidents', Ergonomics, Vol. 25, 1982, pp. 1065-1075.
 Salminen, S. & Tallberg, T., 'Human errors in fatal and serious occupational accidents in Finland', Ergonomics, Vol. 39, 1996, pp. 980-988.
 Feyer, A.-M., Williamson, A. M. & Cairns, D. R., 'The involvement of human behaviour in occupational accidents: Errors in context', Safety Science, Vol. 25, 1997, pp. 55-65.
 Hale, A., Walker, D., Walters, N. & Bolt, H., 'Developing the understanding of underlying causes of construction fatal accidents', Safety Science, Vol. 50, 2012, pp. 2020-2027.
 Reyes-Martinez, R. M., Maldonado-Macias, A. & Prado-León, L. R., 'Human factors identification and classification related to accidents' causality on hand injuries in the manufacturing industry', Work, Vol. 41, 2012, pp. 3155-3163.
 Frese, M., Brodbeck, F., Heinbokel, T., Mooser, C., Schleiffenbaum, E. & Thiemann, P., 'Errors in training computer skills: On the positive function of errors', Human-Computer Interaction, Vol. 6, 1991, pp. 77-93.
 Ker, K., Edwards, P. J., Felix, L. M., Blackhall, K. & Roberts, I., 'Caffeine for the prevention of injuries and errors in shift workers', Cochrane Database of Systematic Reviews, Issue 5, 2010.
 Sanderson, P. M. & Harwood, K., 'The skills, rules and knowledge classification: a discussion of its emergence and nature', In: L. P. Goodstein, H. B. Andersen & S. E. Olsen (Eds.), Tasks, errors and mental models. Taylor & Francis, London, 1988. pp. 21-34.
 Day, A. J., Brasher, K. & Bridger, R. S., 'Accident proneness revisited: The role of psychological stress and cognitive failure', Accident Analysis and Prevention, Vol. 49, 2012, pp. 532-535.
 Guidelines for prevention of human error aboard ships – Through the ergonomic design of marine machinery systems, Nippon Kaiji Kyokai, Japan, 2010. Available at: https://www.dieselduck.info/machine/06%20safety/2010%20Class%20NK%20guidelines%20prevention%20human%20error.pdf