14
Human Factors


Lorna Harron


Harron Risk Management Ltd., Edmonton, Alberta, Canada


14.1 Introduction


The human component of the activities being performed is as important as the technical components of a design, construction, operation, and maintenance program. According to the International Labor Organization Encyclopedia of Occupational Health and Safety [1], a study in the 1980s in Australia identified that at least 90% of all work-related fatalities over a 3-year period involved a behavioral factor. When an incident investigation is performed, human factors are often lumped into the category of “human error” as either the immediate cause or a contributory cause. While this acknowledges the human as a contributor to an undesirable event, it does not provide sufficient detail to facilitate reduction of the potential for recurrence of the human error. When up to 90% of incidents are related to a human factor, it is important to understand human factors and how to reduce the potential for human error in our business. This chapter is designed to provide the reader with a better understanding of human factors and how to manage the potential for human error in their organization.


14.2 What Is “Human Factors”?


“Human factors” is the study of factors and tools that facilitate enhanced performance, increased safety, and/or increased user satisfaction [2]. These three end goals are attained through the use of equipment design changes, task design, environmental design, and training. “Human factors” is therefore goal oriented, focused on achieving a result rather than following a specific process. When studying human factors, the end product focuses on system design, accounting for those factors, psychological and physical, that are properties of the human element. “Human factors” centers the design process on the user to find a system design that supports the user’s needs rather than making the user fit the system.


Other technical fields interact with or are related to human factors, such as cognitive engineering and ergonomics. Cognitive engineering focuses on cognition (e.g., attention, memory, and decision making) and concerns how a human or machine thinks or gains knowledge about a system. Ergonomics is related to human factors, focusing on the interaction of the human body with a workspace or task.


It is well understood in industry that improvements at the design stage of a project cost significantly less than improvements postconstruction, so management of human factors at the design stage of a project will provide the greatest value (Table 14.1).


14.3 The Human in the System


Designing for the human, known as user-centered design, permits the creation of systems that work with the human in mind. Rather than fitting the human into a system, a system is designed based on how the human will use it.


When considering implementation of a new technology, and how a human will use the technology, human–system integration (HSI) is a key component of the user-centered design. HSI is a system design approach intended to ensure that human characteristics and limitations are considered, with an emphasis on selection of the technology, training, how the human will use the system in operation, and safety [3].


A methodology for defining the system, so that interactions between the human and the system can be defined, is the SHEL model (Figure 14.1), a human factors framework that divides human factors issues into four components: software, hardware, environment, and liveware (people) [4]. Software refers to the processes and procedures used in the system. Hardware refers to the physical hardware, such as controls, displays, and computers used within the system. Environment refers to the natural, social, and economic environment in which the system is used, including culture. Liveware refers to the person(s) who work and interact within the system. In using the SHEL model, the interactions among the four components are considered, with a focus on interactions that affect human performance within the system.
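To make the interfaces concrete, the sketch below (a minimal illustration in Python; the data structure and descriptions are paraphrased from this section, not taken from the SHEL literature) enumerates the liveware-centered interfaces that a SHEL review would walk through.

```python
# The four SHEL components; this dictionary is an illustrative representation.
components = {
    "S": "software (procedures and processes)",
    "H": "hardware (controls, displays, computers)",
    "E": "environment (natural, social, economic, cultural)",
    "L": "liveware (people in the system)",
}

# A SHEL review focuses on the interfaces that involve the human (liveware),
# so enumerate the L-S, L-H, L-E, and L-L interaction pairs for assessment.
interfaces = [("L", c) for c in ("S", "H", "E", "L")]
for a, b in interfaces:
    print(f"Review {a}-{b} interface: {components[a]} <-> {components[b]}")
```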


Table 14.1 Benefits of Incorporating Human Factors at the Design Stage







  1. Increased sales
  2. Decreased development costs
  3. Decreased maintenance costs
  4. Decreased training cost
  5. Increased productivity
  6. Fewer errors by human, machine, or human–machine interface
  7. Increased employee satisfaction
  8. Decrease in number of incidents
  9. Decrease in medical expenses and number of sick days taken by employees

14.4 Life Cycle Approach to Human Factors


In the pipeline industry, human factors can create the potential for a human error at many points along the life cycle of a pipeline. At the Banff/2013 Pipeline Workshop, a tutorial on human factors was conducted using a life cycle analysis to identify where there is a potential for human error. Figure 14.2 illustrates the results of this discussion.


Using a life cycle approach to manage human factors provides an organization with the capability to integrate human factors into programs, standards, procedures, and processes using a disciplined approach. To use the human factors life cycle, organizations may wish to assess themselves against the areas identified for the presence of programs, standards, procedures, and processes to manage human error potential.


An examination of the potential for human error using a life cycle approach also highlights the interrelationship between human factors management and management systems. Common management system elements, such as change management and documentation, are reflected in each stage of the human factors life cycle.


Opportunities exist to reduce the potential for human error in design, construction, operation, maintenance, and decommissioning through the use of robust quality control processes, such as self-assessments. Incorporation of human factors assessment techniques such as job hazard analysis, job task analysis, human factors failure modes and effects analysis, and workload analysis can help operators understand their current state and potential gaps from a desired state. Specialists in human factors can be engaged to assess and mitigate any/all of the areas identified in the life cycle provided in Figure 14.2.


To illustrate this, an example case study is provided below.


14.4.1 Example Case Study


XYZ Pipeline Company has a pipeline integrity program that includes threat mitigation for corrosion, cracking, deformation and strain, third-party damage, and incorrect operations. XYZ Pipeline Company has decided to perform a gap analysis for the integration of human factors using the life cycle approach. In the “maintain” category of the life cycle, the company is confident that they use the best available technology to monitor and assess the integrity condition of their pipelines. They use Vendor A, a leader in inspection technology in the industry, to perform pipeline inspections and to provide detailed analyses of the results of these inspections.

Using the life cycle approach, XYZ Pipeline Company evaluates how they have incorporated human factors into the identified areas. First, they ask whether there is a potential for human error in the determination of inspection frequency. The process used to determine inspection frequency involves a review by at least three senior engineering professionals, and it is determined that the potential for an error at this point is minimized through the existing process.

The next question asked is whether there is potential for human error in the selection of the inspection tool used to evaluate the threat. The existing process relies on the expertise and experience of integrity management engineers at the company. While today these individuals have over 20 years of industry experience, some are preparing to retire in the next 2–5 years. This risk is highlighted as a potential gap, and action planning identifies the need to create a guideline for tool selection. During the discussions of this topic, it is also identified that the technology is changing rapidly and that understanding tool capability requires constant monitoring of the inspection market. Another action is then identified to create a work task for a senior engineer to monitor the inspection market at least annually to ensure the guideline is current.

Next, they question the potential for an error in the inspection report on which they base integrity decisions. This discussion highlights several potential gaps, including a lack of quality control on the vendor reports received by XYZ Pipeline Company, a lack of understanding of the experience and expertise of the people performing the defect analysis within the tool inspection company, the lack of expertise in XYZ Pipeline Company in reading and understanding defect signals for all the inspection tools, and the lack of processes between the time the tool inspection is performed and the mitigation plan is prepared. Action planning for these gaps required creation of a small task force to focus on mitigating actions to manage these gaps.

At this point, XYZ Pipeline Company decided that this was as far as they could progress in a single year, so a multiyear assessment protocol was developed to review the integrity management program over a 5-year term to assure integration and management of human factors through the life cycle of their pipelines.

[Figure: four interlocking puzzle pieces labeled Hardware (H: computers, displays, and functional systems), Software (S: rules, procedures, and processes), Liveware (L: humans within the system), and Environment (E: natural operating environment, social and economic climate).]

Figure 14.1 SHEL model.


(Adapted from ICAO Doc 9859 Safety Management Manual (2012).)

[Figure: human factors life cycle chart outlining potential areas for human error across five stages (design, build, operate, maintain, decommission), each listing specific considerations, with common elements such as change and communication management shown below the stages.]

Figure 14.2 Human factors life cycle, Banff/2013 pipeline workshop.


14.5 Human Factors and Decision Making


Human factors impact a pipeline system when a nonoptimal decision has been made. Many factors affect how information is processed and how subsequent decisions are made.


Human factors impact how we process information for decision making. The steps involved in decision making include receipt of information, processing of information, and deciding on a course of action (the decision). Receipt of information focuses on information cues, while processing of information focuses on memory and cognition [2]. Deciding on a course of action involves heuristics (simplifying decision-making tools) and biases.


14.5.1 Information Receipt


Information is received into the body as cues. These cues may be visual, such as printed words in a procedures document, or auditory, such as speech or alarms. Higher-quality cues increase the probability that the information will be received into awareness and eventually acted upon. Cues can be enhanced through targeting, which involves making a cue conspicuous and highly visible while maintaining the norms for a cue. For example, if the cue for an operator to perform an action is a light turning on, then from a human factors perspective a few simple steps can ensure that the appropriate action is clear to the operator. First, an evaluation of what action the cue is supposed to elicit is required. If the light indicates an upset condition and the resulting action is to turn off a device, then the color anticipated by the operator would be red. If the light indicates a desired state and the action is to start up a piece of equipment, then the color anticipated by the operator would be green. If the colors used are not consistent with the expectations of the operator, then the likelihood of a human error, such as an incorrect reaction to the cue, is heightened.
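As a minimal sketch of how such a consistency check could be automated, the example below flags display cues whose colors conflict with the norms just described. The cue names, data structure, and color mapping are hypothetical illustrations, not from the source.

```python
# Expected cue color by the action the cue should elicit, following the norms
# described above: red for an upset/stop condition, green for a desired/start
# condition. This mapping is a hypothetical illustration.
EXPECTED_COLOR = {"stop_device": "red", "start_device": "green"}

cues = [
    {"name": "pump_trip_light", "action": "stop_device", "color": "red"},
    {"name": "ready_light", "action": "start_device", "color": "amber"},
]

for cue in cues:
    expected = EXPECTED_COLOR[cue["action"]]
    if cue["color"] != expected:
        # A mismatched color violates operator expectancy and raises the
        # likelihood of an incorrect reaction to the cue.
        print(f"{cue['name']}: color '{cue['color']}' conflicts with the "
              f"expected '{expected}'")
```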


An example of expectancy is what we anticipate we will see at a traffic light. If we see the traffic light on the left-hand side of Figure 14.3, this meets our expectation of what a traffic light should look like, considered as meeting our norms. The traffic light on the right-hand side of the figure, however, does not meet our expectations and norms and would therefore be likely to contribute to a human error and subsequent traffic accident.


14.5.2 Information Processing


The manner in which we process information is impacted through our memory systems. Prior knowledge is stored in long-term memory and information is manipulated in working memory.

[Figure: two side-by-side traffic light diagrams, each with three circles.]

Figure 14.3 Expectancy.


Working memory is like a shelf of products that are available for a short period of time but that require a lot of effort to maintain. Information processing depends on cognitive resources, a limited pool of attention or mental effort available for a short period of time that can be allocated to processes such as rehearsing, planning, understanding, visualizing, decision making, and problem solving. How much effort we put into working memory is based on the value we place on the information. If we put effort into only a few areas, the result is “selective attention.” Effort may be influenced by fatigue, where the effort to perform a task is perceived to be greater than the anticipated value. Working memory has some limitations based on its transitory nature and the limited amount of cognitive resources. Working memory is limited to 7 ± 2 chunks of information [5], where a chunk of information may be grouped information such as a phone number, area code, or PIN for your computer. Working memory is also limited by time, decaying as time progresses. Another limitation of working memory is confusion of similar pieces or chunks of information, causing erroneous information to be processed. In an operating environment, the amount of activity can impact the ability to process information in working memory effectively. Some tips for managing working memory are listed in Table 14.2.
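As a small illustration of exploiting chunking, the sketch below regroups a long digit string into chunks sized for working memory; the chunk size follows the three-to-four-character guidance in Table 14.2, and the tag number is invented.

```python
def chunk(digits: str, size: int = 4) -> str:
    """Regroup a long string into small chunks to suit working memory limits."""
    return " ".join(digits[i:i + size] for i in range(0, len(digits), size))

# A 12-digit tag number is far easier to hold as three 4-digit chunks
# than as a single 12-digit string.
print(chunk("480159263748"))  # -> "4801 5926 3748"
```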


Long-term memory is a way to store information for retrieval at a later time. When we learn something, we are processing stored information into long-term memory. Training is specific instruction or procedures designed to facilitate learning. Long-term memory is distinguished by whether it involves memory for general knowledge (semantic memory) or memory for specific events (event memory) [6].


Material in long-term memory has two important features that determine the ease of later retrieval: its strength and its associations. Strength is determined by the frequency and recency of its use. For example, a PIN or password used frequently will have strength, and if the PIN or password is associated with something relevant to the individual, it will also have an association and therefore be easier to recall.


Table 14.2 Tips to Manage Working Memory







  1. Minimize working memory load by using written information rather than verbal
  2. Provide visual echoes/redundancy
  3. Provide placeholders for sequential tasks
  4. Exploit chunking:

    • keep chunks to three or four letters or numbers each
    • use meaningful sequences
    • prefer letters over numbers
    • separate numbers from letters

  5. Minimize confusability and avoid negation (“do not” phrasing)
  6. Consider working memory limits in instructions

Associations occur when items retrieved in long-term memory are linked or associated with other items—if associations are not repeated, the strength of association weakens over time. Tips for managing long-term memory are listed in Table 14.3.
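Strength as a function of frequency and recency can be given a concrete, if simplified, form. The sketch below borrows an ACT-R-style base-level activation equation from the cognitive modeling literature; the chapter does not prescribe a formula, so treat this purely as an illustrative assumption.

```python
import math

def memory_strength(hours_since_each_use, decay=0.5):
    """ACT-R-style base-level activation: strength grows with the frequency
    and recency of use. The decay parameter is a conventional default."""
    return math.log(sum(t ** -decay for t in hours_since_each_use))

# A password used daily (frequent, recent uses) versus one used twice, long ago.
frequent = memory_strength([1, 25, 49, 73, 97])
rare = memory_strength([2000, 4000])
print(f"frequent use: {frequent:.2f}, rare use: {rare:.2f}")  # higher = stronger
```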


Metaknowledge or metacognition refers to people’s knowledge or understanding of their own knowledge and abilities. It impacts the actions people choose to perform, the information or knowledge they choose to communicate, and their judgments of whether additional information is required to make a decision. Knowledge can be based on perceptual information and/or memory. The accuracy of perceptual information may be greater, but using it may require greater effort than using information stored in memory. According to resource theory [7], we make decisions under limited time with limited information. Our scarce mental resources are shared by various tasks, and more difficult tasks require greater resources. Performance on some concurrent tasks may therefore decline (see Table 14.4).


Table 14.3 Tips to Manage Long-term Memory







  1. Encourage regular use of information to increase frequency and recency
  2. Encourage active verbalization or reproduction of information that is to be recalled
  3. Standardize
  4. Use memory aids
  5. Carefully design information to be remembered
  6. Design to support development of correct mental models

Table 14.4 Ways to Address a Multitasking Environment


Method | Results/impact
1) Task redesign | Reduce working memory limitations
2) Interface redesign | Direct selective attention (e.g., use voice for tasks when eyes are required for a different task)
3) Training | Develop automaticity through repeated and consistent practice; improve resource allocation through training in attention management
4) Automation | Limit resource demands by reducing the number of tasks; direct selective attention

14.5.2.1 Stressors


Information processing is influenced by stressors that are outside of the information itself. These stressors may be physical, environmental, or psychological, such as life stressors and work or time stressors [2]. Examples of common stressors include noise, vibration, heat, lighting, anxiety, and fatigue.


The effect of stressors may be physical and/or psychological. Some stressors produce a physiological reaction, such as raised heart rate, nausea, or headache, while others may produce a psychological response such as mood swings. The duration and intensity of a stressor may lead to long-term negative effects on the health of the individual. In most cases, stressors affect information receipt and processing.


Work Overload

Work overload occurs when there are more decisions to be made than cognitive resources available to make optimal decisions. When an overload is encountered, the decision maker becomes more selective in the receipt of cues. Limiting the inputs received through a selective attention filtering process allows the decision maker more cognitive resources for the task at hand: making a decision! Another result of work overload may be weighing the information that is received to limit the amount of information that is processed. This results in better use of available cognitive resources, provided the screening is performed accurately. The filtering and screening processes can lead to choosing the “easiest” decision, with the result being less accuracy in the decision-making process and potentially cognitive tunneling, or fixation on a specific course of action [8]. Work overload can also manifest in fatigue or sleep disruption.


Stress can occur because we have too many things to do in too short a time. Management of work overload as a stressor requires an evaluation of current workload [9]. To determine workload, consider which tasks need to be accomplished, when these tasks need to be performed, and how long the typical tasks take to accomplish. Time to perform a task needs to include cognitive tasks such as planning, evaluating alternatives, and monitoring success (Table 14.5). It is difficult to quantify the impact of workload. Will each individual perform a task exactly the same way? Will individuals process information and make decisions at the same speed? Are there other distractions or overlapping information that would impact the task and the speed of accomplishing it?


Table 14.5 Tips to Manage Workload Overload







  1. Critical task analysis to determine task priorities and time requirements
  2. Task redesign (redistribute tasks to others; automate some tasks; reduce potential for overlapping or interfering tasks)
  3. Develop a display design so that the most important tasks are obvious to the user (e.g., red alarm for a control panel)
  4. Training on tasks to increase speed
  5. Training on time management skills
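A first-pass workload screen along the lines described above can be as simple as comparing estimated task demand, including cognitive steps such as planning and monitoring, against the time available. In the sketch below, the task names, durations, and the 80% utilization threshold are hypothetical illustrations rather than figures from this chapter.

```python
# Estimated demand per task, in minutes per hour (hypothetical values that
# include cognitive work such as planning, evaluating, and monitoring).
tasks = {
    "review alarms": 10,
    "log readings": 8,
    "plan batch change": 15,
    "monitor startup": 20,
    "respond to field calls": 12,
}

available_minutes = 60
demand = sum(tasks.values())
utilization = demand / available_minutes
print(f"demand: {demand} min/h, utilization: {utilization:.0%}")
if utilization > 0.8:  # illustrative rule-of-thumb threshold
    print("approaching overload: consider the task redesign tips in Table 14.5")
```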

Fatigue

Fatigue is the state between being alert and being asleep. It may be mental, physical, or both in nature. In the pipeline industry, it is commonly understood that long shifts, in particular night shifts, may result in fatigue [10]. When fatigue occurs, performance is degraded. Management of fatigue includes scheduling rest breaks and enforcement of maximum hours of work in one day (Table 14.6).


Sleep disruption is a major contributor to fatigue and is influenced by the amount of time you sleep each night. Sleep deprivation occurs when an individual receives less than 7–9 h of sleep per night. For shift workers, sleep disruption occurs when tasks are performed at the low point of natural circadian rhythms, which equates to the early hours of the morning. Circadian rhythms are the natural cycles of the body, regulated by light and darkness, that impact physiological functions such as body temperature, sleeping, and waking [11]. For those who travel extensively, disruption of circadian rhythms may occur from jet lag. Understanding the impact of sleep disruption is important when evaluating the potential for a human error to occur. Tasks requiring input from visual cues are impacted by sleep disruption. Vigilance tasks, common in control center operations, may suffer from inattention and poor decision making as a result of sleep disruption.


Table 14.6 Tips to Manage Fatigue







  1. Ensure adequate hours of sleep
  2. Napping when feeling the impact of sleep disruption (at least 15 min in duration)
  3. Training on fatigue management for shift workers
  4. Avoid stimulants, such as caffeine

Vigilance

Vigilance is defined as the detection of rare, near-threshold signals in noise. In a control room, the rare, near-threshold signal may be a small fluctuation in a process variable that precedes a larger change that is picked up by the alarm system. Flying an airplane is considered to be a vigilance task: takeoff and landing are the two main active tasks, and monitoring the equipment gauges in between requires some mental effort, but action is only required when a nonstandard condition arises. Similar to this aviation example, in the pipeline industry a vigilance task may be monitoring a control panel in a control center. A concern with vigilance tasks is the increased number of misses that may occur as the length of the vigilance task grows, called the vigilance decrement. For a control center operator, managing alarms or other cues on a regular basis rather than having long periods of time between cues would reduce this concern. This and other tips for managing vigilance are listed in Table 14.7.
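The value of limiting vigil duration (tip 1 in Table 14.7) can be shown with a toy calculation in which detection probability decays with time on task. All rates below are invented for illustration; they are not measured vigilance data.

```python
def expected_misses(hours, base_detect=0.95, decrement_per_hour=0.05,
                    signals_per_hour=2.0, floor=0.5):
    """Expected missed signals when detection probability decays with time
    on task (a simple model of the vigilance decrement)."""
    total = 0.0
    for hour in range(hours):
        p_detect = max(base_detect - decrement_per_hour * hour, floor)
        total += signals_per_hour * (1.0 - p_detect)
    return total

# One long vigil accumulates more expected misses than two shorter vigils
# separated by a rest break, because the decrement resets after the break.
print(f"one 8 h vigil:  {expected_misses(8):.1f} expected misses")
print(f"two 4 h vigils: {2 * expected_misses(4):.1f} expected misses")
```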


Lighting and Noise

Lighting and noise are two environmental stressors that can be remedied in a work environment (Table 14.8). Performance can be degraded when the amount of light and/or noise is too high or too low [2]. If continued high exposure (e.g., noise) or low exposure (e.g., light) occurs, then health issues may also result. These effects may not be obvious until some point in the future, so correlation with this stressor may be difficult.


Table 14.7 Tips to Manage Vigilance







  1. Limit vigil duration (length of time between rest breaks)
  2. Enhance cues/signals to make them more obvious
  3. Use of music, noise, and conversation
  4. Ensure operators are not sleep deprived
  5. Use automation carefully, as this may create vigilance tasks

Table 14.8 Tips to Manage Lighting and Noise







  1. Reduce glare on visual surfaces such as computer screens
  2. Ensure sufficiently high level of light for operating tasks
  3. Promote use of sunglasses for appropriate conditions and tasks (e.g., driving)
  4. Use of noise-muffling devices for loud operating environments (>85 dB)
  5. Consider background noise levels when designing sounds that require an action (e.g., alarm or voice command)
  6. Follow alarm guidelines in appropriate regulations (e.g., control room management guidelines)
  7. Minimize potential for false alarms that could create complacency

Readers who wish to apply the above tips can refer to the following guidance documents:



  • Criteria for a Recommended Standard: Occupational Exposure to Noise, U.S. Department of Health, Education and Welfare, National Institute for Occupational Safety and Health, 1972.
  • Attwood, D., Deeb, J., and Danz-Reece, M. Ergonomic Solutions for the Process Industries, Gulf Publishing (Elsevier), Burlington, MA, 2004 (Chapter 4).

Vibration

Vibration is an environmental stressor that can lead to both performance issues and health issues (e.g., repetitive motion disorders and motion sickness) [12]. High-frequency vibration can impact either a specific portion of the body or the whole body. Tasks that involve whole-body vibration include off-road driving and working around large compressors or pumps. The impacts of whole-body vibration include stress, risk of injury [12], difficulty using devices such as touchscreens or handheld devices, difficulty with eye–hand coordination, and blurred vision. At lower vibration frequencies, the impact of vibration can be motion sickness, which in turn impacts performance through inattention. Tips for managing vibration are listed in Table 14.9.


Temperature

Temperature impacts performance and health when it is either too low or too high [13]. Both excessive heat and excessive cold can produce performance degradation and health problems. Health issues can include a physical reaction to excessively high or low temperatures, such as heatstroke and frostbite. The performance impact can be cognitive, affecting how information is processed by an individual. Some tips for managing problems related to temperature are listed in Table 14.10.


Table 14.9 Tips to Manage Vibration







  1. Ensure that text fonts are larger than the minimum specified for stable environments
  2. Insulate the user/interface from vibration, for example, hand-tool vibration coverings, improved seating for off-road vehicles

Table 14.10 Tips to Manage Temperature







  1. Ensure air movement in hot work environments through the use of fans or natural air currents/breeze
  2. Reduce physical work requirements in extreme hot or cold temperatures
  3. Reduce amount of time exposed to extreme hot or cold temperatures
  4. Ensure ample water/fluids are available to replenish hydration
  5. Choose clothing appropriate for the environment

Air Quality

Air quality issues can negatively impact the performance and health of an individual. Air quality is not limited to internal spaces, such as an office, but also applies to external spaces such as a facility. Poor air quality can result from inadequate ventilation, pollution, smog, or the presence of carbon monoxide. Poor air quality can impact cue recognition, task performance, and decision-making capability [14].


Psychological Stress

Psychological stress can be triggered by a perception of potential harm or potential loss of something of value to the individual. It is extremely difficult to design for this type of stressor, since it is related to perception, and perception varies among individuals. The natural question is “What impacts the perception of harm or loss to an individual?” The way a person interprets a situation impacts the stress level they feel. Cue recognition and information processing are key elements of this interpretation. Psychological stress is a real phenomenon that is hard to express and manage (Table 14.11). Physiological reactions, such as elevated heart rate, may occur due to psychological stressors. At an optimal amount of psychological stress, we can see performance improvement. When this optimal level is exceeded, however, the result can be elevated feelings of anxiety or danger, or “overarousal.” Information processing in a state of overarousal can be altered, impacting memory and decision-making capability.


Life Stress

Life itself may be stressful outside the work environment. Life events, such as a health concern with a loved one, can impact work performance and potentially cause an incident. When an incident occurs related to human factors, the classification may be “inattention.” The cause of inattention is likely the diversion of attention from work tasks and toward the source of life stress. Some tips for managing life stress are listed in Table 14.12.


Table 14.11 Tips to Manage Psychological Stress







  1. Simplification of displays, controls, and procedures to impact cue reception and information processing
  2. Training on stressors to build individual awareness of psychological stress triggers

Table 14.12 Tips to Manage Life Stress







  1. Implement stress management programs
  2. Provide counseling services
  3. Train leaders in the identification and management of stress in employees

14.6 Application of Human Factors Guidance


Guidance on the management of human factors issues such as working memory is available from other industries, including aviation, medicine, and road transportation. The following are some examples of references for the reader to explore and apply.



  1. HARDIE design guidelines handbook: Human Factors Guidelines for Information Presentation by ATT Systems, 1996.
  2. Development of human factors guidelines for advanced traveler information systems and commercial vehicle operations: Driver Memory for In-Vehicle Visual and Auditory Messages. Publication No. FHWA-RD-96-148, 1996.
  3. NCHRP report 600: Human Factors Guidelines for Road Systems, 2nd ed., 2012.
  4. Do it by design: An Introduction to Human Factors in Medical Devices, U.S. Department of Health and Human Services.
  5. Human Factors Guidelines for Aircraft Maintenance Manual. ICAO Document 9824 AN/450, 2003.
  6. Health and Safety Executive—Managing Shift Work: Health and Safety Guidance, 2006.
  7. Applying Human Factors and Usability Engineering to Optimize Medical Device Design—Draft, 2011.

14.7 Heuristics and Biases in Decision Making


Decision making is a task involving selection of one option from a set of alternatives, with some knowledge and some uncertainty about the options. When we make decisions, we plan and select a course of action (one of the alternatives) based on the costs and values of different expected outcomes.


There are various decision-making models in the literature, including rational or normative models, such as utility theory and expected value theory, and descriptive models [13, 14]. If there is good information and lots of time available to make a decision, a careful analysis using rational or normative models is possible. If the information is complex and there is limited time to make a decision, simplifying heuristics are often employed. The focus of this section is on descriptive models such as heuristics and biases.


When decisions are made under limited time and information, we rely on less complex means of selecting among choices [15, 16]. These shortcuts or rules of thumb are called heuristics. The choices made using heuristics are often deemed to be “good enough.” Because of the simplification of information receipt and processing employed using heuristics, they can lead to flaws or errors in the decision-making process. Despite the potential for errors, use of heuristics leads to decisions that are accurate most of the time. This is largely due to experience and the resources people have developed over time.


Cognitive biases result when inference distorts a judgment. Biases can arise from the use of heuristics in decision making.


There are a number of specific heuristics and biases that will be explored in this section, including:



  • satisficing heuristic;
  • cue primacy and anchoring;
  • selective attention;
  • availability heuristic;
  • representativeness heuristic;
  • cognitive tunneling;
  • confirmation bias;
  • framing bias.

14.7.1 Satisficing Heuristic


As may be surmised from the name, the satisficing heuristic is related to a decision that satisfies the needs of the user. The person making the decision will stop the inquiry process when an acceptable choice has been found, rather than continuing until an optimized choice has been found [17]. This heuristic can be very effective, as it requires less time. Because it does not provide an optimal solution, however, it can lead to biases and poor decisions. This heuristic can be advantageous when there are several options for achieving a goal and a single optimized solution is not required, such as a decision on providing an example of a data set for display in a presentation. It could lead to a less than optimal result when the decision requires an extensive review to determine a single solution, such as a decision to choose a method for defect growth prediction.
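The distinction is easy to express in code. The sketch below, with hypothetical options and scores, shows how a satisficing search stops at the first acceptable option while an optimizing search scans them all.

```python
# Hypothetical options scored 0-1; higher is better.
options = [("method A", 0.72), ("method B", 0.91), ("method C", 0.88)]
GOOD_ENOUGH = 0.70

def satisfice(options, threshold):
    """Return the first option that clears the threshold (may not be best)."""
    for name, score in options:
        if score >= threshold:
            return name
    return None

def optimize(options):
    """Scan every option and return the best one (slower, but optimal)."""
    return max(options, key=lambda o: o[1])[0]

print("satisficing picks:", satisfice(options, GOOD_ENOUGH))  # method A
print("optimizing picks: ", optimize(options))                # method B
```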


14.7.2 Cue Primacy and Anchoring


With this heuristic, the first few cues received are given greater weight or importance than information received later [15]. This leads the decision maker to support a hypothesis that is backed by information received early. The heuristic could lead to decision errors when the inquiry process requires extensive or in-depth analysis. For example, cue primacy and anchoring would not be appropriate for decision making during a root cause investigation.


14.7.3 Selective Attention


Selective attention occurs when the decision maker focuses attention on specific cues when making a decision [2]. A decision maker has limited resources to process information, so we naturally “filter out” information and focus attention on certain cues. If an operator has many alarms occurring at once, then focusing on a few selected cues/alarms influences the action that is taken. Selective attention is impacted by four factors: salience (prominence or conspicuousness), effort, expectancy, and value. Cues that are more prominent, such as cues at the top of a display screen, will receive attention when there are limited resources and become the information used to make a decision. Selective attention is an advantage in a well-designed environment where many potential cues are present. However, it can lead to poor decision making if the important cues are not the most salient in the environment.
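One way to reason about which cue will capture attention is a SEEV-style score from the human factors literature, in which salience, expectancy, and value attract attention while effort discourages it. The equal weighting and the ratings below are assumed purely for illustration.

```python
def attention_score(salience, effort, expectancy, value):
    """SEEV-style score: salience, expectancy, and value attract attention;
    the effort of attending subtracts. Weights are equal here for simplicity."""
    return salience - effort + expectancy + value

# Two competing alarms rated 0-1 on each factor (hypothetical ratings).
critical_but_buried = attention_score(salience=0.2, effort=0.7,
                                      expectancy=0.3, value=0.9)
minor_but_prominent = attention_score(salience=0.9, effort=0.1,
                                      expectancy=0.6, value=0.2)
print(f"critical alarm score: {critical_but_buried:.2f}")  # 0.70
print(f"minor alarm score:    {minor_but_prominent:.2f}")  # 1.60: wins attention
```

In a poorly designed display, the minor alarm wins attention simply because it is prominent and easy to attend to, which is exactly the failure mode described above.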


14.7.4 Availability Heuristic


The meaning of the availability heuristic is represented by its name. Decision makers generally retrieve information or decision-making hypotheses that have been considered recently or frequently [18]. This information comes foremost to the mind of the decision maker, and decisions can be made based on the ability to recall it easily and quickly. For example, if a pipeline designer has recently completed a design for a pipe in rocky conditions and is then asked what considerations should be included in the design of a different pipeline, the designer will readily recall soil conditions and comment that these should be considered for the new project. This heuristic is of great benefit when brainstorming various options to consider, but can be limiting if a full analysis of all options is required.


14.7.5 Representativeness Heuristic


Once again, the meaning of the representativeness heuristic is as its name implies. Decision makers will focus on cues that are similar to something that is already known [16]. The propensity is to make the same decision that was successful in a previous situation, since the two cues “look the same.” For example, when an analyst is evaluating in-line inspection (ILI) data, he/she may see two signals that look very similar and, due to the representativeness heuristic, use the same characterization for both features. When the two signals are identical, this heuristic provides a fast and effective means of decision making, or in this case characterizing features. If, however, the signals are similar but not identical, then the representativeness heuristic may lead to a decision-making error.
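The ILI example suggests a simple guard against this trap: quantify how similar two signals actually are and route near-matches to an analyst rather than auto-characterizing them. The signal features, vectors, and threshold below are hypothetical illustrations.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors (e.g., hypothetical
    amplitude/width/depth features extracted from ILI signals)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

known_crack = [0.82, 0.31, 0.55]
new_signal = [0.60, 0.45, 0.30]

sim = cosine_similarity(known_crack, new_signal)
# "Similar" is not "identical": only very close matches are auto-characterized,
# and near matches are flagged for expert review. The threshold is illustrative.
if sim > 0.99:
    print(f"similarity {sim:.3f}: characterize as the known feature type")
else:
    print(f"similarity {sim:.3f}: similar but not identical; route to an analyst")
```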


14.7.6 Cognitive Tunneling


Cognitive tunneling may be colloquially known as “tunnel vision.” In this instance, once a hypothesis has been generated or a decision chosen to execute, subsequent information is ignored [19]. In decision making, cognitive tunneling can easily lead to a decision error. To avoid cognitive tunneling, the decision maker should actively search for contradictory information prior to making a decision. For example, asking for a “devil’s advocate” opinion is one means of testing a hypothesis prior to finalizing a decision.


14.7.7 Confirmation Bias


In confirmation bias, the decision maker seeks out information that confirms a hypothesis or decision [20]. Similar to cognitive tunneling, focusing only on confirming evidence, and ignoring other evidence that could lead to an optimal decision, can lead to a decision error. When relying on memory, a tendency to forget contradictory information, or to underweigh it in favor of confirming evidence, impacts our ability to effectively diagnose a situation, prepare a hypothesis, and subsequently make an optimal decision. The main difference between the two is that cognitive tunneling ignores all subsequent information once a hypothesis or decision has been chosen, whereas with confirmation bias information is filtered to retain only that which confirms the chosen hypothesis or decision.


14.7.8 Framing Bias


While many biases operate on a subconscious level, it is possible to use framing bias deliberately when trying to influence others. Framing bias involves influencing the way material is received to impact the decision that is made by another person [21]. For example, when writing a procedure about a requirement to perform an activity under specific circumstances, a framing bias is imposed if the wording states that the activity is not performed unless certain criteria are met, rather than that the activity is performed when certain criteria are met. The decision maker is influenced to decide that the activity is not performed, rather than performed, based on the selected wording.


Table 14.13 Factors that Influence Decision Making







  1. Inadequate cue integration
  2. Inadequate or poor-quality knowledge the person holds in long-term memory
  3. Tendency to adopt a single course of action and fail to consider the problem broadly
  4. Working memory capacity; limits to attention
  5. Poor awareness of a changing situation
  6. Poor feedback

14.7.9 Management of Decision-Making Challenges


Some factors that influence decision making are listed in Table 14.13. The most effective means of correcting poor decisions is feedback that is clearly understood and includes some diagnostics. In many organizations, the largest contributor to incidents involves motor vehicles. When driving, it is difficult to learn from previous behavior and make better decisions because of the lack of feedback received while performing the task. When operating a pipeline, the operator receives feedback through alarms that indicate whether a decision was effective. Alarm feedback alone may not lead to an optimal decision; however, the use of simulators to provide feedback enhances optimal decision making.


14.7.9.1 Methods to Improve Decision Making


There are many ways, as we have described, to negatively impact decision making and subsequently increase the potential for human error to occur. The process of making a decision is influenced by design, training, and the presence of decision aids [2]. We often jump to the conclusion that poor performance in decision making means that we must do something “to the person” to make him or her a better decision maker. The category of “human error,” when used during an incident investigation, tends to result in activities related to the person who was performing the task at the time the incident occurred. A more holistic approach is to consider the human being as well as opportunities to improve the way this human being could make a decision in a similar situation. Such an approach potentially involves task redesign, the use of decision support systems, and training.


A decision support system is an interactive system designed to improve decision making by extending cognitive decision-making capabilities. These support systems can focus on the individual, using rules or algorithms that require a consistent decision to be made by an individual. For tasks that result in a pass/fail decision with little deviation in the information received, this approach works well, since decision consistency is more important than optimizing decisions made in unusual situations [22]. A focus on the individual can fail when there is potential for unusual/nonroutine information receipt or when data entry mistakes can occur.


Alternatively, decision support systems can focus on tools to support decision making. For tasks that require the ability to make decisions in the presence of unusual/nonroutine information, this type of support system can complement the human in the decision-making process.


Examples of tools to support decision making include decision matrices for a risk management approach to decision making, spreadsheets to reduce cognitive loading, simulations to perform “what-if” analyses, expert systems such as computer programs that reduce error for routine tasks, and display systems to manage visual cues. Some of these utilize automation, so it is important to keep in mind the potential to create vigilance tasks. Balancing the amount of automation for optimized decision making is an art rather than a science.
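As one concrete example of these tools, the sketch below implements a simple weighted decision matrix; the repair alternatives, criteria, weights, and scores are all hypothetical.

```python
# Criteria weights for a risk-informed comparison (hypothetical values).
criteria_weights = {"safety": 0.5, "cost": 0.2, "schedule": 0.3}

# Candidate repair alternatives scored 0-10 on each criterion (hypothetical).
alternatives = {
    "repair sleeve": {"safety": 8, "cost": 6, "schedule": 9},
    "cut-out":       {"safety": 9, "cost": 3, "schedule": 4},
    "re-coat only":  {"safety": 4, "cost": 9, "schedule": 8},
}

def weighted_score(scores, weights):
    """Aggregate the criterion scores into a single weighted total."""
    return sum(weights[c] * scores[c] for c in weights)

ranked = sorted(alternatives.items(),
                key=lambda kv: weighted_score(kv[1], criteria_weights),
                reverse=True)
for name, scores in ranked:
    print(f"{name}: {weighted_score(scores, criteria_weights):.1f}")
```

A matrix like this makes the decision consistent and auditable, though the weights themselves remain a human judgment.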


14.7.9.2 Case Study: USS Vincennes


Overview

On July 3, 1988, the USS Vincennes accidentally shot down an Iranian Airbus (Flight 655), resulting in the death of 290 people [23, 25]. This occurred in the Strait of Hormuz in the Persian Gulf. The cruiser was equipped with sophisticated radar and electronic battle gear called the AEGIS system. The crew tracked the oncoming plane electronically, warned it to keep away, and, when it did not, fired two standard surface-to-air missiles. The Vincennes’ combat teams believed the airliner to be an Iranian F14 jet fighter. No visual contact was made with the aircraft until it was struck and blew up about 6 mi. from the Vincennes.

Although the United States and Iran were not at war, there had been some short but fierce sea battles in the Gulf between the United States and Iran. The year before, on May 17, there was an incident involving the near-sinking of the USS Stark by an Iraqi fighter-bomber, which launched a missile attack that killed 37 U.S. sailors. Information recently received from U.S. intelligence predicted a high-profile attack for July 4th, the day after this incident occurred.

The USS Vincennes and another U.S. vessel were in a gunfight with an Iranian gunboat when the “fighter” appeared on the radar screen. The airstrip from which the Airbus left was the same airstrip used by military aircraft. The Vincennes tried to determine “friend or foe” using the electronic transponders carried in military planes. Warnings were sent out on civilian and military channels with no response from the Airbus. The Airbus’s flight path took it over the Vincennes, and during the approach the crew of the Vincennes interpreted the Airbus to be descending. The resulting decision was for the Vincennes to launch two missiles, resulting in the destruction of the Airbus and all passengers.


Cue Analysis

To understand the human factors at play, it is important to understand the cues that were perceived and how these cues impacted the decision-making process. It was noted that the “friend or foe” interrogation was read on the military channel because the operator had earlier been challenging an Iranian military aircraft on the ground at Bandar Abbas and had neglected to change the range setting on his equipment. Flight 655 was actually transmitting IFF Mode III, the code for a civilian flight. Although the warning was broadcast from the USS Vincennes and the cue received by the Airbus crew, the civilian-frequency warning was ignored. The Airbus crew was interacting with air traffic control and likely did not suspect that the military channel warning was intended for them.


Noise and activity level impacted the auditory cues: some officers thought the Airbus was commercial, while others thought it was a fighter jet, but only the first information (“fighter jet”) was “heard.”


Heuristics Analysis

Several heuristics impacted the decision that was made to launch the two missiles.



  1. Availability heuristic: The surprise attack on USS Stark the previous year may have impacted the decisions made. The Stark incident became the best available exemplar; every approaching radar trace tended to indicate a potential surprise attack.
  2. Representativeness heuristic: The situation resembled the attack on the USS Stark, in that the aircraft had not responded to warnings and flew toward the ship.
  3. Expectation: Captain Rogers was aware of the rumors that the Iranians were planning some sort of token Independence Day attack. Here, too, every approaching radar trace became the target of intense suspicion.
  4. Cognitive framing: On repeated occasions, Flight 655 was reported as a descending warplane, when Vincennes’ own data tapes indicate that it was transmitting the correct Mode III signal continuously, steered a straight course for Abu Dhabi (its destination on the opposite shores of the Persian Gulf), and was climbing steadily throughout the proceedings.

Case Study Summary

In reviewing the events that led up to this incident, it is easy to conclude that the crew “should have known better”; this is hindsight bias. The presence of information that indicated a potential attack, under stressful environmental and psychological conditions, coupled with limited time for decision making, resulted in the use of heuristics and subsequent decision errors.


14.8 Human Factors Contribution to Incidents in the Pipeline Industry


Could human factors impact the pipeline industry? They have in the past. From incident reports on the National Transportation Safety Board (NTSB) website (http://www.ntsb.gov/investigations), the following examples were found:



  1. Natural gas transmission pipeline rupture and fire in Cleburne, TX, on 7 June 2010.

    The NTSB determined that the probable cause of the rupture and fire was a contractor’s puncturing the unmarked, underground natural gas pipeline with a power auger. Contributing factors were the lack of permanent markers along the pipeline and failure of the pipeline locator to locate and mark the pipeline before the installation of a utility pole in the pipeline right-of-way.


    From a human factors perspective, the presence of visual cues to make decisions is critical. Unmarked pipelines represent a lack of visual cues, which contributed to an incident.


  2. Natural gas transmission pipeline rupture and fire in Palm City, FL, on 4 May 2009.

    The NTSB determined that the probable cause of the accident was environmentally assisted cracking under a disbonded polyethylene coating that remained undetected by the integrity management program. Contributing to the accident was the operator’s failure to include the pipe section that ruptured in its integrity management program. Contributing to the prolonged gas release was the pipeline controller’s inability to detect the rupture because of SCADA system limitations and the configuration of the pipeline.


    From a human factors perspective, the ability of the operator to make a decision about the presence of a leak was impacted by the visual cues provided by the SCADA system.


  3. Hazardous liquid pipeline rupture in Marshall, MI, on 25 July 2010.

    The NTSB determined that the probable cause of the pipeline rupture was corrosion fatigue cracks that grew and coalesced from crack and corrosion defects under disbonded polyethylene tape coating, producing a substantial crude oil release that went undetected by the control center for over 17 h. The rupture and prolonged release were made possible by pervasive organizational failures that included the following:



    • Deficient integrity management procedures, which allowed well-documented crack defects in corroded areas to propagate until the pipeline failed.
    • Inadequate training of control center personnel, which allowed the rupture to remain undetected for 17 h and through two start-ups of the pipeline.
    • Insufficient public awareness and education, which allowed the release to continue for nearly 14 h after the first notification of an odor to local emergency response agencies.

    From a human factors perspective, following a human factors life cycle approach can minimize the potential for human error and decision-making flaws. Additionally, the ability of an operator to make a decision based on visual cues is highlighted. This incident also illustrates the need to consider human factors interfaces external to an organization, including the public and emergency responders.


  4. Natural gas transmission line rupture in San Bruno, CA, on 9 September 2010.

    The NTSB determined that the probable cause of the accident was inadequate quality assurance and quality control in 1956 during a line relocation project, which resulted in the installation of a substandard, poorly welded pipe section with a visible seam weld flaw. Over time, the flaw grew to a critical size, causing the pipeline to rupture during a pressure increase stemming from poorly planned electrical work; an inadequate pipeline integrity management program failed to detect and repair the defective pipe section.


    Contributing to the severity of the accident were the lack of either automatic shutoff valves or remote control valves on the line and flawed emergency response procedures and delay in isolating the rupture to stop the flow of gas.


    Similar to the previous example, following a human factors life cycle approach can minimize the potential for human error and decision-making flaws. In addition to integrity management, the ability to make decisions under emergency conditions, and the decision-making tools required to make optimized decisions, are highlighted.


14.9 Human Factors Life Cycle Revisited


Application of human factors in an organization using the life cycle approach may occur in various ways based on the size and complexity of an organization. Some common elements to consider in the development of a human factors management plan/program include the following:



  1. Design standards that consider the human element. Ensure that standards consider how a human will technically design a system as well as how a human will use the design in operation and maintenance. For example, a change management process that requires all changes to be reviewed by operations and maintenance stakeholders could reduce the potential for operability issues with an asset.
  2. Construction practices that consider where a human error can occur with specific controls to manage this potential. For example, implementing a maximum workday of 12 h could reduce the potential for fatigue-related human errors during excavation.
  3. Running a high-quality ILI tool prior to the operation of a pipeline. A baseline ILI run could identify construction incidents requiring an investigation.
  4. Procedures designed to minimize human error. For example, physically attaching a procedure that is infrequently used to a physical asset can reduce the potential of an operator error.
  5. Implementation of an observation program for procedures frequently accomplished as well as those infrequently accomplished. Observation programs can reduce the potential for operator error and create a supportive work culture.
  6. Use of technology to automate repetitive processes that are prone to human error, such as data entry. Automation of some processes may reduce the potential for copy/paste human errors, for example.
  7. Use of expert systems to reduce judgment in rule-based analyses [24]. For example, use of an expert system to evaluate ILI data could reduce the potential for judgment errors caused by heuristics.
  8. Training personnel on heuristics and biases. Training could reduce human error through awareness of heuristics and biases.
  9. Investigation of all incidents with incorrect operations or human error root causes to a greater depth of understanding. For example, investigation into why a human error occurred (e.g., state of busy, too many tasks to accomplish in required time, complacency, focusing on the wrong hazard) could result in the reduction of human error through analysis of workload, environmental factors, current design, and presence of stressors.
  10. Training program designed for decision-making complexity. For example, providing advanced knowledge training for nonroutine activities reduces the potential for a judgment error. Automation opportunities exist for routine activities that are less complex in nature.
  11. Evaluation of the workforce for skills, expertise, and experience. Understanding the current capabilities of resources, both internal and external to the organization, can ensure that known knowledge and skill levels match requirements. For example, development of a certification program with assessment protocols can assure that a workforce has required competency levels and reduces the potential for human error due to the lack of training or inadequate qualifications.
  12. Experience and knowledge transfer program, such as pairing a junior employee with an experienced mentor. This type of program can reduce the potential of a human error during nonroutine tasks.
  13. Incorporate current standards such as control room management.
  14. Learn from other industries, such as aviation, nuclear, and medicine. For example, consider the relevance of the aviation industry’s “Dirty Dozen” to an organization: https://www.faasafety.gov/files/gslac/library/documents/2012/Nov/71574/DirtyDozenWeb3.pdf

14.10 Tools and Methods


There are many tools and techniques available to support an organization in integrating human factors into various areas of the business. While not exhaustive, the list below provides examples that can be considered.



  1. System design

    1. SHEL model: a human factors framework model that divides human factors issues into the components: software, hardware, environment, and liveware (people) [4].
    2. V-model: a systems engineering process that identifies interactions between the human in the system and the components of the systems [26].
    3. 5M model: a troubleshooting model that considers the components of man, machine, medium, mission, and management [27].

  2. Incident investigation

    1. Human reliability assessment (HRA): a means of evaluating the reliability of an individual, or the probability of human error, that can be applied postincident, during design, or during operation [28].
    2. Human factors investigation tool (HFIT): a means of evaluating the human and organizational factors that contribute to an incident [29].
    3. Human factors analysis and classification system (HFACS): a framework to assist in the investigation of human error using four levels of failure—unsafe acts, preconditions for unsafe acts, unsafe supervision, and organizational influences [30].

  3. Task analysis

    1. Job safety analysis: a procedure that integrates safety principles into work practices by breaking down the job into a sequence of steps, identifying possible hazards, and determining preventive measures for these hazards [31].
    2. Job task analysis: a means of evaluating a task through the evaluation of skills and demands required for task performance [32].
    3. Hierarchical task analysis (HTA): a task analysis that considers the goals of a system and how these goals are achieved or should be achieved [33].
    4. Cognitive work analysis (CWA): a framework for evaluating complex sociotechnical systems based on constraints [33].

  4. Ergonomic/physiological assessments

    1. Rapid entire body assessment (REBA): an ergonomic assessment focused on the evaluation of musculoskeletal disorders that may be due to workplace job tasks, through scoring the risks associated with the neck, trunk, legs, arms, and wrists [34].
    2. NIOSH lifting guidelines: guidelines developed to assist organizations in creating a safe work environment when lifting of loads is required [35] (the lifting-equation sketch following this list shows the underlying calculation).
    3. Rapid upper limb assessment (RULA): an ergonomic assessment focused on the evaluation of upper extremity musculoskeletal disorders that may be due to workplace job tasks [36].

  5. Decision making

    1. Pugh matrix analysis: a decision matrix that scores design alternatives against a baseline concept on weighted criteria [37] (see the sketch following this list).
    2. Joint cognitive system: a design perspective that treats the human and the machine together as a single cognitive system [38].
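
To illustrate the HRA entry above: one common family of HRA techniques, of the kind surveyed in [28] and exemplified by HEART, scales a generic (nominal) human error probability by a factor for each error-producing condition judged to be present. The Python sketch below follows that style of calculation; the numeric values are illustrative assumptions, not values from the published HEART tables.

def heart_hep(generic_ep, conditions):
    """HEART-style human error probability (HEP).
    conditions: (max_effect, assessed_proportion) pairs, where max_effect
    is the condition's maximum multiplier and assessed_proportion (0-1)
    is the analyst's judgment of how strongly the condition applies."""
    hep = generic_ep
    for max_effect, proportion in conditions:
        hep *= (max_effect - 1.0) * proportion + 1.0
    return min(hep, 1.0)   # a probability cannot exceed 1

# Illustrative only: a routine task (nominal HEP 0.003) performed under
# time pressure (max effect x11, judged 40% applicable) by an operator
# unfamiliar with the task (max effect x3, judged 50% applicable).
print(heart_hep(0.003, [(11.0, 0.4), (3.0, 0.5)]))   # 0.03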
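
HFACS is used by coding each contributing factor of an incident to one of its four levels and then looking for patterns across incidents. The small sketch below tallies hypothetical coded factors by level; the incident content is invented for illustration.

from collections import Counter

# The four HFACS levels, as listed in the entry above.
HFACS_LEVELS = (
    "unsafe acts",
    "preconditions for unsafe acts",
    "unsafe supervision",
    "organizational influences",
)

def tally_by_level(coded_factors):
    """Count contributing factors by HFACS level.
    coded_factors: (level, description) pairs produced by investigators."""
    counts = Counter(level for level, _ in coded_factors)
    return {level: counts.get(level, 0) for level in HFACS_LEVELS}

# Hypothetical coding of one incident's contributing factors:
factors = [
    ("unsafe acts", "valve opened out of sequence"),
    ("preconditions for unsafe acts", "fatigue at end of 12-hour shift"),
    ("unsafe supervision", "task assigned without required sign-off"),
    ("unsafe acts", "alarm acknowledged without verification"),
]
print(tally_by_level(factors))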
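
The job safety analysis entry describes a simple, repeatable record structure: the job is decomposed into sequential steps, and each step carries its identified hazards and preventive measures. A minimal sketch of that structure follows, with illustrative (assumed) content.

from dataclasses import dataclass, field

@dataclass
class JobStep:
    description: str
    hazards: list[str] = field(default_factory=list)
    preventive_measures: list[str] = field(default_factory=list)

@dataclass
class JobSafetyAnalysis:
    job: str
    steps: list[JobStep] = field(default_factory=list)

    def unmitigated(self):
        """Steps that list hazards but no preventive measures yet."""
        return [s for s in self.steps if s.hazards and not s.preventive_measures]

jsa = JobSafetyAnalysis("Manual valve operation", [
    JobStep("Walk to valve site", ["uneven ground"], ["designated walkway"]),
    JobStep("Open valve", ["pinch points", "overexertion"]),
])
for step in jsa.unmitigated():
    print("Missing controls:", step.description)   # -> Open valve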
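
The NIOSH lifting guidelines [35] build on the revised NIOSH lifting equation, which reduces a 23 kg load constant by multipliers for horizontal distance (H), vertical height (V), vertical travel distance (D), asymmetry angle (A), lifting frequency, and hand-to-load coupling; the ratio of the actual load to this recommended weight limit, the lifting index (LI), flags elevated risk when it exceeds 1.0. The sketch below implements the metric form with simplified boundary handling; the frequency and coupling multipliers are passed in directly because, in practice, they come from published NIOSH tables.

def niosh_rwl(h_cm, v_cm, d_cm, a_deg, fm=1.0, cm=1.0):
    """Recommended weight limit (kg), metric form of the revised
    NIOSH lifting equation (simplified boundary handling)."""
    lc = 23.0                                       # load constant, kg
    hm = min(1.0, 25.0 / max(h_cm, 25.0))           # horizontal multiplier
    vm = max(0.0, 1.0 - 0.003 * abs(v_cm - 75.0))   # vertical multiplier
    dm = min(1.0, 0.82 + 4.5 / max(d_cm, 25.0))     # distance multiplier
    am = max(0.0, 1.0 - 0.0032 * a_deg)             # asymmetry multiplier
    return lc * hm * vm * dm * am * fm * cm         # fm, cm from NIOSH tables

# A lift held 30 cm out from the ankles, starting 70 cm off the floor,
# travelling 40 cm upward, with no twisting:
rwl = niosh_rwl(h_cm=30, v_cm=70, d_cm=40, a_deg=0)
print(round(rwl, 1))          # ~17.6 kg recommended weight limit
print(round(15.0 / rwl, 2))   # lifting index for a 15 kg load, ~0.85 (< 1.0)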
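
Finally, Pugh matrix analysis [37] compares each candidate alternative against a baseline concept, criterion by criterion, scoring it better (+1), same (0), or worse (-1), and sums the optionally weighted scores. A minimal sketch follows; the criteria, weights, and scores are illustrative assumptions.

def pugh_totals(scores, weights):
    """scores[alternative][criterion] in {-1, 0, +1} relative to baseline."""
    return {
        alt: sum(weights[c] * s for c, s in crit.items())
        for alt, crit in scores.items()
    }

weights = {"cost": 2.0, "safety": 3.0, "ease of use": 1.0}
scores = {
    "Design B": {"cost": -1, "safety": +1, "ease of use": +1},
    "Design C": {"cost": +1, "safety": 0, "ease of use": -1},
}
# The baseline ("Design A") scores 0 on every criterion by definition.
print(pugh_totals(scores, weights))   # -> {'Design B': 2.0, 'Design C': 1.0}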

14.11 Summary


In the oil and gas pipeline industry, everything we do requires some level of human intervention, from designing an asset, to turning a valve, to operating the pipeline, to analyzing data related to pipe condition. Understanding, as an organization, where these human interventions occur, and evaluating the potential for human error in the associated decision making, is critical to reducing the frequency and magnitude of errors associated with human factors. An understanding of human factors, including what people see, what they hear, what they interpret, and subsequently how they respond, can help an engineer design systems and processes that reduce the potential for human error. The human factors life cycle approach illustrated in this chapter is an important tool in the journey toward holistic design, construction, operation, maintenance, and decommissioning of reliable systems.


References



  1. Feyer, A. and Williamson, A. (2011) Human factors in accident modelling. In: Stellman, J.M. (editor), Encyclopedia of Occupational Health and Safety, International Labor Organization, Geneva.
  2. Wickens, C., Lee, J., Liu, Y., and Becker, S. (2004) An Introduction to Human Factors Engineering, 2nd ed., Pearson Education Inc., New Jersey.
  3. Pew, R.W. and Mavor, A.S. (editors) (2007) Human-System Integration in the System Development Process: A New Look, National Academies Press, Washington, DC.
  4. Song, W., Li, J., Li, H., and Ming, X. (2020) Human factors risk assessment: an integrated method for improving safety in clinical use of medical devices. Applied Soft Computing, 86, 105918.
  5. Miller, G. (1956) The magical number seven, plus or minus two: some limits on our capacity for processing information. Psychological Review, 63(2), 81–97.
  6. Tulving, E. (1972) Episodic and Semantic Memory, Academic Press, New York.
  7. Fiedler, F.E. and Garcia, J.E. (1987) New Approaches to Leadership: Cognitive Resources and Organizational Performance, John Wiley & Sons, Inc., New York.
  8. Woods, D. and Cooke, R. (1999) Perspectives on human error: hindsight biases and local rationality. In: Durso, F. (editor), Handbook of Applied Cognition, John Wiley & Sons, Inc., New York, p. 5.
  9. Svenson, O. and Maule, A.J. (editors) (1993) Time Pressure and Stress in Human Judgment and Decision Making, Plenum Press, New York.
  10. Pipeline and Hazardous Materials Safety Administration (2009) 49 CFR Parts 192 and 195. Pipeline Safety: Control Room Management/Human Factors, Office of the Federal Register.
  11. Refinetti, R. (2010) Circadian Physiology, 2nd ed., CRC Press, Boca Raton, FL.
  12. British Standards Institution (1987) BS 6841: Measurement and Evaluation of Human Exposure to Whole-Body Mechanical Vibration, British Standards Institution, London.
  13. Pilcher, J., Nadler, E., and Busch, C. (2002) Effects of hot and cold temperature exposure on performance: a meta-analytic review. Ergonomics, 45(10), 682–698.
  14. Wyon, D. (2004) The effects of indoor air quality on performance and productivity. Indoor Air, 14(7), 92–101.
  15. Fischhoff, B. (1982) Debiasing. In: Kahneman, D., Slovic, P., and Tversky, A. (editors), Judgment Under Uncertainty: Heuristics and Biases, Cambridge University Press, New York, pp. 3–22.
  16. Simon, H.A. (1957) Models of Man, John Wiley & Sons, Inc., New York.
  17. Tversky, A. and Kahneman, D. (1974) Judgment under uncertainty: heuristics and biases. Science, 185(4157), 1124–1131.
  18. Kahneman, D., Slovic, P., and Tversky, A. (1982) Judgment Under Uncertainty: Heuristics and Biases, Cambridge University Press, New York.
  19. Simon, H.A. (1956) Rational choice and the structure of the environment. Psychological Review, 63(2), 129–138.
  20. Anderson, J. (1990) Cognitive Psychology and Its Implications, 3rd ed., W.H. Freeman, New York.
  21. Cook, R. and Woods, D. (1994) Operating at the sharp end: the complexity of human error. In: Bogner, M.S. (editor), Human Error in Medicine, Erlbaum, Hillsdale, NJ.
  22. Einhorn, H. and Hogarth, R. (1978) Confidence in judgment: persistence of the illusion of validity. Psychological Review, 85, 395–416.
  23. Kahneman, D. and Tversky, A. (1984) Choices, values and frames. American Psychologist, 39, 341–350.
  24. Zachary, W. (1988) Decision support systems: designing to extend the cognitive limits. In: Helander, M. (editor), Handbook of Human–Computer Interaction, North-Holland, Amsterdam, pp. 1235–1258.
  25. Fogarty, W. (1988) Formal Investigation into the Circumstances Surrounding the Downing of a Commercial Airliner by the USS Vincennes, USN, 28.
  26. Tittle, J., Peffer, J., Szymczak, S., and Grossman, J. (2008) Integrating cognitive systems engineering throughout the systems engineering process. Journal of Cognitive Engineering and Decision Making, 2, 249–273. https://doi.org/10.1518/155534308X377108.
  27. Ahsan, M.M., Tariq, U., Asad, N., and Zaidi, A. (2018) Analysis of aviation accidents using 5M model. Doctoral dissertation, University of Management & Technology.
  28. Health and Safety Executive (HSE) (2009) RR679: Review of Human Reliability Assessment Methods, Health and Safety Laboratory, Derbyshire, UK.
  29. Gordon, R., Flin, R., and Mearns, K. (2005) Designing and evaluating a human factors investigation tool (HFIT) for accident analysis. Safety Science, 43, 147–171.
  30. Shappell, S.A. and Wiegmann, D.A. (2000) The human factors analysis and classification system—HFACS. Available at https://commons.erau.edu/publication/737 (last accessed January 15, 2023).
  31. Albrechtsen, E., Solberg, I., and Svensli, E. (2019) The application and benefits of job safety analysis. Safety Science, 113, 425–437.
  32. Wolever, R.Q., Jordan, M., Lawson, K., and Moore, M. (2016) Advancing a new evidence-based professional in health care: job task analysis for health and wellness coaches. BMC Health Services Research, 16, 205.
  33. Lintern, G. (2013) Cognitive work analysis, cognitive systems design. Available at http://www.cognitivesystemsdesign.net/Tutorials/CWA%20Tutorial.pdf (last accessed May 30, 2022).
  34. Ergonomics Plus Inc. (n.d.) Rapid entire body assessment (REBA). Available at www.ergo-plus.com/wp-content/uploads/REBA-A-Step-by-Step-Guide.pdf (last accessed January 19, 2023).
  35. California Department of Industrial Relations (2007) Ergonomic guidelines for manual material handling. Available at https://cdc.gov/niosh/docs/2007-131/pdfs/2007-131.pdf (last accessed January 19, 2023).
  36. Ergonomics Plus Inc. (n.d.) Rapid upper limb assessment (RULA). Available at www.ergo-plus.com/wp-content/uploads/RULA-A-Step-by-Step-Guide.pdf (last accessed January 19, 2023).
  37. Cervone, H.F. (2009) Using Pugh matrix analysis in complex decision-making situations. OCLC Systems and Services, 25(4), 228–232.
  38. Endsley, M.R., Hoffman, R., Kaber, D., and Roth, E. (2007) Cognitive engineering and decision making: an overview and future course. Journal of Cognitive Engineering and Decision Making, 1(1), 1–21.

Bibliography


Human factors information is available in a number of journals, including Human Factors, Human Factors and Ergonomics in Manufacturing, and Human–Computer Interaction. Some articles that may be of interest to readers include the following:



  1. Chapanis, A. and Moulden, J.V. (1990) Short-term memory for numbers. Human Factors, 32, 123–137.
  2. Craik, F. and Lockhart, R. (1972) Levels of processing: a framework for memory research. Journal of Verbal Learning and Verbal Behavior, 11, 671–684.
  3. CUPE (2002) Enough Overwork: Taking Action on Workload, Canadian Union of Public Employees, Ottawa, Ontario.
  4. Enander, A. (1989) Effects of thermal stress on human performance. Scandinavian Journal of Work, Environment and Health, 15(Suppl. 1), 27–33.
  5. Haselton, M., Nettle, D., and Andrews, P.W. (2005) The evolution of cognitive bias. In: Buss, D.M. (editor), The Handbook of Evolutionary Psychology, John Wiley & Sons, Inc., Hoboken, NJ, pp. 724–746.
  6. ISO (1997) ISO 2631-1: Mechanical Vibration and Shock—Evaluation of Human Exposure to Whole-Body Vibration. International Organization for Standardization.
  7. Campbell, P., Miller, M., Witt, E. (2015) NASA/SP–2015-3709 Human Systems Integration (HSI) Practitioner’s Guide, NASA Center for AeroSpace Information, Hanover, MD.
  8. OR-OSHA 103: Conducting a Job Hazard Analysis (JHA), Oregon Occupational Safety and Health Administration.
  9. OSHA (2002) Job Hazard Analysis, U.S. Occupational Safety and Health Administration 3071.
  10. Peterson, L. and Peterson, M. (1959) Short-term retention of individual verbal items. Journal of Experimental Psychology, 58, 193–198.

There are a number of books and articles in addition to those identified in the reference section that may be useful in furthering your understanding of human factors, including the following:



  1. Attwood, D., Deeb, J., and Danz-Reece, M. (2004) Ergonomic Solutions for the Process Industries, Gulf Publishing (Elsevier), Burlington, MA.
  2. Broadbent, D. (1958) Perception and Communications, Pergamon Press, London.
  3. CCPS (2007) Human Factors Methods for Improving Performance in the Process Industries, Center for Chemical Process Safety, John Wiley & Sons, Inc., Hoboken, NJ.
  4. CCPS (2010) A Practical Approach to Hazard Identification for Operations and Maintenance Workers, Center for Chemical Process Safety, John Wiley & Sons, Inc., Hoboken, NJ.
  5. Davies, D. and Jones, D. (1982) Hearing and noise. In: Singleton, W. (editor), The Body at Work, Cambridge University Press, New York.
  6. Eastman-Kodak (1983) Kodak’s Ergonomic Design for People at Work, 2nd ed., Van Nostrand Reinhold, New York.
  7. Endsley, M.R. (2017) From here to autonomy: Lessons learned from human-automation research. Human Factors, 59(1), 5–27.
  8. Grandjean, E. (1988) Fitting the Task to the Man: A Textbook of Occupational Ergonomics, 4th ed., Taylor & Francis.
  9. LaBerge, D. (1995) Attentional Processing: The Brain’s Art of Mindfulness, Harvard University Press, Cambridge, MA.
  10. Proctor, R.W. and Van Zandt, T. (2008) Human Factors in Simple and Complex Systems, 2nd ed., CRC Press, Taylor & Francis Group, New York.
  11. Roughton, J. and Crutchfield, N. (2008) Job Hazard Analysis: A Guide for Compliance and Beyond, Elsevier, New York.
  12. Sanders, M.S. and McCormick, E.J. (1993) Human Factors in Engineering and Design, 7th ed., McGraw-Hill, New York.
  13. Stanton, N., Baber, C., and Young, M. (2004) Observation. In: Stanton, N., Hodge, A., Brookhuis, K., Salas, E., and Hendrick, H. (editors), Handbook of Human Factors and Ergonomic Methods, CRC Press, New York.
  14. Sutcliffe, A.G. (2003) Mapping the design space for socio-cognitive task design. In: Hollnagel, E. (editor), Handbook of Cognitive Task Design, Lawrence Erlbaum Associates, Mahwah, NJ, pp. 549–575.
  15. Vicente, K. (2004) The Human Factor: Revolutionizing the Way We Live with Technology, Vintage Canada, Toronto.
  16. Wickens, C. and McCarley, J. (2008) Applied Attention Theory, CRC Press, New York.
  17. Woodson, W., Tillman, B., and Tillman, P. (1992) Human Factors Design Handbook, 2nd ed., McGraw-Hill, New York.
