Peter Furst | December 1, 2022
In performing their work, employees generally do things that make sense to them at that particular point in time and place. Their decisions are usually based on their needs, wants, goals, mood, capability, understanding of the situation, and focus of attention, to name a few factors. Sometimes, in the course of that work, a mistake is made or an accident occurs.
To truly understand why and how things unfolded, we need an "inside perspective." That means we need to try to see the situation as the persons involved saw, felt, and understood it, because those perceptions shaped the decisions they made and the actions they took that resulted in an unplanned or unexpected outcome. This requires an understanding of the "local rationality principle."
Based on available data, the construction industry tends to have more accidents and injuries per man-hour worked than many other industries. To reduce these undesirable events, construction organizations must do a better job of understanding and managing the risks associated with the work. Looking at the domino effect theory of accident causation, one finds five basic sequential elements.
This theory was proposed by Herbert W. Heinrich to explain the sequential steps leading up to an injury, with the understanding that removing any one of the steps will stop the accident and the resulting injury from occurring.
Typically, an accident investigation is conducted after the fact to determine what happened, where, when, and how events progressed so as to find a way to keep the accident from occurring again. This process tends to put the primary focus on the worker and their action or behavior.
Generally, most construction accident investigations are completed shortly after the event and usually attribute the event to a shortcoming on the worker's part. Some of the more common reasons given for the accident may include the following.
Another consideration is that other workers in similar situations have successfully completed similar tasks, so this worker could have acted differently and avoided the accident. The perceived reasons may be that the worker is inexperienced, incompetent, incapable, inattentive, or complacent; lacks knowledge; has poor judgment; or forgot or failed to follow the applicable safety standards. As a result of such thinking, human error or inappropriate action becomes the starting point of the investigation process.
As the saying goes, the only certainties in life are death and taxes; to this, one can safely add human error! The good news is that most of these errors cause no harm or have insignificant negative outcomes. Human error must be understood and explained before it can be dealt with. At a minimum, errors are either intentional or unintentional, per the flowchart.
A more detailed view of human error (i.e., slips, lapses, mistakes, and violations) defines the type of error made more precisely. A further breakdown of mistakes (rule-based or knowledge-based) adds clarity about the causal factors.
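To make the taxonomy concrete, here is a minimal sketch in Python, assuming the flowchart follows the common breakdown in which violations are intentional deviations, slips and lapses are execution or memory failures of an otherwise adequate plan, and mistakes are failures of the plan itself. The class names, fields, and decision order are illustrative assumptions, not something taken from the article or from any published standard.

```python
# Illustrative sketch only: one possible way to encode the error taxonomy
# described above. The structure and field names are assumptions.
from enum import Enum
from dataclasses import dataclass


class ErrorType(Enum):
    SLIP = "slip"                                         # unintentional: action not as planned
    LAPSE = "lapse"                                       # unintentional: memory failure
    RULE_BASED_MISTAKE = "rule-based mistake"             # an existing rule was misapplied
    KNOWLEDGE_BASED_MISTAKE = "knowledge-based mistake"   # faulty understanding, no rule fit
    VIOLATION = "violation"                               # intentional deviation from a rule


@dataclass
class ObservedError:
    intentional: bool        # was the deviation deliberate?
    plan_was_adequate: bool  # did the plan itself make sense?
    rule_existed: bool       # was there an applicable rule or procedure?
    memory_failure: bool     # did the person forget a step?


def classify(error: ObservedError) -> ErrorType:
    """Walk the taxonomy in the order described in the text."""
    if error.intentional:
        return ErrorType.VIOLATION
    if error.plan_was_adequate:
        # Good plan, bad execution: a slip or a lapse.
        return ErrorType.LAPSE if error.memory_failure else ErrorType.SLIP
    # Inadequate plan: a mistake, split by whether a rule was misapplied.
    return (ErrorType.RULE_BASED_MISTAKE if error.rule_existed
            else ErrorType.KNOWLEDGE_BASED_MISTAKE)
```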
Such an approach to human error (what the worker did or failed to do) identifies some human functions as the reason why unintended or unanticipated negative outcomes occur. This provides information that could enable the organization to devise interventions to reduce the potential for adverse or negative effects as workers perform their work. It addresses the fact that improper human acts can possibly lead to human errors.
The ultimate determination of the cause of the accident or error is generally based on what we know after the fact rather than what the person knew, thought, or felt at the exact time and place of the event. The event may be triggered by some simple factors or events, but the analysis is made using complex tools and techniques. Decisions made before the event's occurrence may have been made in a matter of seconds with limited information, while the evaluation made at a later point has the benefit of unlimited time and a plethora of facts resulting from hindsight. Basically, the system is predisposed to ultimately focus on the worker as being the cause of, or contributor to, the unplanned and unfortunate event.
Assuming the worker is a rational person who wants to stay employed and certainly does not want to get injured while performing their work, then the fundamental question should be what caused the worker to make the decision and then engage in the action that led to the unexpected, unintended, and unwanted event. The worker's primary objective is to perform their work and meet production expectations in order to stay employed. So, all decisions made are ultimately the result of trying to meet the above-stated goals while dealing with the situational factors that existed at that particular point in time and place.
To perform their work, employees utilize the existing operational systems (e.g., processes, plans, policies, protocols, procedures, and accepted practices, to name a few). These have to be fully integrated and aligned to enable the workers to perform their work properly, effectively, and efficiently. This creates some complexity, which is further complicated by situational factors. This complexity is further aggravated by small deviations or failures in the subsystems mentioned above. Each such deviation is necessary, but only jointly are they sufficient to produce an accident, injury, and/or loss.
The workers at the "sharp" end (where the actual work is performed) have to function within the work environment and effectively deal with the residual risks. Some of these risks are tolerable (they fall within an acceptable envelope), some are manageable (the worker can deal with or compensate for them), and the rest may cause process failure (if the worker is not able to effectively manage the risks involved, there are negative consequences). These process failures impact production, quality, safety, customer relations, profits, and any other target or standard the organization uses to measure performance.
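As an illustration of these three categories, the sketch below maps a residual risk into the tolerable, manageable, or process-failure buckets named above. The function name, the threshold parameters, and the idea of expressing risk and worker capacity as single numbers are simplifying assumptions made for the example, not figures or methods taken from the article.

```python
# Illustrative sketch only: a toy mapping of residual risk to the three
# categories described in the text. Thresholds are hypothetical inputs.
from enum import Enum


class RiskCategory(Enum):
    TOLERABLE = "tolerable"               # falls within the acceptable envelope
    MANAGEABLE = "manageable"             # the worker can deal with or compensate for it
    PROCESS_FAILURE = "process failure"   # exceeds what the worker can manage


def categorize_residual_risk(risk_level: float,
                             acceptable_limit: float,
                             worker_capacity: float) -> RiskCategory:
    """Assign a residual risk to one of the three categories."""
    if risk_level <= acceptable_limit:
        return RiskCategory.TOLERABLE
    if risk_level <= worker_capacity:
        return RiskCategory.MANAGEABLE
    return RiskCategory.PROCESS_FAILURE
```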
The operational as well as organizational systems have subsystems. The subsystems interact and influence each other. These system interactions may create discrepancies, which are deviations from design or intent. People work with these systems, and as a result, they may impact the workings of these systems, which in turn may create additional discrepancies. Some of the discrepancies may compensate for other discrepancies, thereby reducing their negative impact, while others may have an accumulating effect and increase the negative impact. The combined discrepancies create risk within the work environment, per the spiral chart.
Discrepancies may arise when people take shortcuts to meet difficult performance goals, and these shortcuts may enable goal achievement with possibly no apparent negative outcome. This may be due to other remaining layers of protection. An employee faced with another challenging or difficult production goal may revert to a previously implemented "successful" shortcut. Eventually, the shortcut may become the "accepted" means for performance and displace the actual prescribed company means or method. The key difference is that some of the inherent layers of protection have unknowingly been removed (eliminated) and the inherent risk has been increased.
So, to effectively address these organizational risks and garner improvement, one has to have a robust understanding of the systems, the people, and their operating interrelationships. Performance improvement can be addressed at any level in any area. Intervening at the sharp end (task/work) takes less effort but provides limited improvement, which, in all likelihood, will also be unsustainable. Greater effort is required as we go deeper into the workings of the organization, and, of course, the gains will be greater and sustainable to a much greater degree.
In general, the more rational approach would seem to lie in focusing on fixing the system and spending less time and effort trying to change people's behavior; this gives the effort an operational focus rather than a worker focus. This undertaking looks at the inner workings of the processes and how they create discrepancies, which in turn create risk. It looks to the work system within which the workers function for a solution to the injury problem, rather than revising safety programs or providing more training aimed at workers who have accidents or are caught working unsafely. (A notable quote attributed to Albert Einstein: "We cannot solve our problems with the same thinking we used when we created them.")
It is a widespread belief that some form of worker failure is the primary cause of occupational accidents and their resulting injuries, and that these failures undermine operational systems that would otherwise perform as designed or intended. This view encapsulates current, widespread beliefs in many technical and professional communities, and among the public in general, about the nature of human error and how systems fail. In most domains today, from aviation to industrial processes to transportation systems to medicine, when systems fail, we find much the same pattern.
It is evident that rational people do things that make sense to them at the point in time that they make decisions related to what they are engaged in. Decision-making is a cognitive process and may be shaped by such things as task goals, available information, any perceived expectations of supervision associated with the given task, the person's focus and attention, or the general understanding of the overall situation (work climate and organizational culture), to name a few. Obviously, rational people would not do things that did not make sense to them. Readers who want to better understand this concept should read The Field Guide to Understanding Human Error by Sidney Dekker. An important concept fully discussed in that book is local rationality, per the flowchart.
Mr. Dekker contends that local rationality captures the idea that, in the events leading up to an accident or in the aftermath of an unwanted negative event, people act in ways that make sense to them, given their mindset, situational context, past experience, knowledge, capability, readiness, the availability of relevant information, the pressures they are under, their task-related goals, objectives, agendas, and politics, as well as the task's demands and design. By understanding local rationality, people investigating unplanned, unexpected, or undesired negative outcomes get a much better picture of "why" things happened the way they did and led to the outcomes they produced.
This involves rational choice theory, a framework for understanding social or economic actions or behavior. The theory assumes that the people involved have perfect situational information, have a clear understanding of the goals and objectives, are secure in their relationship with management, and possess the knowledge, capability, judgment, authority, and motivation to evaluate the information, make the appropriate assessment, and arrive at the best decision to further the accomplishment of the task.
An important and possibly critical consideration underlying decision-making involves cognitive factors. Different aspects of cognition will be relevant to different conditions and/or situations. At work, people pursue task goals that involve taking action. To do so, they must assess conditions, understand situations, make quick decisions based on the immediately available information, and then take the appropriate and necessary action, all within a potentially very short period of time.
Decision-making requires evaluative judgment. So, the immediate situational information is evaluated based on a review of comparative information available to people—the cumulative mentally stored data that comes from life experiences. These experiences are unique to each and every individual. Therefore, a seemingly rational choice is made based on this comparative analysis. The process elements are the limited situational information assessed against the stored mental data, which are affected by task performance expectations, the person's perceptions of their workplace climate, their relationship with supervision and possibly others, and their aspirations and values, which are bounded by their local reality and/or rationality.
People's rationality is local by default. What is interesting is that people have one set of standards for themselves and different ones for others. We tend to look for outside causal factors when we have adverse outcomes (accidents or failures), but we tend to attribute other people's adverse outcomes to their internal factors. If we are involved in an accident, we look for a reason in the environment that facilitated or caused it—someone or something else that caused or contributed to whatever went wrong. But when we evaluate another person's accident, we generally find a reason that is internal to them—they forgot, made a mistake, did not pay attention, did not use common sense, did not follow proper procedure, etc.
It is very important to understand that safety improvement interventions are typically driven by historical data, such as the cumulative results of jobsite safety inspections or the causation conclusions of accident investigations. The decisions behind those interventions are themselves affected by cognition and by situational factors shaped by local reality or rationality, which plays a significant role in which potential problems are identified and which corrective actions are deployed based on those findings.
One way to avoid falling into the traditional attribution of causation to the unsafe act is for the safety practitioner to try to clearly understand the information available to the worker, the situation or conditions the worker was in, and the worker's understanding of the task requirements. It is also important to look at the worker's perception of the production needs and the work climate, all of which more than likely factored into the decision-making process that led to the unsafe act and the resulting accident.
Opinions expressed in Expert Commentary articles are those of the author and are not necessarily held by the author's employer or IRMI. Expert Commentary articles and other IRMI Online content do not purport to provide legal, accounting, or other professional advice or opinion. If such advice is needed, consult with your attorney, accountant, or other qualified adviser.