
Summaries and study assistance with A Life in Error: From Little Slips to Big Disasters by James Reason


What are the nature and varieties of human error? - Chapter 1



What are absent-minded errors?

One afternoon in 1970, James Reason was boiling water for a cup of tea, the teapot open and waiting for the water and the tea leaves. His cat was meowing for its food, and because Reason was a little wary of his cat he decided to feed it first. But instead of spooning the cat food into the cat’s bowl, he spooned it into his teapot.

This slip had a number of interesting properties. Feeding the cat and making tea are both highly automatic, habitual sequences that he performed in a very familiar environment (his home). The meowing of the cat was unusual; it captured his attention and led to the slip. The slip showed nearly all the principal characteristics of absent-minded errors:

  • Both acts (tea, food) were highly routine, so attention was absent.
  • Both the cat’s bowl and the teapot afforded containment.
  • There was a change in the routine sequence (meowing) that misdirected the action.

Another example of this kind of slip came when Reason observed his wife making tea. She was boiling water and had the teapot open, waiting for the water and the tea leaves. But instead of getting the tea leaves she reached for the jar of coffee, spooned some into the pot and added water. Only when the strong smell of coffee rose did she notice the slip. The important detail is the way she put the coffee into the pot: the tea caddy has a pull-off lid, whereas the coffee jar has to be unscrewed. Reason concluded that well-used familiar objects develop a local control zone. This means that when your hand enters the zone (say, close to the fruit bowl), it automatically performs an object-appropriate action (picking up an apple).

Aimless periods, like passing the time while waiting for a phone call, show that much of our behaviour is under the control of the environment, which in turn can lead to unintended actions.

What are strong habit intrusions?

When actions are under-specified (due to forgetting, reduced intention, etc.), people tend to fall back on the actions that usually occur in that particular context. These defaults are called strong habit intrusions.

The tip-of-the-tongue phenomenon is a state in which strong habit intrusions occur: we struggle to retrieve the word we mean, but keep recalling words that are similar to it (blockers).

Plans lead to Actions and Consequences - Chapter 2


The term ‘error’ needs a precise working definition. This chapter provides one and uses it as a framework.

Errors are not fundamentally bad; it is often the circumstances of their occurrence that shape their consequences. Switching on the kitchen light instead of the hallway light has no serious consequences, but flipping the wrong switch while flying an airplane can be disastrous. Reason proposes that all human actions have three basic elements: plans, actions and consequences. Each element is explained below.


Plans

Plans are central to our understanding of errors. All humans plan: you probably know what you are going to do today, tomorrow, next week and maybe even next year. The longer-term your plans, the vaguer they get.

Keep this example in mind while we explore what planning entails. You are hungry, it has been a hard morning and you want to treat yourself. You consider different cuisines but decide on Italian, because you feel like a pizza for lunch. Do you go to the Napoli or the Italia? You decide that the Napoli is easier to reach by bus. The only problem is that you are on a diet, and pizza is not low in calories.

The plan (having lunch somewhere) starts with a need to alleviate a state of tension (hunger, and feelings about the hard morning). After the need comes the intention (having a lunch to treat yourself). This intention turns into a goal that can be achieved by different means. Having made a particular plan (lunch at the Napoli), you only have to assemble and specify the action sequences (how to get to the Napoli).

Does a planner plan every detail?

A planner does not have to plan every detail of an action, because much of what we do already carries mental and verbal tags that trigger largely automatic subroutines. The more we engage in these habitual actions, the fewer tags we need to specify our planning.

When you arrive safely at the Napoli and have had a delicious pizza, what was the error in the plan? You clearly did not keep to your diet, which was a long-term plan of yours. Because we make so many plans that interact with one another, it is hard to pinpoint where errors are made. In general there are two kinds of plan relationships: co-existing plans and conflicting plans (the pizza lunch versus the diet).

There are two built-in limitations to human performance. First, we do not always have the physical capacity to turn a personal plan into action; we simply cannot be in two places at the same time. Second, we do not have enough mental capacity to carry out all our plans.

For the purposes of this book an error is defined as follows: “The term error will be applied to all those occasions in which a planned sequence of mental or physical activities fails to achieve its desired goal without the intervention of some chance agency”. The two important things to remember from this definition are the notion of intention and the absence of chance intervention.

There are two ways in which you can fail to achieve your desired objective:

  • Failures of execution, such as slips and lapses (absent-mindedness) or trips and fumbles (clumsy or maladroit actions). There is nothing wrong with the plan of action.
  • Failures that arise from the plan itself. These are mistakes, and they are much more complex and harder to notice.

Skill-based, Rule-based and Knowledge-based performances - Chapter 3


This chapter distinguishes three types of performance level. Slips and mistakes arise from different mechanisms: slips from failures in execution (often occurring within automatic procedural routines, also called action schemas) and mistakes from failures in planning. One problem with this categorization is that some errors possess properties of both.


Which errors?

Two big accidents illustrate a combination of slips and mistakes:

Oyster Creek: The annulus level and the water level are usually the same, but this time they were not. The operators mistook the annulus level for the water level, which was dangerously close to the fuel elements. Even though an alarm sounded 3 minutes after the error, the mistake was not discovered until 30 minutes later.

Three Mile Island: The operators did not notice that a pressurizer relief valve was stuck open, because the panel light showed it as closed. They relied on the panel alone and did not consider that the valve might be broken or stuck.

The wrong appraisal of the system state looks like the property of a mistake, while the selection of a strong-but-wrong interpretation is closer to a slip-like failure. Errors of this type can be categorized as applications of an inappropriate diagnostic rule, specifically of the if (situation X prevails) then (system state Y exists) kind.

Jens Rasmussen distinguished three performance levels: skill-based (SB), rule-based (RB) and knowledge-based (KB). From these three levels follow three distinct error types: skill-based slips, rule-based mistakes and knowledge-based mistakes.

How can we distinguish the performance levels?

The core distinction is whether or not the actor is engaged in problem solving. The skill-based (SB) level involves no awareness of a current problem; performance is completely automatic and habitual. The rule-based (RB) and knowledge-based (KB) levels are only triggered when the actor is aware of a problem. When the problem has a readily accessible solution, the actor works at the RB level; when a solution is harder to find, the work shifts to the KB level. Unfamiliar problems are usually dealt with by trial-and-error.

One feature shared by SB and RB performance is feed-forward control. At the SB level, performance runs on feed-forward control and depends upon a very flexible and efficient dynamic internal world model. At the RB level, performance is goal-oriented but structured by feed-forward control through stored problem-solving rules, which may look like: if (X, Y and Z are present) then (it is a B situation), or if (A) then (do C).
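
To make this if-then structure concrete, here is a minimal sketch (not from the book) of how stored diagnostic and action rules of the kind described above could be represented and matched feed-forward; all rule contents and names are invented for illustration.

```python
# Minimal sketch of rule-based (RB) control: stored "if (conditions) then
# (conclusion)" rules are matched against the observed situation without
# waiting for feedback. All rules and names here are illustrative only.

DIAGNOSTIC_RULES = [
    # if (X, Y and Z are present) then (it is a B situation)
    ({"X", "Y", "Z"}, "situation B"),
    # if (A) then (it is a C situation)
    ({"A"}, "situation C"),
]

ACTION_RULES = {
    "situation B": "do procedure B",
    "situation C": "do procedure C",
}

def diagnose(observed):
    """Return the first diagnosis whose conditions all hold in `observed`."""
    for conditions, diagnosis in DIAGNOSTIC_RULES:
        if conditions <= observed:  # all conditions present
            return diagnosis
    return None  # no rule matches: fall back to knowledge-based reasoning

observed = {"X", "Y", "Z"}
diagnosis = diagnose(observed)
print(diagnosis, "->", ACTION_RULES.get(diagnosis))  # situation B -> do procedure B
```

In this picture, a strong-but-wrong mistake would correspond to a frequently used rule firing on a superficial match of its conditions.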

Control at the KB level is mostly through feedback. This is necessary because at the KB level one can no longer rely on mental schemas and habits. Control is achieved by setting local goals, initiating actions to achieve them, and then checking whether the actions were successful.

There is a relationship between the predictability of error and the level of expertise: the more skilled an individual, the more likely it is that their errors will take the strong-but-wrong form at the SB or RB level.

Humans are better at solving problems than computers, because when we exhaust our problem-solving rules we search for analogies that can help us solve the current problem.

The best example of coexistence of these three levels

Driving is the best example of the coexistence of the three levels. Steering, changing gears and controlling speed operate at the skill-based level. Dealing with other road users is rule-based, and the knowledge-based level only shows itself when, for example, you unexpectedly have to change your route because of roadworks.

Slips and Lapses when Absent-Minded - Chapter 4


We have all experienced moments when the mind is simply absent (shaving cream on your toothbrush, walking into the wrong toilet) and moments when the mind just goes blank. Our lives are full of the kinds of incidents Freud called the ‘refuse of the phenomenal world’.


What are Hallmarks of Absent-Minded (AM) slips?

The first hallmark of AM slips is that they are always recognizable: they may present themselves at the wrong place and time, but they are part of our personal repertoire of actions. Act-wait-act-wait tasks (like making a cup of tea) are especially associated with slips. The slips that occur are not crazy (like throwing the cup through the window) but lie inside our range of behavioural possibilities. A situation where ‘AM slip’ is not the right word is when someone does something for the first time. Such a novice error arises from lack of competence, whereas the first hallmark of an AM slip is misapplied competence. This means AM slips are a problem of the expert, not of the novice. That may seem counterintuitive, because you would expect an expert to know enough not to make mistakes. But it is not about the quantity of mistakes; it is about the type of mistake an expert makes. An expert makes fewer mistakes, but when one occurs it can be an AM slip.

The second hallmark of AM slips is that they are not random events: they are determined by the habits of the person. AM slips follow a clearly discernible pattern that is largely independent of the period in which the person lives. In that sense, AM slips are timeless and universal human characteristics.

Little human errors (like the cat-food example) are behavioural spoonerisms. A spoonerism is a humorous mistake in which a speaker switches the first sounds of two or more words. A behavioural spoonerism is then the comparable act of doing one thing while you meant to do something else.

AM slips are a consequence of our humanity; they are the price we pay for being able to do things automatically. Life would be insufferable if we were constantly present-minded.

What goes absent in AM slips?

Conscious concerns take up a large part of our lives, and these concerns drain part of our attentional resource. How much they take depends on their nature and intensity. All mental and physical activities demand attention, even those that seem completely automatic; but the more habitual the activity, the smaller the demand.

Absent-minded errors occur when a large part of the attentional resource is already engaged elsewhere. Schemas (knowledge structures in long-term memory) are activated independently of current intentions, so part of our attentional resource is always busy restraining schemas from expressing themselves in particular situations. Because these schemas are highly active and competitive, much of our attention is claimed by this suppression.

What are situations where strong habit intrusions occur?

Strong habit intrusions are well-organized action sequences that recognizably belong to some activity other than the one currently intended, an activity that shares locations, movements and objects with the intended action. More than 40% of AM slips are intrusions of this kind.

Besides the type of activity described above, there are four other situations in which strong habit intrusions can occur:

  • When a change of goals demands a departure from some well-established action sequence. For example, you are on a diet and don’t want sugar in your oatmeal, but at breakfast you automatically still put some in.
  • When changed local conditions require the modification of some oft-performed action sequence. For example, after reorganizing your closet it takes a long time to stop reaching for the place where your sweaters used to be.
  • When a familiar environment associated with particular routines is entered in a state of reduced intentionality. For example, while waiting for a phone call you find yourself brushing your teeth even though it isn’t necessary.
  • When features of the present environment contain elements similar to those in highly familiar circumstances. For example, you show your credit card at the entrance of the UB instead of your RUG card: even though you don’t have to pay for access, you behave as if you were in a supermarket.

Individual Differences - Chapter 5


This chapter is about the Short Inventory of Minor Lapses (SIML), a self-report questionnaire that estimates an individual’s proneness to AM slips and lapses.

It is a 15-item questionnaire; the items are listed below:

  • How often do you forget to say something you were going to mention?
  • How often do you have the feeling that you should be doing something, either now or later, but you can’t remember what it was?
  • How often do you find your mind continuing to dwell upon something that you would prefer not to think about?
  • How often do you find you can’t remember what you have just done or where you have just been?
  • How often do you leave some necessary step out of a task?
  • How often do you find that you can’t immediately recall the name of a familiar person, place or object?
  • How often do you think you’re paying attention to something when you’re actually not?
  • How often do you have the ‘what-am-I-here-for’ feeling when you find you have forgotten what you came to do?
  • How often do you find yourself repeating something you’ve already done or carrying out some unnecessary action?
  • How often do you find you’ve forgotten to do something you intended to do?
  • How often do you decide to do something and then find yourself side-tracked into doing something different?
  • How often do you find yourself searching for something that you’ve just put down or are still carrying around with you?
  • How often do you forget to do something that you were going to do after dealing with an unexpected interruption?
  • How often do you find your mind wandering when you are doing something that needs your concentration?
  • How often do you make errors in which you do the right actions but in relation to the wrong objects?

The most commonly occurring slips were: failing to recall a name, forgetting to say something, forgetting intentions and mind wandering.
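
As a concrete illustration (not from the book): if each item is rated on a frequency scale, proneness can be summarized as a simple mean item score. The sketch below assumes a 5-point scale, since the summary does not specify one.

```python
# Minimal SIML-style scoring sketch. The response scale is an assumption
# (1 = hardly ever ... 5 = nearly all the time); the summary above lists the
# 15 items but gives no scoring scheme.

SIML_ITEMS = [
    "forget to say something you were going to mention",
    "can't immediately recall the name of a familiar person, place or object",
    # ... the remaining 13 items from the list above would go here
]

def siml_score(responses):
    """Mean item score: higher means more self-reported proneness to lapses."""
    if len(responses) != len(SIML_ITEMS):
        raise ValueError("expected one response per item")
    if any(not 1 <= r <= 5 for r in responses):
        raise ValueError("responses must use the assumed 1-5 scale")
    return sum(responses) / len(responses)

print(siml_score([3, 4]))  # 3.5 for the two example items above
```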


How can age affect slips?

There are two explanations of the finding that self-reported incidents of minor slips and lapses diminished with age:

  • The Activity Hypothesis: Older people are less active, and therefore have fewer opportunities to make AM errors.
  • The Compensation Hypothesis: The first version holds that older people are aware of their diminished cognitive competence, rely more on memory aids and reminders, and therefore suffer fewer lapses. The second version (the dread variant) is that older people may interpret slips and lapses as symptoms of dementia or Alzheimer’s and therefore invest more time and attention in their day-to-day performance.

This still doesn’t explain the big difference in results between ‘young’ (17-29) and ‘middle-aged’ (30-49) respondents. Researchers try to explain this by suggesting that some older people ‘forget that they forgot’, or by pointing to the way the questionnaire is built.

Can stress-vulnerability lead to cognitive failures?

One of the most important findings obtained with the questionnaire is the relationship between a high level of cognitive failures and the number and severity of psychiatric symptoms experienced during or immediately following a period of real-life stress. Donald Broadbent argued that high rates of AM slips can lead to increased vulnerability to real-life stress.

To test this hypothesis, women who had breast cancer were given the SIML and Goldberg’s General Health Questionnaire. The results showed that SIML scores correlated significantly with the number of psychiatric symptoms; depressed patients also had higher SIML scores. On page 34 of the book you can find the full list of predictors (e.g. radiotherapy, pain in the scar).

These findings provide strong support for the stress-vulnerability hypothesis. One explanation of the underlying mechanism is that people differ characteristically in the way they cope with stress: error-prone people, for example, use more resource-intensive coping strategies. It appears that it is not stress itself that induces the high rate of cognitive failure, but the way people deal with that stress.

SIML in the Courtroom - Chapter 6


This chapter again starts with an example. Mister X was charged with stealing from a supermarket, the second time he had been charged with this offence. Both times Mister X paid for all the items in the rear section of the trolley but ‘stole’ the items stored in the front section. Mister X denied any intent to steal and said that he had simply overlooked the unpaid items.


What are the hallmarks of AM behaviour?

The behaviour of Mister X was indeed consistent with absent-mindedness. Reason found three factors in the case that are all hallmarks of AM behaviour:

  • Familiarity (the supermarket was a familiar setting)
  • Preoccupation/Distraction (it was very hot that day, he had already drunk a can of juice while shopping, and at the checkout he was preoccupied with making sure the can was paid for)
  • Separation (the “stolen” items were in a separate section of the trolley)

To convince the jury of Mister X’s absent-mindedness, Mr and Mrs X were given the SIML; both scored very high. Then 26 students completed the SIML while imagining that they had just been convicted of theft but had offered absent-mindedness as a plea. Comparing the two sets of scores showed that Mister X’s scores did not carry this ‘faking factor’. Together, these findings created reasonable doubt in the courtroom.

An analysis of letters written by 67 people who said they had been wrongly accused of theft found that most of the critical lapses occurred when the shoplifter’s limited attentional resource was occupied or engaged by another concern (e.g. a divorce or medical treatment).

The New Freudian Slip - Chapter 7


Let’s start with what Freud actually wrote about slips and lapses: ‘A suppression of a previous intention to say something is the indispensable condition for the occurrence of a slip of the tongue’. A slip is thus a product of both local opportunity and a struggle between two mental forces: some underlying need or wish, and the desire to keep it hidden. Freud also thought that every mistake we make has a meaning, even absent-mindedness and distraction.

Classic Example from Freud

In his book Freud describes meeting a young man who tried to quote a line from Virgil but got it wrong, omitting the word ‘aliquis’ (someone). After Freud supplied the correct line, he asked why the young man had forgotten that word. Free-associating, the young man split the word into ‘a’ and ‘liquis’, which led through associations with fluids and the liquefying blood of a saint to his real worry: that his girlfriend had not got her period on time. This is exactly what Freud meant: there was a hidden explanation for the mistake. Other researchers disagree with Freud, though, and have tried to explain the slip in various other ways.

Another example from Freud

Another of Freud’s examples was a story told to him by his friend Wilhelm Stekel. Stekel was saying goodbye to a patient during a house call. He extended his hand to the woman and discovered that, instead of shaking her hand, he was loosening the bow of her dressing gown. Freud would explain this as Stekel harbouring unconscious desires for the woman, but another explanation is simply that Stekel was used to undoing the bows of jackets and dresses for medical examinations.

What are the conditions for provoking an AM error?

With these examples in mind, there are two necessary conditions for provoking an absent-minded error:

  • Cognitive under-specification: inattention, incomplete sense data or insufficient knowledge.
  • The existence of some locally appropriate response pattern that is primed by prior usage or activation.

Freudian slips do occur; in some cases there is indeed a hidden reason behind a mistake. But they are not frequent: everyday slips and lapses have more banal origins. For a slip to be Freudian, the intruding word or action has to be less familiar than the intended one. So Freud’s claim that slips represent minor eruptions of unconscious processing is true, but without the psychoanalytic meaning of ‘unconscious’: here it simply means not directly accessible, automatic.

Failures in planning - Chapter 8


What are the components of the planning process?

The mental activity of planning does not fit readily into Rasmussen’s categories: skill-based (SB), rule-based (RB) and knowledge-based (KB). Where an activity fits is determined by both its complexity and the degree of uncertainty about the future. Everyday planning (like deciding where to go for lunch) is closer to SB, but this chapter focuses on RB and KB planning.

There are three major components of the planning process: a working database (drawing on a limited attentional resource), a set of mental operations that act upon this database, and schemas. Input and output functions are also important, as they link the planner to the world.

The working database is limited in capacity and continuously variable in content. It contains three types of information:

  • Information spontaneously thrown up by active schemas, which need not be plan-relevant.
  • Information called up from the schema base.
  • Information derived directly from the environment (via the input function).

Three interlinked mental operations are involved in planning: selection, judgement and decision-making. The contents of the working database can be selected from schemas or from the immediate environment. Judgements are of two kinds: those related to goal setting and those related to goal achievement. Finally, decisions are made on the basis of the goal and the actions needed to achieve it.

The planning process can be broken down into four stages:

  • Setting the objectives
  • Searching for alternative courses of action
  • Comparing and evaluating alternatives
  • Deciding upon the course of action

Schemas are involved in all stages and contribute all kinds of information. Uncalled-for (spontaneous) information is more likely to include emotionally charged material, which can be activated by the situation or through outputs from recently used schemas. Three categories of bias can lead to planning failures; they are described below.

Sources of bias in the working database: The database may hold only a small fraction of the information relevant to the situation. Of the several variables involved, it can represent only two or three at the same time, and its content cannot be sustained for more than a few seconds, so when planning resumes the content has probably changed. Planning is also clouded by past experience, which masks the variability of future events. Finally, the information called into the database is biased in favour of past successes and of recently activated schemas.

Sources of bias in the mental operations: Planners are guided mostly by the past and little by chance, so they plan for fewer events than are likely to occur. Planners give more attention to information with emotional impact and are heavily influenced by their own theories. They are also poor at judging populations from sample data and at detecting many types of covariation. Lastly, planners are subject to the halo effect (they have problems processing two separate orderings of the same people or objects), tend to have a simplistic view of causality, and tend to be overconfident in evaluating the correctness of their knowledge.

Schematic sources of bias: This kind of bias shows itself after the planning has been completed but before it is executed. A completed plan is not only a set of directions for future action but also a theory about the future state of the world, and the more complex the plan, the more resistant it becomes to change. Dominating all these features is a strong urge to make sense of all the plan’s features, called effort after meaning.

What is collective planning and what are its failures?

Most of our plans are in fact the products of organizations and groups, fittingly called collective planning. An influential account of this type of planning rests on Herbert Simon’s principle of bounded rationality: the capacity of the human mind for formulating and solving complex problems is very small compared with the problems whose solution is required for objectively rational behaviour in the real world.

This principle explains why organizational planners have to compromise in their goal setting: the best possible outcome is not always attainable. The resulting tendency to select satisfactory rather than optimal courses of action is called satisficing.
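
Satisficing is easy to state algorithmically: instead of searching for the best option, pick the first one that clears an aspiration level. The sketch below illustrates the contrast; the options and scores are invented.

```python
# Minimal sketch contrasting optimizing with satisficing. Options and their
# subjective scores are invented for illustration.

options = [("plan A", 6), ("plan B", 8), ("plan C", 9)]

def optimize(options):
    """Exhaustive search: examine every option and return the best."""
    return max(options, key=lambda opt: opt[1])

def satisfice(options, aspiration_level):
    """Return the first option that is 'good enough', then stop searching."""
    for option in options:
        if option[1] >= aspiration_level:
            return option
    return None  # nothing acceptable: a satisficer would lower the aspiration

print(optimize(options))       # ('plan C', 9)
print(satisfice(options, 6))   # ('plan A', 6) - good enough, search ends early
```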

Richard Cyert and James March did not accept even the degree of rationality assumed by Herbert Simon. In their behavioural theory of the firm, they proposed four general rules of thumb (heuristics) at work in the planning process:

  • Quasi-resolution of conflict
  • Avoidance of uncertainty
  • Problematic search
  • Selective organizational learning

Anthony Downs worked this out in more detail, creating a theory of decision-making in large organizations and identifying four self-serving biases found in all officials. Each official tends to distort the information he passes upwards in the hierarchy, mostly exaggerating and making the data more favourable to himself. The degree to which an official seeks out additional responsibilities and accepts risks, and the degree to which he complies with directions from above, depend on how far these help him achieve his personal goals or favour his interests. An official is also biased towards policies and actions that serve his own interests.

Climbers are officials who are strongly motivated to invent new functions for their departments and to avoid economies. Conservers are biased against any change in the status quo. Organizations are more likely than individuals to make planning failures, but the underlying error tendencies are similar: organizations also plan from a limited database, they acquire biases through over-use of labour-saving heuristics, and both show themselves to be prisoners of the past and of routine.

Irving Janis studied the groupthink syndrome, which is characterized by eight main symptoms:

  • An illusion of invulnerability, creating extreme optimism and taking more risks.
  • Collective efforts to rationalize away warnings that might have led to reconsideration of the plan.
  • Belief in the inherent rightness of the group’s intentions.
  • Stereotyped perception of the opposition.
  • The exertion of group pressure if a group member deviates from the collective stereotypes, illusions or commitments.
  • Self-censorship of any doubt felt by individual members.
  • Shared illusion of unanimity.
  • The emergence of mind guards, members who see it as their job to protect the group from any contrary opinions or adverse information.

The powerful forces of perceived togetherness in a group make the possibility of failure unthinkable, and if not unthinkable then unspeakable.

Violations - Chapter 9


A close examination of the operators’ behaviour during the Chernobyl disaster revealed two distinct types of unsafe act. First, there was an unintended slip at the outset of the experiment: the reactor was operating at too low a power. Second, the operators deliberately decided to continue the trial anyway. This example changed the direction of Reason’s enquiries: the initial focus on individual error mechanisms, with a purely cognitive information-processing orientation, shifted to an orientation that incorporates motivational, social and institutional factors. Errors can arise from informational problems in either the mind or the environment. Violations, by contrast, are deliberate acts: people break the rules, though they mostly do not intend the bad outcomes. Violations are largely driven by motivational factors: beliefs, attitudes and norms.

Violation Types

There are four types of violations:

  • Corner-cutting or routine violations: Mostly committed to avoid unnecessary effort or to circumvent clumsy procedures. Most organizations have formal rules but also informal procedures (the ‘black book’). The informal procedures are derived from skilled and experienced operators, who recognize that formal rules are sometimes written without expert knowledge. It would be best if the people who make the rules and the people who perform the procedures came together to create a workable set.
  • Thrill-seeking or optimizing violations: We have many needs, and they often conflict. Driving gives a vivid picture of this: you want to get from A to B, but while driving you may also enjoy speeding, cutting in and even tailgating. Men and young people are more likely to violate; women tend to be unimpressed by such behaviour, so they act as a restraint on driver violations.
  • Necessary violations: Many organizations, in the wake of accidents, keep writing additional procedures that are not necessary to get the job done. Whereas errors are usually produced by under-specification of the process, necessary violations are produced by over-specification. Violations required to get the job done eventually become routine.
  • Exceptional violations: This type of violation is likely to occur under exceptional conditions. Because such conditions occur infrequently, operators are less practised and the procedures are often sparse.

What are the ‘mental economics’ of violating?

Petra Klumb investigated the costs and benefits of non-compliance. A violation was more likely when the perceived benefits outweighed the perceived costs. Examples of perceived benefits: an easier way of working, saving time, getting the job done, looking cool, meeting a deadline. Perceived costs include possible accidents, injuries, damage to assets, costly repairs and the disapproval of friends.
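
The cost-benefit trade-off described above can be sketched as a simple comparison; the items and weights below are invented for illustration and do not reproduce Klumb’s study.

```python
# Minimal sketch of the "mental economics" of violating: a violation looks
# attractive when perceived benefits outweigh perceived costs. All items and
# weights are invented for illustration.

perceived_benefits = {
    "easier way of working": 3,
    "saves time": 2,
    "meets a deadline": 2,
}

perceived_costs = {
    # costs are mostly remote, so they tend to be discounted
    "possible accident": 1,
    "disapproval of friends": 1,
}

def violation_attractive(benefits, costs):
    """True when the summed perceived benefits exceed the summed costs."""
    return sum(benefits.values()) > sum(costs.values())

print(violation_attractive(perceived_benefits, perceived_costs))  # True
```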

Violations often look like an easier way of working: the benefits are immediate while the costs are mostly remote. To change this perception, the key is not so much to increase the costs of non-compliance as to promote the benefits of compliance.

Organizational Accidents - Chapter 10


The years between 1976 and 1988 saw an exceptional concentration of major disasters worldwide. On page 73 of ‘A Life in Error’ there is a list of some of these accidents.

Causes of the accidents

Although these disasters all differ, they share at least three characteristics:

  • Many of the contributing factors were present within the system before the actual catastrophe occurred.
  • All the systems had multiple defences, barriers and safeguards designed to prevent known hazards.
  • The disasters occurred because an unforeseen conjunction of latent conditions, unsafe acts by humans and local triggers defeated those defences.

The Swiss Cheese Model (SCM) is another way of expressing this view. The system’s defences are represented as slices of cheese interposed between the operational hazards and the potential losses. In a perfect system the slices would be intact, but in reality they are more like Emmenthaler, full of holes. One hole is not a problem, but when the holes in a stack of slices line up, they create an opportunity for an accident.
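
The alignment idea can be illustrated with a small Monte Carlo sketch (not from the book): treat each defensive slice as failing independently with some probability, and count how often holes line up through every slice. The layer counts and probabilities are invented.

```python
# Minimal Monte Carlo sketch of the Swiss Cheese Model: an accident requires
# a hole in every defensive slice at the same time. All probabilities are
# invented for illustration.
import random

def accident_occurs(hole_probs):
    """One trial: does a hole appear in every slice simultaneously?"""
    return all(random.random() < p for p in hole_probs)

def accident_rate(hole_probs, trials=100_000):
    """Estimate the long-run chance that all holes line up."""
    return sum(accident_occurs(hole_probs) for _ in range(trials)) / trials

# Four sound defences, each with a 10% chance of a hole, rarely align...
print(accident_rate([0.1, 0.1, 0.1, 0.1]))  # about 0.0001
# ...but weakened defences align far more often.
print(accident_rate([0.5, 0.5, 0.5, 0.5]))  # about 0.06
```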

Holes in the defences arise for two reasons. First, active failures: errors and violations committed by someone in direct contact with the system. Second, latent conditions: longer-lasting weaknesses built into the system, which exist because no design can anticipate every scenario. Because latent conditions are present long before an accident, they can be tackled proactively: the system’s safety health can be measured, and attention can be given to the vital signs (planning, scheduling, training, designing, communicating, building, operating and maintaining).

When organizational accidents occur, the important question is not who did it, but how and why the defences failed. Accidents are an opportunity to identify where things went wrong, for example work pressure or inadequate training.

Causes of Error

Today around 70-80% of all accidents in technology are attributed to human error; in 1960 it was only 20%. There are several reasons why humans appear to have become more fallible. The material and mechanical elements of technology have become more reliable as knowledge has grown. Between 1970 and 1980 automation also increased dramatically, which concentrated control in the hands of fewer operators. This automation had three consequences: it can raise the individual’s workload, its technological layers make systems less transparent to the operators, and although it reduces slips and lapses it places a greater load on the individual’s reasoning skills.

Two Kinds of Accidents

By now it is safe to say that accidents can be categorized into two types:

First, individual accidents: frequent, with limited consequences. They occur in systems with few defences and arise from limited causes: slips, trips and lapses. They are caused by failures of personal protection.

Second, organizational accidents: rare occurrences that are more widespread and have devastating consequences. These systems have many defences. Such ‘big bangs’ are low in frequency and high in severity, and are caused by a combination of multiple, linked failures.

A key question is: do individual accidents provide a reliable guide to a system’s vulnerability to organizational accidents? The answer is no. The road to a big disaster is often paved with declining or low lost-time injury frequency rates (LTIFRs).

Resisting Change in Organizational Culture - Chapter 11


Because of the many organizational accidents of the 1980s, the 1990s became the safety culture decade. The focus shifted from computer-based technologies (chemical processing, nuclear power, etc.) to more traditional industries like infrastructure, mining and construction.

There are three ways in which a poorly applied safety culture can undermine a complex system’s protective layers (the slices of the Swiss Cheese Model):

  • A poor safety culture increases the number of defensive weaknesses caused by active failures (errors and violations). This happens especially in organizations that neglect to identify error traps. More dangerously, a poor safety culture also encourages an atmosphere of non-compliance.
  • An inability to recognize and respect the operational hazards leads to longer-lasting holes in the defensive layers. These can arise through deficiencies in maintenance, testing and adjustment, through the wrong equipment, or through downgrading the importance of training.
  • The reluctance of an organization to deal proactively with its known deficiencies.

If one phrase captures the essence of an unsafe culture, it is unwarranted insouciance.

What Makes a Safe Culture?

Karl Weick says that the power of a safe culture lies in instilling a ‘collective mindfulness’ of the many ways in which hazards can penetrate, disable or bypass the system’s safeguards. Weick calls reliability and safety dynamic non-events: people assume nothing bad will happen today because they acted the same way yesterday, and yesterday nothing bad happened.

Gradations of Cultural Change

At any time an organization can be in one of the states listed below, a continuum representing the path of change. Only the last state involves a successful transition.

  • State 1: Don’t accept the need for change. The managers are happy with the status quo; they do not believe they have a problem and are satisfied with the way they are achieving their targets.
  • State 2: Accept the need for change, but don’t know where to go. There is concern over a series of bad events. They recognize that the existing safety measures are inadequate, but the cultural deficiencies are not understood.
  • State 3: Know where to go, but don’t know how to get there. Acknowledge that the existing safety measures are less than adequate, and accept the cultural deficiencies but unsure how to make the necessary improvements.
  • State 4: Know how to get there, but doubt whether the organization can afford it. Current projects are overrunning the budget, so they are willing but not able.
  • State 5: Make changes, but do them only cosmetically. Making changes, but with short cuts.
  • State 6: Make changes, but no good comes of them. The organization’s model of the change is not realistic and does not align with the real world.
  • State 7: Model aligns today but not tomorrow. The change only brings limited benefits due to unforeseen changes.
  • State 8: Successful transition. The change in the organization keeps up with the dynamic world and brings benefits.

Vulnerable System Syndrome

It is usually bad luck when the holes in the defensive system align to create an accident, but some organizations are especially prone to accidents. These suffer from the Vulnerable System Syndrome (VSS), which has three interacting and self-perpetuating elements: blaming front-line operators, denying the existence of systemic error-provoking features, and the blinkered pursuit of the wrong kind of excellence. VSS is present to some degree in all organizations; what matters is whether the organization takes effective remedial action. Blame and denial are the more dangerous elements of VSS and operate at both the personal and the organizational level.

At the personal level, several factors feed the dynamics of blame and denial:

  • Fundamental attribution error: When someone else performs badly, we blame the person; when we ourselves perform badly, we attribute it to situational factors.
  • Illusion of free will: People (especially in the west) value the belief that we are in control of our own destinies.
  • ‘Just world’ hypothesis: This is the belief that bad things happen to bad people and vice-versa. This also means that the person is judged by the severity of the outcome.
  • Hindsight bias: The tendency to see past events as more foreseeable than they actually were at the time.
  • Outcome bias: The tendency to evaluate prior decisions according to their outcomes. A related belief is that bad outcomes can only come from bad decisions, but history shows this is untrue.

At the organizational level, there are also factors that feed blame and denial:

  • Shooting or discounting the messenger: Ron Westrum distinguished three kinds of safety culture: pathological (organizations that shoot the messenger and ignore or deny the information), bureaucratic (the large majority of organizations, which listen to the messenger but don’t really know what the message means) and generative (organizations that welcome the messenger, praise him and treat the message very seriously).
  • Principle of least effort: It is fairly easy to find a mistake that an individual made, which in some organizations causes the investigation into the error to stop early.
  • Principle of administrative convenience: By restricting the investigation to the people in direct contact with the system, it becomes easier to blame someone.
  • Entrapment: Weick described the culture of entrapment: ‘through repeated cycles of justification, people enact a sensible world that matches their beliefs, a world that is not clearly in need of change’.
  • Organizational Silence: A climate in an organization where people don’t speak up when they feel there is an issue.
  • Workarounds: Front-line workers spend much of their time solving local problems (daily tweaking, massaging, adjusting) to get the job done. When there is a problem they tend to work around it instead of fixing the underlying organizational problem.
  • The normalization of deviance: Certain defects become so commonplace and so apparently inconsequential that their risk significance is gradually downgraded, until they are seen as routine wear and tear.

Cultural Strata

Patrick Hudson extended Ron Westrum’s typology of organizational safety cultures, identifying stages that each have to be passed through to reach the next level. The first stage is pathological, where blame and denial are the cheaper and faster way to deal with problems. The second stage is reactive, where safety is only given attention after an event, out of concern about adverse publicity. The third stage is bureaucratic/calculative, where systems exist to manage safety but often only because of external pressure, and strictly by the book. The fourth stage is proactive: the organization is aware of the error traps in its system and seeks to eliminate them before they do harm. The final stage is generative: risks are anticipated, respected and responded to, in an adaptive, flexible and learning culture that strives for resilience.

Medical Error - Chapter 12


The current widespread concern with patient safety started with the publication of the US Institute of Medicine (IOM) report ‘To Err is Human’. It was quickly followed by a series of national reports focusing on the lives lost and the money wasted: around 1 in 10 patients in acute care hospitals were killed or injured as a consequence of a medical error. These medical errors are not truly medical; they are errors that occur in a medical setting.

The Paradox of health care

All paradoxes have at least two contradictory elements.

The first part of the paradox is that health care training rests on a belief in trained perfectibility: after medical school you are expected to get it right. Errors are therefore equated with incompetence, and as a result there has been no tradition of reporting or learning from them. Students also do not learn about error-producing situations.

The second part of the paradox compares two of the most error-provoking activities: aircraft maintenance and delivering health care. The comparison is not a perfect model, but it does show the large differences between the two domains:

  • Huge diversity of activities and equipment: Long-distance pilots only fly a few types of airplanes whereas health professionals have to work with a wide variety of equipment.
  • Hands-on work with limited safeguards: Pilots are not encouraged to touch the flying controls (most planes fly on automatic pilot) but health care work is very hands-on. But the more you touch, the more mistakes you can make.
  • Vulnerable and needy patients: Unlike airline passengers, patients are sick or injured. There is also the lethal convergence of benevolence: because health professionals care about their patients, they do whatever is necessary to save them, even if that means breaking the rules or bypassing safeguards.
  • Local event investigation: Adverse events are mostly investigated locally. This means that the lessons learned are not widely published.
  • One-to-one or few-to-one delivery: Health care is very personal.

In short, health care is error-provoking in nature, yet training in error is non-existent and fallibility is stigmatized: this is the paradox.

Models of Medical Errors

There are two unhelpful models, the Plague Model and the Legal Model, and two useful models, the Person Model and the System Model.

The Plague Model: One reaction to the high incidence of errors in health care was to call it a ‘national problem of epidemic proportions’. This model wants to eliminate human error, but unlike an actual disease there is no cure: we cannot fundamentally change the human condition, only the conditions under which errors are made. A related view holds that human errors are the product of deficiencies in the human condition; on that view the only solution would be ever higher levels of automation to keep humans out of the loop. The problem with this model is that it treats human error as something purely bad, which is not the case.

The Legal Model: Its central assumption is that trained professionals should not make mistakes. Errors are taken to be rare, and those with bad consequences are seen as negligence or recklessness. However, research shows that errors are not rare and that even highly trained professionals make many mistakes. In health care especially, errors are very common and are often corrected. Clearly, training does not remove the chance of making mistakes.

The Person Model: This model sees errors as the product of wayward mental processes: forgetfulness, inattention, preoccupation, distraction, ignorance, carelessness and so on. Remedial measures target the patient-professional interface and include naming, shaming, blaming, retraining, fear appeals and writing yet another procedure. Although blaming is very common, it is ineffective and sometimes even counterproductive: it isolates the person from the context and gets in the way of identifying error traps. We should be talking about error-prone situations instead of error-prone people.

The System Model: Its premise is that humans are fallible and errors are to be expected. Errors are seen as consequences, not causes. When an adverse event occurs, the focus should be on why the defences failed, not on who did it.

Conclusion

Since the 1990s the patient safety issue has become far more salient and significant. The success does not show in a dramatic decrease in errors, though there has been modest progress in some areas (such as ward design and the use of checklists). Health care professionals are still very fallible and the delivery of health care remains error-provoking. What the movement has changed is not so much the error rate as the way we look at errors, and it has left us understanding the nature of medical error much better.

Disclosing Error - Chapter 13


Should healthcare professionals tell their patients that they made an error, even when it has only minor consequences? Or would this lead to more lawsuits? These questions are central to this chapter, which focuses on developments in the US.

Reporting Systems

Besides launching the patient safety movement, the IOM report set in motion a series of events that led directly to the disclosure issue (to tell or not to tell). The publication led to an increase in malpractice premiums and to more state mandatory reporting systems. These systems serve three purposes:

  • They protect patients with the knowledge that any serious errors must be reported and will be investigated.
  • They give healthcare organizations an incentive to take a systematic approach to patient safety. The system approach brings mistakes into the open in order to fix organizational flaws and avoid future lapses; the tort system, by contrast, blames individuals for their errors and uses punishment to deter future lapses.
  • Mandatory reporting systems require all healthcare facilities to make some level of investment in patient safety.

Reporting Obstacles

As mentioned before, errors in healthcare are neither accepted nor expected: they are stigmatized, marginalized and equated with medical incompetence. It is only logical that the new reporting systems made health professionals hesitant, since reporting might lead to more malpractice suits. However, research shows no causal connection between mandatory reporting and more malpractice suits.

Whereas disclosure (acknowledging that mistakes can happen) has been around a little longer, the practice of offering an apology is newer. The Veterans Affairs Medical Centre (VAMC) gets the credit for getting the ball rolling on apologies: in 1987 it adopted the ‘disclose-apologize-compensate’ approach, in sharp contrast to the ‘deny and defend’ approach. Twelve years later this approach was confirmed as the way forward for the medical profession: research concluded that compensating patients and meeting their interests was relatively inexpensive compared with malpractice lawsuits. Nancy Lamo suggests that the IOM and VAMC reports brought about a cultural change towards transparency, disclosure and apology in the medical profession.

In 2005 Barack Obama and Hillary Clinton proposed a bill to establish the National Medical Error Disclosure and Compensation Program, building on the Patient Safety and Quality Improvement Act; in 2006 the bill was still pending in the Senate.

How an apology is framed and delivered is important. Albert Wu (Johns Hopkins) pointed out that it is not so much what is said as what the recipient hears that matters. Lee Taft argues that an authentic apology must contain:

  • An acknowledgment that a rule or protocol has been violated
  • An expression of genuine remorse and regret for any harm caused
  • An explicit offer of restitution
  • A promise of reform

The University of Michigan Health System (UMHS) Program: A Before and After Study

The study compares liability claims before and after the implementation of the UMHS disclosure-with-offer program (disclosing and compensating). The results show fewer lawsuits, faster resolution and lower costs. Richard Boothman said that honesty is the basis of the three principles embodied in the UMHS program. First, when inappropriate medical care causes harm, the care provider owes the patient quick compensation. Second, when there was no patient injury, caregivers should get a thoughtful and vigorous defence. Third, underlying both is the need to learn from patients’ experiences to improve the overall quality of care.

Criticism of this study

Of course, not everyone agrees. David Studdert, Dr Atul Gawande and Dr T. Brennan, three influential figures in the medical world, co-wrote a paper taking a contrary view of the IOM report. They wrote that the chance that disclosure would decrease either the frequency or the cost of malpractice litigation was remote; they even thought it likely that costs and litigation volume would rise. Their conclusion rests on the following observations: the vast majority of patients who suffer as a result of a medical error never sue, partly because most people do not know that they are victims of a medical error. If these people are made aware that they were victims, the authors conclude, the number of claims will rise.

Looking back - Chapter 14


Recap

Looking back on everything covered in this book: it began in James Reason’s kitchen while he was making tea and ended in health care and the court of law, taking in many institutions and several continents along the way. Two lessons emerged from researching this topic: learn as much as possible about the details of how people in these various domains actually work, and never be judgemental.

James Reason has been employed full-time by two universities: the University of Leicester and the University of Manchester. He also mentions his mentors: Jens Rasmussen, Donald Broadbent, Don Norman and Berndt Brehmer. During the 1970s and 1980s he was busy creating classifications of the varieties of unsafe acts. One of the earliest classifications is the one below:

  • Slips and Lapses vs. Mistakes
  • Rule-Based and Knowledge-Based Mistakes
  • Errors and Violations
  • Active vs. Latent Human Failures

Some distinctions

Much of the last distinction was inspired by the observations of Mr Justice Sheen regarding the capsize of the Herald of Free Enterprise (a ferry). Some of the latent failures behind the capsize were insufficient ballast pumps, no remote bridge indicators, inadequate scuppers and inadequate checking of the number of passengers aboard.

The distinction between active and latent failures is based on the length of time before the failure has a bad outcome, and on where in the organization the failure occurs. Those in direct contact with the system commit active failures, which have an immediate, short-lived impact. Latent failures have delayed effects that can lie dormant for years; they are committed by those further from the system (higher up).
