
We all make errors. Let’s not judge those involved without understanding...

  • GLOC
    Moderator
    • Dec 2012
    • 2842


    We all make errors. Let’s not judge those involved without understanding how it made sense to them. - Original blog here

    “People make errors, which lead to accidents. Accidents lead to deaths. The standard solution is to blame the people involved. If we find out who made the errors and punish them, we solve the problem, right? Wrong. The problem is seldom the fault of an individual; it is the fault of the system. Change the people without changing the system and the problems will continue.”
    Don Norman, author of The Design of Everyday Things
    When looking at failures we need to understand why it made sense for those involved to behave or operate the way they did (see the post about local rationality). Human Factors and Ergonomics (HF/E) is a recognised science which deals with this, and it really came to the fore in World War II with two events. The first was the use of checklists to improve aviation safety following the crash of the prototype B-17; the second was a focus on the ‘system’, because technologies were advancing faster than humans could adapt and errors were still happening.

    In 1943 the U.S. Army Air Forces called in psychologist Alphonse Chapanis to investigate repeated instances of pilots making a dangerous and inexplicable error: the pilots of certain models of aircraft would safely touch down and then mistakenly retract the landing gear. The massive aircraft would scrape along the ground, exploding into sparks and flames. Despite numerous recommendations for more training and closer adherence to checklists, the issue continued. Chapanis interviewed pilots, but he also carefully studied the cockpits. He noticed that on B-17s the two levers that controlled the landing gear and the flaps were identical and placed next to each other. These levers were like the little dolly light switches below.



    Normally a pilot would lower the landing gear and then raise the wing flaps, which act as airbrakes and push the plane down onto the wheels. But in the chaos of wartime, the pilot could easily grab the wrong lever and retract the landing gear when he meant to raise the flaps. Chapanis’s solution was to attach a small rubber wheel to the landing gear control and a flap-shaped wedge to the flap control. Pilots could immediately feel which lever was the right one, and the problem went away. In this case it was clear that the problem wasn’t the pilot, but the design of the technology that surrounded him.

    Considering the above, and applying it to diving in particular, the following should be taken as true:
    • The best people can make the worst mistake
    • Systems will never be perfect
    • Humans will never be perfect


    When diving incidents are discussed, the failure is often attributed to the individual for ‘not following the rules’, ‘exceeding their certifications’ or ‘just lacking common sense, as it was obvious what was going to happen’. As diving is a recreational sport, there is very little that can be done in terms of punishment (and so it should be), but a more powerful punishment is often dispensed: criticism through social media. This includes postings by armchair divers who were not there, could not know the level of knowledge or training of those involved, could not know the motivation and goals of those involved (see the Situational Awareness blog), but still make judgements about ‘how stupid could they be’.

    Whilst the majority of diving sits outside formal rules, some do still apply: in the UK, for example, if a DCS event occurs while operating at a commercial site, it should be reported to the site staff. The primary reason is to provide immediate first aid and evacuation if needed. If fear of an investigation by the HSE (or equivalent) means no reporting occurs, no-one learns from the event and, more importantly, treatment can be delayed with potentially serious consequences. In a survey I conducted two years ago, more than half of the instructors who responded had not reported DCS; the reasons for not reporting were not captured. Unfortunately, under-reporting is a major issue across many industries, which means the scale of the problem is rarely known.

    The fear of reporting, whether in a constructive manner on social media or through official channels when required, is indicative of the lack of a Just Culture.

    Just Culture

    So what is a Just Culture? It is one that is open, fair, and focused on learning; combined with the design of safe systems and the management of behavioural choices, it creates an effective safety culture. The majority of research into Just Culture has been conducted in formal, established environments, so the framework is normally about punishments framed around legal or HR requirements and rules. The majority of diving is not like this, but learning across the community is still essential: there is no way divers can be taught everything on a training course, and some of the lessons the real world could teach could end with someone dead!

    One of the myths of a Just Culture is that it is blame free. This is not the case. If reckless behaviours have been undertaken where there is a duty of care, they need to be reported in the interest of safety. Outside of the duty of care construct, risk is managed at an individual level, but risk perception and acceptance are individually referenced, therefore the feedback system is different. Not reporting your error, and thereby preventing the system from learning, is the greatest problem of all. Some actions do warrant disciplinary or enforcement action (when operating in an environment with a duty of care, be that voluntary or commercial), or some other means of correcting what has happened so it doesn’t happen again in other environments.

    The key question is: where do you draw the disciplinary line for those with a duty of care, or the line for negative criticism when outside a duty of care? To answer that, we all need to understand the differences between human error, risky behaviour and recklessness.

    The big three to consider are:

    Human error: inadvertent action; inadvertently doing other than what should have been done; a slip, lapse or mistake
    Risky behaviour: choices that increase risk, where risk is not recognised or is mistakenly believed to be justified – includes violations and negligence
    Reckless behaviour: behavioural choice, intentional acts, conscious disregard to a substantial and unjustifiable risk

    The diagram below is from the UK Military Aviation Authority (MAA) and shows the spectrum of options based around a ‘duty of care’ construct. However, for it to work effectively, the person(s) doing the assessment need to be impartial and have nothing to gain from any outcome of the decision-making process. Such a model does not exist for recreational activities, but the concepts are still valid.



    The next blog will look at these subjects in more detail and at ways in which we can address the problems we face. These include learning from incidents, accidents and failure, and creating an environment in which divers can talk about failure. If reporting isn’t happening at a commercial level, there are likely bigger issues to resolve than the immediate fallout from the current event, such as why real events like DCS are not being reported.

    Until then, consider that whilst we don’t have a formal disciplinary framework for the majority of diving, we do have social judgement, and you are part of that. If you see something adverse happening, hard as it may be, don’t judge how they could have made such a silly mistake; rather, consider how they determined that it was a ‘safe’ decision. That decision will have been based on (in)experience, motivations, previous outcomes from similar situations and training, all of which are likely to be different from yours.

    Learn from your mistakes; better still, learn from others’.

    Footnote:

    Human Factors Skills in Diving classes teach us, in a safe space, to learn from others and to improve our own safety methods and attitudes. They provide food for thought on how to create a better, easier learning environment for our own students, and they allow us to be real human beings. There is intentionally lots of failure to learn from on the course.

    More information on Human Factors Skills in Diving classes can be found at www.humanfactors.academy

    Upcoming classroom-based course dates are here https://www.humanfactors.academy/p/dates

    Online micro-class (9 modules of approximately 15 mins each) details are here https://www.humanfactors.academy/store/C3V4Jnz4
    Gareth

    www.imagesoflife.co.uk - Underwater Print Sales, Teaching and Stock Library
    www.cognitas.org.uk - Improving Safety by Challenging Current Practices
    www.divingincidents.org - Diving Incident and Safety Management System (DISMS)
    - 2014 Report here

    “Set your expectations high; find men and women whose integrity and values you respect; get their agreement on a course of action; and give them your ultimate trust.”

    “It is far better to be trusted and respected than it is to be liked.”
  • Adrian
    Remember, remember
    • Dec 2012
    • 2552

    #2
    Originally posted by GLOC
    Considering the above, and diving in particular, the following should be considered true.

    • The best people can make the worst mistake
    • Systems will never be perfect
    • Humans will never be perfect
    The 'best' people have made the worst mistake.

    I've added quotes to 'best' as I think in non-commercially regulated diving we have an issue with personality cults. 'Best' is often seen as meaning deeper, longer, trickier and often 'gobbier'. Those who blow their own horn get more recognition than those who just do their thing quietly, therefore they must be 'better'. Yet they may not be setting the best example; they're just getting away with it.

    If we are to establish a more just culture, somehow these 'best' divers need to be encouraged to set an example by openly reporting issues. If they have the courage to do some of the dives they do, then perhaps they have the courage to go public? I've heard various people mention your work in other talks at Eurotek etc, yet I can't recall any open reports from big names re their own incidents. I think we need these examples to encourage those who might otherwise feel that they are on their own. It's hard to swim against the flow.


    Another reporting issue I've noticed is the tendency to report incidents based on the seriousness of the actual outcome, rather than the potential outcome. Therefore 'getting away with it' is not reported, not even as a near miss. So we might not get a full picture of the areas where reporting might reap the most benefit.


    Keep up the good work Gareth.

    Adrian
    Bought a house in Devon, drank cider from a lemon.


    • GLOC
      Moderator
      • Dec 2012
      • 2842

      #3
      Adrian,

      Funnily enough, the point about outcome bias will be in a later blog linked to this. Industries (and indeed the HSE) are now looking at potential outcome rather than actual outcome as a measure of risk, and therefore of fines. The difficulty then becomes who judges an acceptable level of risk, especially when we look at recreational activities.

      Regarding asking 'names' to speak up about their adverse events, definitely. The challenge there is that litigation is still a massive player in the US and it wouldn't be unheard of for a lawyer to trawl through websites looking for errant behaviours and cite them as a reason why the current event happened. Creating a safety culture requires a number of pieces of the jigsaw puzzle to be in place and, without doubt, the Just Culture piece is pivotal to achieving the others, especially given the current trend in the US to sue anyone or anything that was involved in the event - see the Wes Skiles case as an example. Why should a CCR manufacturer be responsible for the diver taking medication which was not conducive to safe diving, diving the unit without training, isolating the O2 feed (presumably to reduce the noise), sharing a bailout between the team and leaving it in the middle of the video shoot area...and the list goes on.

      From work by Hudson and Reason, a safety culture is made up of the following pieces:

      • an informed culture: one in which those who manage and operate the system have current knowledge about the human, technical, organisational and environmental factors that determine the safety of the system as a whole, including the ways in which the system's defences can be breached;
      • a reporting culture: a culture in which people are willing to report errors and near misses;
      • a just culture: a culture of 'no blame' where an atmosphere of trust is present and people are encouraged or even rewarded for providing essential safety-related information, but where there is also a clear line between acceptable and unacceptable behaviour;
      • a flexible culture: one which can take different forms but is characterised by a shift from the conventional hierarchical mode to a flatter professional structure; and
      • a learning culture: the willingness and the competence to draw the right conclusions from the safety information system, and the will to implement major reforms when the need is indicated,

      and they all work together to improve safety and performance.

      Thanks for the support!

      Regards
      Gareth

      www.imagesoflife.co.uk - Underwater Print Sales, Teaching and Stock Library
      www.cognitas.org.uk - Improving Safety by Challenging Current Practices
      www.divingincidents.org - Diving Incident and Safety Management System (DISMS)
      - 2014 Report here

      “Set your expectations high; find men and women whose integrity and values you respect; get their agreement on a course of action; and give them your ultimate trust.”

      “It is far better to be trusted and respected than it is to be liked.”
