References

Brindley PG. Patient safety and acute care medicine: lessons for the future, insights from the past. Crit Care. 2010; 14:(2) https://doi.org/10.1186/cc8858

Bromiley M. The husband's story: from tragedy to learning and action. BMJ Qual Saf. 2015; 24:(7)425-427 https://doi.org/10.1136/bmjqs-2015-004129

College of Paramedics. Paramedic Curriculum Guidance. Bridgwater: College of Paramedics; 2015

Cosby KS. Authority gradients and communication. Philadelphia: Lippincott Williams & Wilkins; 2009

Croskerry P. From mindless to mindful practice: cognitive bias and clinical decision making. N Engl J Med. 2013; 368:(26)2445-2448 https://doi.org/10.1056/NEJMp1303712

Driskell JE, Salas E, Johnston J. Does stress lead to a loss of team perspective? 2000. https://tinyurl.com/y7v36gdl (accessed 6 January 2019)

Etchells E. Anchoring bias with critical implications. AORN J. 2016; 103:(6)

Endsley MR. Toward a theory of situation awareness in dynamic systems. Human Factors. 1995; 37:(1)32-64 https://doi.org/10.1518/001872095779049543

Greig PR, Higham H, Nobre AC. Failure to perceive clinical events: an under-recognised source of error. Resuscitation. 2014; 85:(7)952-956 https://doi.org/10.1016/j.resuscitation.2014.03.316

Haerkens MH, Kox M, Lemson J, Houterman S, van der Hoeven JG, Pickkers P. Crew resource management in the intensive care unit: a prospective 3-year cohort study. Acta Anaesthesiol Scand. 2015; 59:(10)1319-1329 https://doi.org/10.1111/aas.12573

Harmer M. Independent review on the care given to Mrs Elaine Bromiley on 29th March 2005. 2005. https://tinyurl.com/y95y3726 (accessed 27 December 2018)

Health and Care Professions Council. Continuing professional development and your registration. London: HCPC; 2017

Kreitz C, Furley P, Memmert D, Simons DJ. Inattentional blindness and individual differences in cognitive abilities. PLOS One. 2015; 10:(8) https://doi.org/10.1371/journal.pone.0134675

LaSala CA, Bjarnason D. Creating workplace environments that support moral courage. Online J Issues Nurs. 2010; 15:(3) https://doi.org/10.3912/OJIN.Vol15No03Man04

MacDonald RD. Articles that may change your practice: crew resource management. Air Med J. 2016; 35:(2)65-66 https://doi.org/10.1016/j.amj.2015.12.010

Mercer S, Park C, Tarmey NT. Human factors in complex trauma. Contin Educ Anaesth Crit Care Pain. 2014; 15:(5)231-236 https://doi.org/10.1093/bjaceaccp/mku043

Moriarty D. Automation management. In: Moriarty D. London: Elsevier; 2015

Murata A, Nakamura T, Karwowski W. Influence of cognitive biases in distorting decision making and leading to critical unfavourable events. Safety. 2015; 1:(1)44-58 https://doi.org/10.3390/safety1010044

Neisser U. The control of information pickup in selective looking. In: Perception and its Development. Hillsdale NJ: Lawrence Erlbaum Associates; 1979

Nicholson J. Response bias. In: Nicholson J. Oxford: Oxford University Press; 2014

Nickerson RS. Confirmation bias: a ubiquitous phenomenon in many guises. Rev Gen Psychol. 1998; 2:(2)175-220

Ogdie AR, Reilly JB, Pang WG et al. Seen through their eyes: residents' reflections on the cognitive and contextual components of diagnostic errors in medicine. Acad Med. 2012; 87:(10)1361-1367

Peters GA, Peters BJ. Medical error and patient safety: human factors in medicine.Boca Raton FL: CRC Press; 2007

Redelmeier DA, Shafir E, Aujla PS. The beguiling pursuit of more information. Med Decis Making. 2001; 21:(5)376-381

Roese NJ, Vohs KD. Hindsight bias. Perspect Psychol Sci. 2012; 7:(5)411-426 https://doi.org/10.1177/1745691612454303

Saposnik G, Redelmeier D, Ruff CC, Tobler PN. Cognitive biases associated with medical decisions: a systematic review. BMC Med Inform Decis Mak. 2016; 16

Sato H, Takenaka I, Kawahara JI. The effects of acute stress and perceptual load on distractor interference. Q J Exp Psychol (Hove). 2012; 65:(4)617-623 https://doi.org/10.1080/17470218.2011.648944

Sexton B, Thomas EJ, Helmreich RL. Error, stress, and teamwork in medicine and aviation: cross sectional surveys. BMJ. 2000; 320:(7237)745-749

Soar J, Mancini ME, Bhanji F et al. Part 12: Education, implementation, and teams: 2010 International Consensus on Cardiopulmonary Resuscitation and Emergency Cardiovascular Care Science with Treatment Recommendations. Resuscitation. 2010; 81:288-330 https://doi.org/10.1016/j.resuscitation.2010.08.030

Staal MA. Stress, cognition, and human performance: a literature review and conceptual framework. 2004. https://tinyurl.com/y99rtypt (accessed 28 December 2018)

Stiegler MP, Neelankavil JP, Canales C, Dhillon A. Cognitive errors detected in anaesthesiology: a literature review and pilot study. Br J Anaesth. 2012; 108:(2)229-235 https://doi.org/10.1093/bja/aer387

St. Pierre M, Scholler A, Strembski D, Breuer G. Do residents and nurses express safety-relevant concerns? Simulator study on the influence of the authority gradient [article in German]. Der Anaesthesist. 2012; 61:(10)857-866 https://doi.org/10.1007/s00101-012-2086-1

Toft B, Reynolds S. Learning from disasters: a management approach, 3rd edn. Leicester: Perpetuity Press; 2005

Tschan F, Semmer NK, Gurtner A et al. Explicit reasoning, confirmation bias, and illusory transactive memory: a simulation study of group medical decision making. Small Group Res. 2009; 40:(3)271-300 https://doi.org/10.1177/1046496409332928

Tversky A, Kahneman D. Judgment under uncertainty: heuristics and biases. Science. 1974; 185:(4157)1124-1131

Tyler TR. Why people cooperate: the role of social motivations, 1st edn. Princeton NJ: Princeton University Press; 2011

Wright MC, Endsley MR. Building shared situation awareness in healthcare settings. In: Nemeth CP. Aldershot: Ashgate Publishing Limited; 2008

Yeung J, Perkins G, Davies R et al. Introducing non-technical skills teaching to the Resuscitation Council (UK) Advanced Life Support Course. Resuscitation. 2014; 85:(1)

Yorkshire Ambulance Service. Business Continuity Plan for Hazardous Area Response Team. V. 5.0. Wakefield: YAS; 2017

Human factors, cognitive bias and the paramedic

02 January 2019
Volume 11 · Issue 1

Abstract

The consequences of human factors and cognitive bias can be catastrophic if unrecognised. Errors can lead to loss of life because of the flawed nature of human cognition and the way we interact with our environment. Seemingly small mistakes or miscommunications can lead to negative outcomes for patients and clinicians alike. It is easy to see therefore why the College of Paramedics now recommends the teaching of human factors at higher education institutions. Using a problem-based approach, this article aims to inform prehospital clinicians about how human factors and cognitive bias can affect them and their practice, and how these can be mitigated.

Human factors include organisational and environmental factors, job-specific issues and the effects of individual characteristics on interaction and communication (MacDonald, 2016). In health care, this generally means the way clinicians interact with their work environment, interprofessional communication and communication between clinicians and patients. Perhaps more importantly, it also refers to how we mentally interpret these interactions and how we make decisions based upon them. The importance of these factors, and their link to avoidable harm and death, is being increasingly recognised within health care (Brindley, 2010; Haerkens et al, 2015). The College of Paramedics (CoP) (2015) has recognised this and now recommends the teaching of human factors to new clinicians within its curriculum guidance. This guidance does not, however, reach staff who are already working in clinical practice.

Croskerry (2013) suggests cognitive biases arise because of the way human beings process data. This occurs in one of two ways: as either Type I ‘intuitive’ or Type II ‘analytical’ processing. Intuitive reasoning, as the name suggests, offers fast answers, but with little reasoning or insight into the information process. Conversely, analytical reasoning is generally a slow, deliberate and rational process. Contrary to what we may wish to believe about ourselves as clinicians, Croskerry (2013) suggests that many of our decisions are made as part of Type I processing, which predominantly involves pattern recognition and application of experience rather than in-depth analytical cognition. In a relatively recent systematic review, 71% of studies found an association between cognitive bias and medical mismanagement (Saposnik et al, 2016). An understanding of human factors would include an appreciation of the effects of cognitive bias. Unlike human factors, however, there is little reference to cognitive bias training within the CoP (2015) Curriculum Guidance.

The present article offers scenario-based examples demonstrating the effects of human factors and cognitive bias, with which the author suggests clinicians and undergraduates will identify through their experience of clinical practice. It also covers how these factors can be mitigated in order to aid safe and effective practice.

Human factors

Authority gradient

Scenario

You are a newly qualified paramedic working with an experienced clinical team leader on your first shifts following registration. You attend a patient who has chronic obstructive pulmonary disease (COPD) and your crewmate is attending. The patient has a respiratory rate of 28 breaths per minute and a pulse rate of 120 per minute. He has a history indicative of infection and is pyrexial. To your surprise, your colleague begins to discuss GP referral and explains to the patient that he should expect to be short of breath owing to his infection and medical history. You know this is not the correct course of action, but you feel uncomfortable challenging your colleague.

Definition

Perceived authority in a healthcare setting usually correlates with qualification, job role and experience, though other demographic factors such as gender, ethnicity and age may also play a role (Cosby, 2009). Relating to the scenario described, a newly qualified paramedic will hold less ‘authoritative currency’ than a senior paramedic who has years of experience. The existence of authority in health care can be a double-edged sword. Authority can be used positively to motivate staff and assist with conflict resolution (Tyler, 2011). However, authority gradients become an issue when junior members of staff recognise, but fail to speak up about, errors being made by authority figures. The problem is not the error itself but the difficulty that junior members of staff may have in raising their concerns. Multiple real-world cases have demonstrated the consequences of unchallenged errors made by those in a position of power (Cosby, 2009; Bromiley, 2015). A simulation study conducted in Germany found that, despite identification of errors by the junior clinician, successful interventions to halt these were made less than 10% of the time (St. Pierre et al, 2012).

A large proportion of the available evidence relating to authority gradients pertains to the anaesthetic environment. Like anaesthetists, paramedics train to deal with airway management and medical emergencies, although in uncontrolled and sometimes stressful environments. Paramedics predominantly work as part of a team of mixed experience and differing skill sets. It should be noted, however, that the anaesthetic environment differs greatly from the prehospital environment in terms of staffing levels and grades, access to advanced equipment and environmental control. Despite this, the author would suggest that these studies can still draw conclusions about human fallibility, as this occurs in all settings, and that their data and conclusions are therefore relevant to clinicians in prehospital care.

Mitigation

Mitigating the risk that authority gradients pose to safe patient care is critical. Effective mitigation, however, relies on multiple factors. Firstly, staff need to feel supported and empowered within their work environment. Empowered staff are more likely to demonstrate moral courage and speak up. Recognising these actions as courageous is important, as staff may feel they risk segregation, bullying or damaged career progression opportunities by alienating peers (LaSala and Bjarnason, 2010). Sexton et al (2000) point out that clinicians need to be well educated in human factors and their consequences. They show that experienced clinicians reject steep authority gradients because they understand the inherent risk in propagating these attitudes. In comparison, clinical staff who are untrained and unaware of the dangers of authority gradients are much more likely to support steep hierarchies.

St. Pierre et al (2012) suggest a communication framework which is understood by all staff as a ‘red flag’ from a clinician addressing a patient safety issue. This framework is designed to aid staff in stressful situations to assemble their thoughts in a logical and structured manner (Box 1).

Communication framework

  • Address the person concerned.
  • Bring your point of view in the form of an ‘I-Message’. Include keywords, for example ‘uncomfortable with’ or ‘concerned about’.
  • Clarify the problem or express your fears.
  • If necessary, propose alternatives.
  • Ask for an opinion, ‘What do you think?’.
  • Source: St. Pierre et al, 2012

Box 2 shows an example of this framework that the author has adapted to the described scenario.

Framework adapted to scenario

  • ‘*Clinician Name*,
  • I am uncomfortable with referring this patient to his GP.
  • I think he may have sepsis and if we leave him at home he could deteriorate rapidly.
  • I think we need to screen him for sepsis using our trust screening tool; if he is likely to have sepsis, then he needs to go to hospital.
  • What do you think?’
  • Source: Adapted from St. Pierre et al, 2012

Bandwidth and loss of situational awareness

Scenario

The importance of situational awareness (SA) can be demonstrated by the well-known case of Elaine Bromiley. Elaine attended a clinic for routine surgery, which was to be performed under a general anaesthetic. A loss of SA meant that when anaesthetists experienced a ‘can't intubate, can't ventilate’ scenario, they repeatedly attempted to re-intubate the patient rather than performing an emergency tracheostomy. Despite a nurse verbalising the availability of emergency tracheostomy equipment, it remained unused. Elaine's oxygen saturations remained at 40% for 20 minutes before the anaesthetists managed to increase them. The operation was eventually abandoned and Elaine was taken to recovery to wake up without ventilatory support. She was later admitted to the intensive care unit (ICU) and was found to have suffered a severe hypoxic brain injury. Ultimately, life support was withdrawn and Elaine Bromiley died 6 days later (Harmer, 2005; Bromiley, 2015).

Definition

Health care borrows the concept of bandwidth from information technology. This describes the human brain as a processor, and more importantly acknowledges it as a processor of limited power. The mental processing available to deal with a task is defined as ‘bandwidth’ (Mercer et al, 2014).

SA is a complex topic and the factors that result in a loss of SA are numerous. It is therefore impossible to cover these comprehensively within the current article; indeed, entire books have been written about SA. However, the author will attempt to discuss some of the more pertinent points around this subject in the prehospital environment.

Greig et al (2014) define SA as the ability of an individual to process information about the environment in which they are functioning. Similarly, Wright and Endsley (2008) describe SA as an accurate, internalised mental model. Maintaining good SA can be challenging, particularly as a result of the dynamic and unfamiliar nature of clinicians' practice within the prehospital environment, and its value cannot be overstated.

One experiment assessing the perception of safety-critical events during a simulated clinical scenario found that these were missed by two-thirds of the most highly experienced observers (Greig et al, 2014). In light of such worrying statistics, organisations are now taking note of the importance of SA. Following recommendations, training on SA has been incorporated into advanced life support courses (Soar et al, 2010; Yeung et al, 2014).

Endsley (1995) presents the seminal model of situational awareness in decision-making, and suggests numerous factors that can negatively impact on a clinician's situational awareness. Stress is one such factor, and Endsley splits it into two categories: physical and social-psychological stressors. Physical stressors such as noisy, poorly lit environments, inclement weather conditions and fatigue are commonplace in prehospital care; a combination, or all, of these factors could be experienced at one incident. Social-psychological stressors include mental load, time pressure and the importance or difficulty of the task being undertaken. Because dealing with these stressors also requires cognitive input, it reduces the bandwidth available for the tasks at hand.

An experimental study by Driskell et al (2000) supports this suggestion. The authors found that participants placed under stress reported increased distraction and demonstrated a loss of team perspective. Peters and Peters (2007) support this idea, suggesting that multiple stressors occurring in rapid succession, beyond what the clinician can cope with (known as overloading), can increase the likelihood of errors. Overloading and stress can result in a narrowing of attentional focus (Staal, 2004; Sato et al, 2012), resulting in the individual responding to the most salient cues and missing more subtle ones. Staal notes, however, that cue salience is determined unconsciously by the individual. As such, this too is prone to bias. These study data corroborate the idea of bandwidth and, more pertinently, bandwidth overloading.

It is easy to see, therefore, how the clinicians in Elaine Bromiley's case lost SA secondary to bandwidth overload. A high level of stress, caused by time pressure and the high importance and consequence of the task, led to a narrowing in the field of attention of the doctors attempting to intubate Elaine. This led to the exclusion of several important cues, such as a nurse offering a tracheostomy kit. Given the difficulty of intubating Elaine, and the consequences of failing to do so, such communication was missed because the anaesthetists simply lacked the bandwidth to process the information coming from the nurse. This may be explained by Staal's (2004) suggestion that cue salience is interpreted within the mind of the individual: it would make sense that the anaesthetists were focused on intubating rather than expending bandwidth on verbal communication, so this information was discounted as extraneous and ignored.

Mitigation

The author's experience is that some ambulance trusts will undertake a ‘general broadcast’ of calls with no resources to send, or calls whose response falls outside of the target time. This involves making an announcement over radio systems to all staff (including those who are currently dealing with patients) of category, nature and location of a call. Though the aim of this is to ensure that staff are available as soon as possible after completing calls, it may result in staff disengaging from the patient they are dealing with, especially if the call nature they overhear is emotive, such as a paediatric cardiac arrest. The author would suggest that restricting the knowledge clinicians have about increased service workload may decrease stress and allow them to focus on the task at hand. Pressure to complete calls more quickly than is realistically practicable may result in premature closure; this is another type of bias that occurs when a clinician fails to consider differential diagnoses after they have formed their initial impression (Etchells, 2016).

The author would suggest that having a member of staff that is ‘hands off’ helps to ensure the maintenance of SA. It is an established practice in hospitals to have a team leader overseeing the treatment of critically unwell patients in the emergency department (ED). This is also becoming established practice in prehospital care. The main notion behind this practice is for the clinician to use their bandwidth to oversee, but not become involved in, patient treatment. Clinical interventions can be numerous, especially in those patients who are critically ill or injured. By not becoming involved, it is supposed that this member of staff will be able to make better decisions, owing to their lack of need to allocate bandwidth to completing interventions. Despite the anecdotal benefits, however, there appears to be a lack of available evidence to support this idea.

There are some options to reduce physical stressors on clinicians. Moving the patient in a timely manner to an ambulance allows some relief from inclement weather and provides adequate lighting and a quiet environment to work in. If it is not possible to move the patient, resilience units nationally carry additional lighting for scenes, and tents to protect patients and crews who are likely to be exposed to the elements for long periods of time (Yorkshire Ambulance Service, 2017).

Cognitive bias

Scenario

You attend a patient whom you have previously met on multiple occasions. The patient is known to suffer from panic attacks. The patient has difficulty with, but does eventually respond to, coaching. The patient always complains of muscular chest pain following the panic attacks, but says the pain feels different today. Once the patient is calm, you note they remain tachypnoeic and tachycardic. Regardless of this, you tell yourself that they suffer from severe anxiety and discharge them into the community. You later reflect on this incident, unsure if you made the correct decision.

Confirmation bias

Definition

Confirmation bias is a type of psychological bias which results in decisions or beliefs being formed based on a clinician's preconceptions, with evidence to the contrary being discounted (Nickerson, 1998). Clinicians tend to overvalue the importance of clinically irrelevant information if it confirms their preconceptions. Conversely, they will underestimate the importance of relevant disconfirming information.

The failure to adequately adjust probabilities, and therefore one's approach, as new disconfirming information becomes available is known as anchoring. This occurs concurrently with confirmation bias (Tversky and Kahneman, 1974). In the case example, the ‘new information’ (altered observations and new chest pain) is discounted as irrelevant because of the preconceived notion that this patient is ‘just anxious’.

A simulation study found that confirmation bias occurred in more than 75% of simulated anaesthesiology sessions (Stiegler et al, 2012). Tschan et al (2009) demonstrated that the issue of confirmation bias is not necessarily restricted to one's own internal thought process. They showed that confirmation bias can influence auditory perception when auscultating a patient's lung sounds. In this case, multiple clinicians reported a unilateral air entry reduction on a standardised manikin with no diminished breath sounds.

Mitigation

Ogdie et al (2012) assert that reflective writing and narrative discussion are useful tools in developing avoidance strategies for diagnostic error caused by cognitive bias. These tools were used in conjunction with education about cognitive biases. The author would suggest that the paramedic profession already provides the tools to tackle this issue. As previously discussed, paramedic education is moving towards an increased level of entry education, and the teaching of human factors is incorporated into curriculum guidance for higher education institutions. Human factors training could easily be expanded to incorporate education about cognitive bias. Furthermore, all paramedics are required to undertake continuing professional development (CPD) to maintain professional registration with the registering body, the Health and Care Professions Council (HCPC). The HCPC (2017) advocates reflective practice within its CPD guidance. Therefore, paramedics who wish to mitigate cognitive bias in their practice are already in a system designed to help them achieve this. Some further examples of cognitive bias in paramedic practice are displayed in Table 1.


Table 1. Examples of cognitive bias in paramedic practice

  • Response bias: occurs when participants (e.g. patients) answer differently from the way they actually feel. This may be due to the way questions are phrased or because they are unwilling to be honest (Nicholson, 2014)
  • Inattentional blindness: observers (e.g. clinicians) fail to notice an apparently obvious event or occurrence when they are undertaking a challenging task (Neisser, 1979; Kreitz et al, 2015)
  • Normalcy bias: the minimisation of warning signs as ‘minor’ or ‘normal’, in order to prevent overreaction and help cope with stressful events. Failure to respond appropriately to such warnings may result in a situation becoming dangerous (Murata et al, 2015)
  • Automation bias: placing a higher level of trust in erroneous data provided by automated machines (e.g. automatic blood pressure cuffs) than in data provided by non-automated systems (Moriarty, 2015)
  • Hindsight bias: the belief that an event is more predictable, after it becomes known, than before the outcome was known (Roese and Vohs, 2012)

Availability heuristic

Definition

Bias can also occur as a result of the availability heuristic, whereby the probability of a future event occurring is judged by the ease with which a similar past event can be recalled (Tversky and Kahneman, 1974; Toft and Reynolds, 2005). This type of bias can be seen when airline ticket sales decrease following a high-profile accident. Toft and Reynolds (2005) suggest that this is because the probability of that event occurring subjectively increases in the mind of the passenger, though, in reality, a further accident is no more likely.

In prehospital care, a clinician who relies unduly on what they have witnessed previously, rather than carefully evaluating the patient's condition and objectively assessing the likely probabilities, risks making a biased decision. Linking this to the described scenario, it is easy to see how apparently concerning symptoms could be discounted as a result of this cognitive bias.

Mitigation

Redelmeier et al (2001) published a number of recommendations that apply to the availability heuristic and its role in decision-making. Primarily, they suggested that clinicians consider whether missing information is relevant. Spending large amounts of time performing unnecessary tests may result in irrelevant test results altering patient treatment and distracting from the presenting complaint. They also suggest that clinicians plan in advance of data being available. In prehospital care, this may be as simple as a clinician deciding: ‘If this patient has a GCS [Glasgow Coma Scale] score of less than 14 following a head injury, I will convey them to a major trauma centre.’ In this way, the act of gathering data cannot affect the decision-making; rather, decisions are dictated by the data. Finally, the authors suggest that clinicians share decision-making with colleagues who can independently review the data, as they have not been involved in the gathering process. The author would also suggest that clinicians should attempt to discount previous calls and treat each case individually as it presents itself.

Conclusion

Much of the evidence surrounding human factors and cognitive bias comes from the hospital or primary care environment. There appears to be little evidence looking specifically at human factors in the prehospital setting. The current article is not exhaustive; however, the author has tried to cover a range of human factors and cognitive biases pertinent to clinicians working in the prehospital environment, such as authority gradients, situational awareness, bandwidth, confirmation bias and the availability heuristic.

The overarching theme throughout the author's research for this article was that an awareness of human factors and cognitive bias mitigated some of the risks these subjects pose. As such, the author hopes the article will inform clinicians about issues they face in practice, and help them to overcome these, to aid safe and effective clinical care.

Key points

  • The consequences of human factors and cognitive bias can be catastrophic if unrecognised
  • The College of Paramedics now recommends the teaching of human factors at higher education institutions
  • Reflective writing and narrative discussion can be useful tools in developing avoidance strategies for diagnostic error caused by cognitive bias

CPD reflection questions

  • Can you think of any occasions when human factors or cognitive bias may have influenced your practice?
  • What strategies could be used to mitigate the effects of human factors in the future?
  • How could regional ambulance services use knowledge about human factors to support their staff?