Human factors include organisational and environmental factors, job-specific issues and the effects of individual characteristics on interaction and communication (MacDonald, 2016). In the case of human factors in health care, this generally means the way clinicians interact with their work environment, interprofessional communication and communication between clinicians and patients. Perhaps more importantly, it also refers to how we mentally interpret these interactions and how we make decisions based upon them. The importance of these factors and their link to avoidable harm and death is being increasingly recognised within health care (Brindley, 2010; Haerkens et al, 2015). The College of Paramedics (CoP) (2015) has recognised this and now recommends the teaching of human factors to new clinicians within its curriculum guidance. This guidance does not, however, reach staff who are already working in clinical practice.
Croskerry (2013) suggests cognitive biases arise because of the way human beings process data. This occurs in one of two ways: as either Type I ‘intuitive’ or Type II ‘analytical’ processing. Intuitive reasoning, as the name suggests, offers fast answers, but with little reasoning or insight into the information process. Conversely, analytical reasoning is generally a slow, deliberate and rational process. Contrary to what we may wish to believe about ourselves as clinicians, Croskerry (2013) suggests that many of our decisions are made as part of Type I processing. This predominantly involves pattern recognition and application of experience rather than in-depth analytical cognition. In a relatively recent systematic review, 71% of studies found an association between cognitive bias and medical mismanagement (Saposnik, 2016). An understanding of human factors would include an appreciation of the effects of cognitive bias. Unlike human factors, however, there is little reference to cognitive bias training within the CoP (2015) Curriculum Guidance.
The present article offers scenario-based examples demonstrating the effects of human factors and cognitive bias, which the author would suggest that clinicians and undergraduates will identify with through their experience of clinical practice. This article also covers mitigation of these factors in order to aid safe and effective practice.
Human factors
Authority gradient
Scenario
You are a newly qualified paramedic working with an experienced clinical team leader on your first shifts following registration. You attend a patient who has chronic obstructive pulmonary disease (COPD) and your crewmate is attending. The patient has a respiratory rate of 28 breaths per minute and a pulse rate of 120 beats per minute. He has a history indicative of infection and is pyrexial. To your surprise, your colleague begins to discuss GP referral and explains to the patient that he should expect to be short of breath owing to his infection and medical history. You know this is not the correct course of action, but you feel uncomfortable challenging your colleague.
Definition
Perceived authority in a healthcare setting is usually correlated to qualification, job role and experience—though other demographic factors such as gender, ethnicity and age may also play a role (Cosby, 2009). Relating to the scenario described, a newly qualified paramedic will hold less ‘authoritative currency’ than a senior paramedic who has years of experience. The existence of authority in health care can be a double-edged sword. Authority can be used positively to motivate staff and assist with conflict resolution (Tyler, 2011). However, authority gradients become an issue when junior members of staff recognise, but fail to speak up about, errors being made by authority figures. The problem is not the error but the difficulty that junior members of staff may have with raising their concerns. Multiple real-world cases have demonstrated the consequences of unchallenged errors made by those in a position of power (Cosby, 2009; Bromiley, 2015). A simulation study conducted in Germany found that despite identification of errors by the junior clinician, successful interventions to halt these were made less than 10% of the time (St. Pierre et al, 2012).
A large proportion of the available evidence relating to authority gradients pertains to the anaesthetic environment. There are, however, clear parallels: paramedics also train to deal with airway management and medical emergencies, in uncontrolled and sometimes stressful environments, and predominantly work as part of a team of mixed experience and differing skill sets. It should be noted that the anaesthetic environment differs greatly from the prehospital environment in terms of staffing levels and grades, access to advanced equipment and environmental control. Despite this, the author would suggest that these studies can still draw conclusions about human fallibility, as this occurs in all settings, and that their data and conclusions are therefore relevant to clinicians in prehospital care.
Mitigation
Mitigating the risk that authority gradients pose to safe patient care is critical. Effective mitigation, however, relies on multiple factors. Firstly, staff need to feel supported and empowered within their work environment. Empowered staff are more likely to demonstrate moral courage and speak up. Recognising these actions as courageous is important, as staff may fear isolation, bullying or damage to career progression opportunities if they alienate peers (LaSala and Bjarnson, 2010). Sexton et al (2000) point out that clinicians need to be well educated in human factors and their consequences. They show that experienced clinicians reject steep authority gradients because they understand the risk inherent in propagating these attitudes. In comparison, clinical staff who are untrained in, and unaware of, the dangers of authority gradients are much more likely to support steep hierarchies.
St. Pierre et al (2012) suggest a communication framework which is understood by all staff as a ‘red flag’ from a clinician addressing a patient safety issue. This framework is designed to aid staff in stressful situations to assemble their thoughts in a logical and structured manner (Box 1).
Box 2 shows an example of this framework that the author has adapted to the described scenario.
Bandwidth and loss of situational awareness
Scenario
The importance of situational awareness (SA) can be demonstrated by the well-known case of Elaine Bromiley. Elaine attended a clinic for routine surgery, which was to be performed under a general anaesthetic. A loss of SA meant that when anaesthetists experienced a ‘can't intubate, can't ventilate’ scenario, they repeatedly attempted to re-intubate the patient, rather than performing an emergency tracheostomy. Despite a nurse verbalising the availability of emergency tracheostomy equipment, it remained unused. Elaine's oxygen saturations remained at 40% for 20 minutes before the anaesthetists managed to improve them. The operation was eventually abandoned and Elaine was taken to recovery to wake up without ventilatory support. She was later admitted to the intensive care unit (ICU) and was found to have suffered a severe hypoxic brain injury. Ultimately, life support was withdrawn and Elaine Bromiley died 6 days later.
Definition
Health care borrows the concept of bandwidth from information technology. This describes the human brain as a processor, more importantly acknowledging it as a processor of limited power. Available mental processing to deal with a task is defined as ‘bandwidth’ (Mercer et al, 2014).
SA is a complex topic and factors that result in a loss of SA are numerous. It is therefore impossible to comprehensively cover this within the current article; indeed, entire books have been written about SA. However, the author will attempt to discuss some of the more pertinent points around this subject in the prehospital environment.
Greig et al (2014) define SA as the ability of an individual to process information about the environment in which they are functioning. Similarly, Wright (2012) describes SA as an accurate, internalised mental model. Maintaining good SA can be challenging, particularly as a result of the dynamic and unfamiliar nature of clinicians' practice within the prehospital environment. The value of maintaining good SA cannot be overestimated.
One experiment assessing perception of safety-critical events during a simulated clinical scenario found that these were missed by two-thirds of the most highly experienced observers (Greig et al, 2014). In light of such worrying statistics, organisations are now taking note of the importance of SA. Following recommendations, training on SA has now been incorporated into advanced life support courses (Soar et al, 2010; Yeung et al, 2014).
Endsley (1995) presents the seminal model of situational awareness in decision-making, and suggests numerous factors that can negatively affect a clinician's SA. Stress is one such factor, which Endsley splits into two categories: physical and social-psychological. Physical stressors, such as noisy, poorly lit environments, inclement weather conditions and fatigue, are commonplace in prehospital care; indeed, a combination, or all, of these factors could be experienced at a single incident. Social-psychological stressors include mental load, time pressure and the importance or difficulty of the task being undertaken. Dealing with these stressors requires cognitive input of its own, reducing the bandwidth available for the tasks at hand.
An experimental study by Driskell et al (2000) supports this suggestion. The authors found that participants placed under stress reported increased distraction and demonstrated a loss of team perspective. Peters and Peters (2007) support this idea, suggesting that multiple stressors occurring in rapid succession, beyond what the clinician can cope with (known as overloading), can increase the likelihood of errors. Overloading and stress can result in a narrowing of attentional focus (Staal, 2004; Sato et al, 2012) resulting in the individual responding to the most salient cues, and missing more subtle ones. Staal notes, however, that cue salience is determined unconsciously by the individual. As such, this too is prone to bias. These study data corroborate the idea of bandwidth and, more pertinently, bandwidth overloading.
It is easy to see, therefore, in the case of Elaine Bromiley, how clinicians lost SA secondary to bandwidth overload. A high level of stress, caused by time pressure and the importance and consequences of the task, led to a narrowing in the field of attention of the doctors attempting to intubate Elaine. This excluded several important cues, such as the nurse offering a tracheostomy kit. Given the difficulty of intubating Elaine, and the consequences of failing to do so, the anaesthetists simply lacked the bandwidth to process the information coming from the nurse. This may reflect Staal's suggestion that cue salience is interpreted within the mind of the individual: it would make sense that the anaesthetists were focused on intubating rather than expending bandwidth on verbal communication. This information, therefore, appears to have been discounted as extraneous and ignored.
Mitigation
The author's experience is that some ambulance trusts will undertake a ‘general broadcast’ of calls with no resources to send, or calls whose response falls outside of the target time. This involves making an announcement over radio systems to all staff (including those who are currently dealing with patients) of category, nature and location of a call. Though the aim of this is to ensure that staff are available as soon as possible after completing calls, it may result in staff disengaging from the patient they are dealing with, especially if the call nature they overhear is emotive, such as a paediatric cardiac arrest. The author would suggest that restricting the knowledge clinicians have about increased service workload may decrease stress and allow them to focus on the task at hand. Pressure to complete calls more quickly than is realistically practicable may result in premature closure; this is another type of bias that occurs when a clinician fails to consider differential diagnoses after they have formed their initial impression (Etchells, 2016).
The author would suggest that having a member of staff that is ‘hands off’ helps to ensure the maintenance of SA. It is an established practice in hospitals to have a team leader overseeing the treatment of critically unwell patients in the emergency department (ED). This is also becoming established practice in prehospital care. The main notion behind this practice is for the clinician to use their bandwidth to oversee, but not become involved in, patient treatment. Clinical interventions can be numerous, especially in those patients who are critically ill or injured. By not becoming involved, it is supposed that this member of staff will be able to make better decisions, owing to their lack of need to allocate bandwidth to completing interventions. Despite the anecdotal benefits, however, there appears to be a lack of available evidence to support this idea.
There are some options to reduce physical stressors on clinicians. Moving the patient in a timely manner to an ambulance allows some relief from inclement weather and provides adequate lighting and a quiet environment to work in. If it is not possible to move the patient, resilience units nationally carry additional lighting for scenes, and tents to protect patients and crews who are likely to be exposed to the elements for long periods of time (Yorkshire Ambulance Service, 2017).
Cognitive bias
Scenario
You attend a patient whom you have previously met on multiple occasions. The patient is known to suffer from panic attacks. The patient has difficulty with, but does eventually respond to, coaching. The patient always complains of muscular chest pain following the panic attacks, but says the pain feels different today. Once the patient is calm, you note they remain tachypnoeic and tachycardic. Regardless of this, you tell yourself that they suffer from severe anxiety and discharge them into the community. You later reflect on this incident, unsure if you made the correct decision.
Confirmation bias
Definition
Confirmation bias is a type of psychological bias which results in decisions or beliefs being formed based on a clinician's preconceptions, with evidence to the contrary being discounted (Nickerson, 1998). Clinicians tend to overvalue the importance of clinically irrelevant information if it confirms their preconceptions. Conversely, clinicians will underestimate the importance of relevant disconfirming information.
The failure to adequately adjust probabilities, and therefore one's approach, as new disconfirming information becomes available is known as anchoring. This occurs concurrently with confirmation bias (Tversky and Kahneman, 1974). In the case example, the ‘new information’ (altered observations and new chest pain) is discounted as irrelevant because of the preconceived notion that this patient is ‘just anxious’.
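The adjustment that anchoring prevents can be framed numerically. The following sketch is the author's own illustration, not drawn from the cited studies, and every number in it is hypothetical: it uses Bayes' rule in odds form to show how a clinician's confidence in ‘just anxiety’ should fall as disconfirming findings accumulate, rather than remaining anchored at the initial estimate.

```python
# Illustrative only: all probabilities and likelihood ratios below are
# hypothetical, chosen to demonstrate the direction of the update.

def update(prior, likelihood_ratio):
    """Update a probability with one piece of evidence (Bayes' rule, odds form)."""
    odds = prior / (1 - prior)
    posterior_odds = odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

prior = 0.9        # strong preconception: "this is just another panic attack"
lr_new_pain = 0.5  # hypothetical: pain that "feels different" is half as likely if anxiety alone
lr_signs = 0.5     # hypothetical: persisting tachycardia/tachypnoea after calming, likewise

p = update(prior, lr_new_pain)
p = update(p, lr_signs)
print(round(p, 2))  # confidence should fall from 0.90 to about 0.69
```

An anchored clinician, by contrast, behaves as if the likelihood ratios were 1: the estimate never moves from 0.90, and the disconfirming findings are effectively ignored.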
A simulation study found that confirmation bias occurred in more than 75% of simulated anaesthesiology sessions (Stiegler et al, 2012). Tschan et al (2009) demonstrate that the issue of confirmation bias is not necessarily restricted to one's own internal thought process. They showed that confirmation bias can influence auditory perception when auscultating a patient's lung sounds. In this case, multiple clinicians suggested a unilateral air entry reduction on a standardised manikin with no diminished breath sounds.
Mitigation
Ogdie et al (2012) assert that reflective writing and narrative discussion are useful tools in developing avoidance strategies for diagnostic error caused by cognitive bias. These tools were used in conjunction with education about cognitive biases. The author would suggest that the paramedic profession already provides the tools to tackle this issue. As previously discussed, paramedic education is moving towards a higher entry-level qualification, and the teaching of human factors is incorporated into curriculum guidance for higher education institutions. Human factors training could easily be expanded to incorporate education about cognitive bias. Furthermore, all paramedics are required to undertake continuing professional development (CPD) to maintain professional registration with the registering body, the Health and Care Professions Council (HCPC). The HCPC (2017) advocates reflective practice within its CPD guidance. Therefore, paramedics who wish to mitigate cognitive bias in their practice are already in a system designed to help them achieve this. Some further examples of cognitive bias in paramedic practice are displayed in Table 1.
| Type of bias | Explanation |
|---|---|
| Response bias | This occurs when participants (e.g. patients) answer differently from the way they actually feel. This may be due to the way questions are phrased or because they are unwilling to be honest (Nicholson, 2014) |
| Inattentional blindness | Observers (e.g. clinicians) fail to notice an apparently obvious event or occurrence when they are undertaking a challenging task (Neisser, 1979; Kreitz et al, 2015) |
| Normalcy bias | The minimisation of warning signs as ‘minor’ or ‘normal’, in order to prevent overreaction and help cope with stressful events. Failure to respond appropriately to such warnings may result in a situation becoming dangerous (Murata et al, 2015) |
| Automation bias | Placing a higher level of trust in erroneous data provided by automated machines (e.g. automatic blood pressure cuffs) than in data provided by non-automated systems (Moriarty, 2015) |
| Hindsight bias | The belief that an event was more predictable, after its outcome becomes known, than it was beforehand (Roese and Vohs, 2012) |
Availability heuristic
Definition
Bias can also occur as a result of the availability heuristic. This is where the probability of a future event occurring is judged by the ease with which a similar past event can be recalled (Tversky and Kahneman, 1974; Toft and Reynolds, 2005). This type of bias can be seen when airline ticket sales decrease following a high-profile accident. Toft and Reynolds (2005) suggest that this is because the probability of that event occurring subjectively increases in the mind of the passenger, though, in reality, a further accident is no more likely.
In relation to prehospital care, a clinician who relies unduly on what they have witnessed previously, rather than carefully evaluating the patient's condition and objectively assessing the likely probabilities, risks making a biased decision. Linking this to the described scenario, it is easy to see how apparently concerning symptoms could be discounted as a result of this cognitive bias.
Mitigation
Redelmeier et al (2001) published a number of recommendations that apply to the availability heuristic and its role in decision-making. Primarily, they suggested that clinicians consider whether missing information is relevant. Spending large amounts of time performing unnecessary tests may result in irrelevant test results altering patient treatment and distracting from the presenting complaint. They also suggest that clinicians plan in advance of data being available. In prehospital care, this may be as simple as a clinician deciding, ‘If this patient has a GCS [Glasgow Coma Scale score] of less than 14 following a head injury, I will convey them to a major trauma centre.’ In this way, the act of gathering data cannot affect the decision-making; rather, decisions are dictated by the data. Finally, the authors suggest that clinicians share decision-making with colleagues who can independently review the data, as they have not been involved in the gathering process. The author would also suggest that clinicians should attempt to discount previous calls and treat each case individually as it presents itself.
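The ‘plan in advance’ recommendation amounts to pre-committing to a decision rule before any data are gathered, so that gathering cannot sway the decision. The following is a minimal sketch of that idea; the function name and thresholds mirror the worked example in the text and are illustrative only, not clinical guidance.

```python
# Illustrative sketch only: a destination rule fixed *before* assessment,
# so the data dictate the decision rather than the reverse.
# Thresholds echo the GCS example above and are not clinical guidance.

def destination(gcs: int, head_injury: bool) -> str:
    """Return the pre-agreed destination for a patient, given the findings."""
    if head_injury and gcs < 14:
        return "major trauma centre"
    return "local emergency department"

print(destination(gcs=13, head_injury=True))   # major trauma centre
print(destination(gcs=15, head_injury=True))   # local emergency department
```

Because the rule is written down before the patient is assessed, any later temptation to rationalise a different destination is made visible as a departure from the pre-commitment.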
Conclusion
Much of the evidence surrounding human factors and cognitive bias comes from the hospital or primary care environment. There appears to be little evidence looking specifically at human factors in the prehospital setting. The current article is not exhaustive; however, the author has tried to cover a range of human factors and cognitive biases pertinent to clinicians working in the prehospital environment, such as authority gradients, situational awareness, bandwidth, confirmation bias and availability heuristic.
The overarching theme throughout the author's research for this article was that an awareness of human factors and cognitive bias mitigated some of the risks these subjects pose. As such, the author hopes the article will inform clinicians about issues they face in practice, and help them to overcome these, to aid safe and effective clinical care.