References

Beer L, Golub L, Smith DT. Diagnostic reasoning and decision making. In: McKean SC, Ross JJ, Dressler DD, Scheurer D (eds). 2nd edn. New York (NY): McGraw Hill; 2017

Bellomo R, Kellum JA, Ronco C. Acute kidney injury. Lancet. 2012; 380:(9843)756-766 https://doi.org/10.1016/S0140-6736(11)61454-2

Beynon M, Curry B, Morgan P. The Dempster-Shafer theory of evidence: an alternative approach to multicriteria decision modelling. Omega. 2000; 28:(1)37-50 https://doi.org/10.1016/S0305-0483(99)00033-X

Chen L, Zhou X, Xiao F, Deng Y, Mahadevan S. Evidential analytic hierarchy process dependence assessment methodology in human reliability analysis. Nuclear Engineering and Technology. 2017; 49:(1)123-133 https://doi.org/10.1016/j.net.2016.10.003

Ciabattoni A, Picado Muiño D, Vetterlein C, El-Zekey M. Formal approaches to rule-based systems in medicine: the case of CADIAG-2. International Journal of Approximate Reasoning. 2013; 54:(1)132-148 https://doi.org/10.1016/j.ijar.2012.09.002

Croskerry P. The theory and practice of clinical decision-making. Can J Anesth. 2005; 52:R1-R8 https://doi.org/10.1007/BF03023077

Croskerry P. A universal model of diagnostic reasoning. Acad Med. 2009; 84:(8)1022-1028 https://doi.org/10.1097/ACM.0b013e3181ace703

Cruz DN, Ricci Z, Ronco C. Clinical review: RIFLE and AKIN—time for reappraisal. Crit Care. 2009; 13:(3) https://doi.org/10.1186/cc7759

Gigerenzer G, Gaissmaier W. Heuristic decision making. Annu Rev Psychol. 2011; 62:451-482 https://doi.org/10.1146/annurev-psych-120709-145346

Harmon-Jones E. Cognitive Dissonance, 2nd edn. Washington (DC): American Psychological Association; 2019

Kahneman D. Thinking, fast and slow. New York (NY): Farrar, Straus and Giroux; 2011

Medow MA, Lucey CR. A qualitative approach to Bayes' theorem. Evid Based Med. 2011; 16:(6)163-167 https://doi.org/10.1136/ebm-2011-0007

Nolan J, Soar J, Hampshire S. Advanced Life Support, 7th edn. London: Resuscitation Council (UK); 2016

Parush A et al. Situational awareness and patient safety. 2011. https://tinyurl.com/4y2bmj4u (accessed 23 July 2021)

Portis AJ, Sundaram CP. Diagnosis and initial management of kidney stones. Am Fam Physician. 2001; 63:(7)1329-1338

Sox H, Higgins M, Owens D. Medical decision making, 2nd edn. Chichester: Wiley Blackwell; 2013

Tay SW, Ryan P, Ryan CA. Systems 1 and 2 thinking processes and cognitive reflection testing in medical students. Can Med Educ J. 2016; 7:(2)e97-e103 https://doi.org/10.36834/cmej.36777

Zambas SI, Smythe EA, Koziol-Mclain J. The consequences of using advanced physical assessment skills in medical and surgical nursing: a hermeneutic pragmatic study. Int J Qual Stud Health Well-being. 2016; 11 https://doi.org/10.3402/qhw.v11.32090

A decision theory overview and case-based discussion

02 August 2021
Volume 13 · Issue 8

Abstract

Paramedics make decisions as part of their everyday role but often, the theory behind clinical decision-making is not discussed in depth. This article explores the theories of decision-making as they apply to a clinical case. With the increasing use of technology in healthcare, the introduction of human reliability analysis is becoming more pertinent.

Practitioners use decision theory in everyday practice. However, little consideration is given to the processes used to arrive at a diagnosis. This article will briefly explore the theory behind decision-making in paramedic practice. A hypothetical but realistic case, set in a hospital, will be used to show how the theory is applied in practice.

Background

The example given in this article uses a hospital inpatient to discuss the difficulties in diagnosing those with undifferentiated pathology, but the themes relate directly to prehospital care. An inpatient was chosen to allow a greater depth of discussion and to showcase the roles in which a paramedic may find themselves working, where decision theory needs to be understood in greater detail.

An elderly person is an inpatient on an acute medical ward following an unplanned admission with systemic infection against a background of medication non-compliance. The patient had been treated on the ward for more than a week before there was an acute change in their presentation, and ward staff noted a stepwise decrement in the patient's consciousness levels. The patient's Glasgow Coma Scale score had steadily declined over the course of the day, with a single hypoglycaemic episode earlier in the day having been successfully treated. A review of the notes elicits a new diagnosis of acute kidney injury (AKI) on day 5 of admission.

Type 1 and type 2 thinking

Type 1 thinking leads to a rapid and intuitive decision based on readily available information and previous experiences (Croskerry, 2009; Kahneman, 2011). This may be referred to as the pattern-recognition thought process where previous experiences may skew the thought process into a decision or diagnosis based on past patient presentations (Gigerenzer and Gaissmaier, 2011; Kahneman, 2011; Sox et al, 2013).

Type 1 thinking is usually a subconscious, instinctive, ‘I see therefore it is’ approach. This thought process may be beneficial to clinicians when a rapid decision needs to be made and acted upon, for example recognising a patient in cardiac arrest.

Type 2 thinking is a slower, considered and conscious process that requires more input from the clinician (Croskerry, 2009; Kahneman, 2011). It is inquisitive, and not only dynamically questions the information to hand but also actively seeks further evidence to inform the decision (Gigerenzer and Gaissmaier, 2011; Kahneman, 2011; Sox et al, 2013). This may produce a more analytical, ‘is what I am seeing what it really is?’ approach. This method is often used by clinicians when faced with more complex presentations, those that do not neatly fit into a previous ‘box’ of diagnosis, or when they are seeking a diagnostic safety net by double-checking their personal bias (Kahneman, 2011; Harmon-Jones, 2019).

When considering the case in question, an appropriate type 1 thought process might be to consider hypoglycaemia as the primary diagnosis. The patient previously had low blood sugars during the day and has now become unresponsive; therefore, the likelihood of hypoglycaemia is high.


However, the type 2 thought process would consider that there is no mention of diabetes in the medical history, only an isolated hypoglycaemic episode likely related to the systemic infection. Furthermore, hypoglycaemia would not typically present with a slow downward trend in consciousness over a period of hours.

An in-depth assessment and review of the notes for this patient would elicit key information on the AKI, pertinently a worsening renal picture with a rising serum urea level likely resulting in encephalopathy. Therefore, considering the case in question, a reflexive, biased decision may have led to treatment the patient did not need as well as a missed diagnosis.

Cognitive aids are available to assist the clinician in avoiding a type 1 runaway and force them to employ a type 2 process. The most common method is a cognitive reflection test (CRT) (Tay et al, 2016), also known as a cognitive pause. This is where the clinician—or any member of the team—takes a conscious and, typically, vocalised approach to question their initial reflexive thought process (Tay et al, 2016). This can be as simple as asking ‘am I sure this is the correct course of action—could it be anything else?’ The CRT can be done individually or involve the team; accessing the combined thought processes of a wider team can enhance situational awareness (Parush et al, 2011).

In paramedicine, the wider team can include your crewmate, the public, other emergency services and agencies, and even the patient. Shared processing and mental modelling are also important when delivering or requesting remote clinical advice, such as a solo responding paramedic calling a clinical advice line or a GP. As roles expand into advanced practice, the team changes, but rarely is a clinician truly alone.

Combining both type 1 and type 2 processes is described in the work by Croskerry (2005) as dual processing. The theory of dual processing works to provide a bridge between the two processes, using the merits of each one while assimilating pertinent data in a rapid and timely fashion to facilitate healthcare in a time-constrained environment (Croskerry, 2005).

Key to paramedic practice is appreciating that the nature of emergency and prehospital care will sometimes rely on a type 1 process while noting the limitations of basing the decision solely on a reactive assessment.

Bayes' theorem and Dempster-Shafer theory

After the clinician has gone through the initial thought processes described, additional processes can be applied as further investigation is undertaken.

In this case, during further examination, it is established that the patient has a normal blood glucose reading, is apyrexial and has presented with a slow neurological decline over the course of some hours. This would decrease the probability of acute hypoglycaemia, sepsis and stroke; however, these had not been fully excluded at this point.

The Dempster-Shafer theory (DST) arguably describes a more robust measure of probability in diagnostic decision-making than the traditional Bayes' theorem (Beynon et al, 2000; Gigerenzer and Gaissmaier, 2011). DST allows the probability of a single ‘assignment’ (diagnosis) to be considered concurrently with the group probability of a selection of diagnoses (Beynon et al, 2000).

In essence, it does not rely on individual diagnosis but on a group of diseases considered together; for example, a neurological event, as opposed to an ischaemic stroke, haemorrhagic stroke, infection and transient ischaemic event individually. This explains the process by which a clinician will weigh the probability of a group of diseases as a whole to narrow the diagnosis to the most likely system.

Bayes' theorem uses a clinician's diagnostic insight to provide a probability weighting to individual diagnoses (Medow and Lucey, 2011; Ciabattoni et al, 2013). This is likely to narrow the list of differentials more quickly than DST, but there is a risk of increased bias if a potential diagnosis appears more likely at a single point in the physical examination and other possibilities are excluded prematurely (Medow and Lucey, 2011; Ciabattoni et al, 2013). For example, had the blood glucose been low or marginal, this may have led the clinician to forgo reviewing the biochemistry or seeking further differential diagnoses.
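As a minimal sketch of this weighting process, Bayes' theorem can be applied to a single diagnosis from the case. The probabilities below are invented for illustration only and are not clinical data:

```python
def bayes_update(prior, p_finding_given_dx, p_finding_given_not_dx):
    """Posterior probability of a diagnosis after one new finding."""
    numerator = p_finding_given_dx * prior
    denominator = numerator + p_finding_given_not_dx * (1.0 - prior)
    return numerator / denominator

# Hypothetical figures: prior belief in hypoglycaemia is 0.30; a normal
# blood glucose reading is rare if hypoglycaemia is present (0.05)
# but common otherwise (0.90).
posterior = bayes_update(0.30, 0.05, 0.90)
print(round(posterior, 3))  # → 0.023: the normal reading sharply lowers the probability
```

The same update run with a low or marginal glucose reading would push the posterior up instead, illustrating how a single finding can dominate the differential if no further evidence is sought.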

DST is able to incorporate quantitative data from physiological observations and binary and non-binary responses to questioning, in addition to ‘fuzzy logic’ principles of the clinician's previous experience and exposure (Ciabattoni et al, 2013). These variables are considered together in a non-linear, weighted probability scale (Ciabattoni et al, 2013), which means that each is considered on its own merits and not weighted by a pre-determined scale. The variables (diagnoses) are considered concurrently until the final decision point, reducing the risk of bias.
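The combination step at the heart of DST, Dempster's rule, can be sketched as follows. The diagnostic groups and mass values here are invented for illustration and echo the case rather than representing real clinical likelihoods:

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule: combine two mass functions over sets of diagnoses."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        intersection = a & b
        if intersection:
            combined[intersection] = combined.get(intersection, 0.0) + x * y
        else:
            conflict += x * y  # mass falling on incompatible sets
    # Normalise by the non-conflicting mass
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

NEURO = frozenset({"stroke", "uraemic encephalopathy"})
METABOLIC = frozenset({"hypoglycaemia", "uraemic encephalopathy"})
EITHER = NEURO | METABOLIC

# Evidence 1 (slow neurological decline) supports the neurological group;
# evidence 2 (rising serum urea) supports the metabolic group.
m1 = {NEURO: 0.7, EITHER: 0.3}       # residual mass on 'any of these'
m2 = {METABOLIC: 0.8, EITHER: 0.2}

result = combine(m1, m2)
# Most mass (0.56) lands on the single diagnosis shared by both groups:
# frozenset({'uraemic encephalopathy'})
```

Note how neither piece of evidence named uraemic encephalopathy on its own; the narrowing to the shared diagnosis emerges only when the group-level beliefs are combined, which is the behaviour the text attributes to DST.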

Therefore, DST is more akin to human diagnosis, less restrictive than typical Bayesian logic and can give a weighted probability to a group of diagnoses at once. In this case, the most likely cause of the patient's condition was a reduction in their renal function secondary to an AKI. However, AKI is a collective term for renal injury and can be subdivided into pre-renal, intrinsic and post-renal causes (Portis and Sundaram, 2001; Cruz et al, 2009; Bellomo et al, 2012).

Human reliability analysis

Systematic examination is the gold standard for assessing patients with an acute deterioration (Nolan et al, 2016). However, time pressures regularly push clinicians into undertaking a focused examination instead, which carries an increased risk of misdiagnosis among non-physician practitioners (Zambas et al, 2016). Tight time constraints may also lead to human error through misinterpretation of the data presented (Zambas et al, 2016), such as observations or laboratory results. The risk is further increased when a human-technology interface is involved, such as when electronic patient monitoring or computer devices are used (Chen et al, 2017).

Human reliability analysis (HRA) is a subsection of decision-making mathematical theory used to describe human-technological interfaces. HRA applies where the risk of error is high and the outcome has high-impact complications (Chen et al, 2017), so its principles are appropriate for medical diagnosis even though it is not commonly associated with healthcare.

HRA identifies events of human error and evaluates the influence of that error preceding a task failure. This is measured as end-point failure rates, which derive from a conditional human error probability (Chen et al, 2017). In healthcare, this translates as misdiagnosis from misinterpreting or not gathering all the available data from the physical examination, observations or diagnostic technology. Minimising failure points will, in turn, reduce negative endpoint measures.
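Under the simplifying assumption that the steps are independent, the way small per-step error probabilities compound into an end-point failure rate can be sketched as below. The step names and human error probability (HEP) values are hypothetical, not validated HRA data:

```python
# Illustrative end-point failure rate from per-step human error
# probabilities (HEPs). Steps and values are invented for illustration.
steps = {
    "select correct patient record": 0.01,
    "transcribe observations correctly": 0.02,
    "interpret laboratory trend": 0.05,
}

p_all_ok = 1.0
for hep in steps.values():
    p_all_ok *= (1.0 - hep)   # every step in the chain must succeed

p_failure = 1.0 - p_all_ok    # probability that at least one step fails
# p_failure ≈ 0.078: modest per-step risks compound across the chain
```

This is why removing even one failure point, for example by double-checking the patient identifier, measurably reduces the end-point failure rate.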

In practice, this can be achieved by a thorough and systematic examination combined with a cognitive pause when noting physiological parameters and applying them to the physiology observed.

In the case described herein, it could be easy to select the wrong patient from a long list on the electronic systems, which could lead to the wrong blood results or observations being interpreted.

In addition, when looking at the observations, even if the correct patient is selected, there is a risk of missing vital information and missing a trend if only individual observations are seen.

In paramedic practice, the use of electronic records is increasing and the ability to view previous paperwork can lead to error if the wrong patient identifier is inputted and the incorrect information is therefore used.

Diagnostic reasoning

The processes a clinician can use to make a diagnosis depend on their experience, and how this is applied with their knowledge of the specific patient, the condition and the treatments available (Beer et al, 2017). The theory of diagnostic reasoning recognises that the clinician's experience will guide their decisions when carrying out an examination and requesting investigations.

However, the application of a diagnostic reasoning method implies that the clinician is able to combine their skills, knowledge and reasoning ability to make a diagnosis (Beer et al, 2017). Furthermore, it should be reiterated that this method provides a working diagnosis that may not be accurate until proven definitively—something that is unlikely to be available to prehospital clinicians. This method becomes flawed if the patient's condition is such that it falls outside the clinician's knowledge or skills and the clinician fails in their reasoning to identify their knowledge gap (Beer et al, 2017).

It should be noted that if a clinician employs diagnostic reasoning with no other methods of decision theory, the risk of misdiagnosis is in the range of 20–40% (Croskerry, 2009). Clinicians should thus employ all methods available to ensure they do not misdiagnose a patient. A practitioner's past experiences will influence the decisions they make through conscious and unconscious biases (Croskerry, 2005); this may be influenced by factors in the media as well as patient factors.

Increasing the clinician's knowledge will decrease the probability of error to a greater extent than using a rule-based or practical approach (Croskerry, 2005; Beer et al, 2017).

In paramedic practice, external bias may come from recent reading or the wider media, including the news. If a clinician's awareness of a particular illness is heightened because of external influences, they are more likely to arrive at that diagnosis to the exclusion of others (Croskerry, 2005; 2009; Beer et al, 2017).

As a paramedic, it is important not only to be aware of any gaps in knowledge but also to actively seek more data, such as physical examination findings and physiological parameters, which can help reduce the risk of misdiagnosis. A working diagnosis should be treated as a hypothesis until proven either likely or unlikely by further investigation and the identification of both normal and abnormal findings.

Regarding the case described here, although the clinician's specific knowledge of the underlying condition may not be extensive, applying clinical and diagnostic reasoning to the observations and biochemical markers may lead to a working diagnosis. However, this requires the clinician to employ the dual-processing approach to activate the type 2 processes described.

Outcome

The patient receives treatment for a uraemic encephalopathy and recovers well before being discharged. Had the clinician gone with a type 1 biased approach, this condition may not have been detected until later in the patient journey.

Conclusion

This article gives a brief overview of the models of decision theory that health professionals can use in practice.

Clinicians should understand their type 1 process bias and develop methods to incorporate type 2 and further reasoning methods.

With increasing amounts of technology being used in prehospital practice, clinicians should be aware of the theory of human reliability analysis and seek to mitigate the risk of human-technological interfaces in healthcare.

Further research should investigate the impact of the increase in technology in prehospital care.

Key points

  • Type 1 processing is useful for immediate threats to life but diagnosis should employ a type 2 process
  • Including a cognitive pause in your decision process will encourage the use of a type 2 process
  • Paramedics should understand that sources of external bias, including the media, can affect decisions
  • Clinicians should be aware of any gaps in their knowledge and how these might affect diagnosis

CPD Reflection Questions

  • How might you prevent a type 1 process runaway?
  • Why might Dempster-Shafer theory be more akin to human diagnostic processing?
  • Why is human reliability analysis important in paramedic practice?