
It is a wonderful experience when you are listening to a speaker and the ‘switch flicks on’.  It might be the way they are speaking, the knowledge they are conveying, and/or the contribution of other participants that connects all the circuits together to light up your cognition.   I had such an experience this week during a session on Clinical Reasoning by the inspiring Dr Genevieve Yates.

We were discussing how to unpack clinical reasoning, particularly with regards to assisting doctors to improve their clinical reasoning approach (and, for many, their approach to the Key Features Paper).  The approach to this exam is often compared to the decision-making required to climb a tree to ‘reach the diagnosis’, where we start at the broad-based trunk and are required to make decisions based on the prominent ‘key features’ at the intersection of each branch.  This is diagnostic reasoning.  The exam (and our clinical practice) also includes therapeutic reasoning – the decisions we make around treatment and management that are often informed by additional information from the non-clinical domains of general practice.  What struck me as we spoke was why so many doctors have difficulty with this exam – trees don’t grow in isolation.


Diagnostic and therapeutic trees (cf. decision-making) exist in specific environments:

  • Is it sunny or shady? – what biases are impacting on how we see the diagnosis?
  • Who is watering the tree? – on what knowledge foundations are we reasoning?
  • Are you familiar with the species of tree? – have you had to climb this tree before?
  • What other organisms co-exist or form the micro-ecosystem of the tree? – the context of the presentation in this person and the community.

We use these clues in our day-to-day practice to inform our reasoning – but on paper (as in an exam) the ecosystem can be harder to define.  We are unaware of the environment in which the patient is presenting.  We don’t have their past history or that ‘spidey sense’.  As GPs we see the most undifferentiated of presentations, and we use one of two systems in our brain to reach the diagnosis.  Understanding these systems can help to refine our clinical reasoning approach.

System 1:  Recognise the diagnosis and treat it.  e.g. Emergencies

  • This is an intuitive system in which we use heuristics (mental short-cuts) and rules of thumb, and diagnoses are ‘matched’ against memory and learned examples (which may be inadequate or incorrect).  It is fast, associative, inductive, frugal, and has a large affective (‘gut’) component that is highly related to context.  There is minimal uncertainty.  But how do we know when it isn’t going to work?
  • The problem with System 1 is that sometimes we don’t recognise when it’s not going to work.  We are also at risk of going with our ‘gut’ despite the evidence.  The solution is to have some decision-making defaults to back up this diagnostic method.
  • Such defaults include ‘surgical sieves’ and diagnostic algorithms such as Murtagh’s PROMPT.

System 2:  Recognise that the problem lies in a ‘domain’ of diagnoses and needs further evaluation (i.e. examination and investigation) to work it out.  e.g. complex and chronic disease

  • This system is analytical and slow, rational, rule-based and low in emotional investment.  It depends on context.
  • There is more uncertainty in this path, and when we’re uncertain we tend to go with our ‘gut’ – System 1.

Diagnostic failure reflects diagnostic uncertainty (which is more likely in undifferentiated presentations) and is therefore highest in general practice, emergency medicine and internal medicine.  Our specialist colleagues (outside of GP) deal with much less uncertainty (except perhaps in ICU, trauma, surgery and internal medicine).  Good doctors need a combination of sound judgement, well-calibrated decision-making and effective problem-solving to reach a diagnosis.  This requires an awareness of which ‘system’ we are using, because if there is an unrecognised change in context and we default to pattern recognition without thinking – we’ll miss something.

Good clinicians use System 1 and System 2 approaches and innately flick between them to make a diagnosis.

It is often the ‘consultation climate’ that triggers the system switch:

  • what the patient looks like;
  • what is common in our community;
  • our prior knowledge of the patient;
  • our gut feel.

These aspects are absent when the case is ‘on paper’.

So how do these systems fit with diagnostic error?  Sometimes it is the cases that you are most sure about that you are most shocked by.  That unexpected investigation result – is that because the history or examination was not thorough enough?  Were you so confident in your diagnosis that you did not consider all the relevant differentials?

Overconfidence sits well with us as doctors.  We like it because it leads to a decision.  Overconfidence fits well with System 1 emotive thinking, as positive feelings increase confidence.  It also fits well with confirmation bias – we make a diagnosis, gather information to support it, and then feel more confident!  It is much easier mentally to be certain.  If we are overconfident, we become biased in the way we gather information to support a hypothesis.  Sadly, the culture of medicine does little to support less confident doctors, who are often seen as vulnerable – thus perpetuating the persona of the confident doctor.  Other variables that impact upon confidence include ego bias, gender, culture, personal ability, level of task difficulty, availability bias, outcome predictability, and attribution bias.

BUT overconfidence is related to unconscious incompetence – what we don’t know we don’t know.  So we need to develop strategies for DEBIASING – forcing ourselves to consider alternative diagnoses and opposing strategies.  Unfortunately, one debiasing strategy the current generation has developed is online access (see the Akresh-Gonzales article in the references).  This is again a trap for young players, as searching is itself often driven by confirmation bias.

So how do we de-bias?

  1. Recognise our delayed and missed diagnoses and consider what biases may have occurred.
  2. Consider the factors that contribute to bias at an individual and task level, and how these factors might interact.
  3. Give the competing hypothesis as much attention as the presumed one.
  4. Take a look through Johari’s window – reflect on your decision-making.

Johari's window

‘Dr Johari’ is a wise and learned colleague who sits in the consultation room next door to you.  They have a one-way window into your room and can observe and hear your consultations.  Ask ‘Dr Johari’ what they see in the green window – the ‘blind spot’ of the Johari window: the aspects of your practice that others can see but you cannot.

I challenge you this week to:

  1. Choose a seemingly ‘straightforward’ case – give the competing diagnosis as much attention as the presumed one.  Ask yourself on what basis you used System 1 thinking.
  2. Ask a colleague to observe a consultation – what did they see through the green window?
  3. Consider your last unexpected diagnosis – why was it missed?
  4. List the ‘taken for granted’ aspects of a consultation that you use to make diagnostic and therapeutic decisions – define the ecosystem of your practice.  By understanding this, you will come to recognise the assumptions that you might make during your clinical reasoning.

For those whose lives are currently overwhelmed by KFP study, considering your diagnostic ecosystem will help you to understand why specific answers matter – they demonstrate the breadth of your knowledge and your safety in practice.  The assessment is about demonstrating how well you can climb the tree, not about relying on it being a bright sunny day when you went to work that morning with only the familiar trees needing to be climbed.

But do remember that the tree is part of a forest – use the clues and cues to ensure you can appreciate the view.

For more insights into clinical reasoning and exam approaches, our resources can provide additional information.

References:

Akresh-Gonzales, J. (2016). Overconfidence Bias: Is Online Access Your Cognitive Prosthesis? NEJM Knowledge+.

Croskerry, P. (2015). 50 Cognitive and Affective Biases in Medicine. Dalhousie University: Critical Thinking Program.

Norman, G. R., Monteiro, S. D., Sherbino, J., Ilgen, J. S., Schmidt, H. G., & Mamede, S. (2017). The Causes of Errors in Clinical Reasoning: Cognitive Biases, Knowledge Deficits, and Dual Process Thinking. Academic Medicine, 92(1), 23–30.

Rubio, E. (2015). Increasing your self-awareness: The Johari Window. Retrieved from https://www.linkedin.com/pulse/increasing-your-self-awareness-johari-window-enrique

Please feel free to comment.