
How do we know what we don’t know, especially if we’re sure that we know enough?

Overconfidence when making clinical decisions is common in doctors, particularly as we become more experienced clinicians. As GPs we see the most undifferentiated of presentations, and we use one of two systems in our brain to reach a diagnosis:

1.  System 1:  Recognise the diagnosis and treat it – e.g. emergencies.

  • This is an intuitive system where we use heuristics (mental short-cuts), rules of thumb, and diagnoses ‘matched’ against our memory and learned examples (which may be inadequate or incorrect). It is fast, associative, inductive, frugal, and has a large affective (‘gut’) component that is highly related to context. There is minimal uncertainty. But how do we know when it isn’t going to work?
  • The problem with System 1 is that sometimes we don’t recognise when it’s not going to work. We are also at risk of going with our ‘gut’ despite the evidence. The solution is to have some decision-making defaults to back up this diagnostic method.

2.  System 2:  Recognise that the problem lies in a ‘domain’ of diagnoses and needs further evaluation (i.e. examination and investigation) to work it out – e.g. complex and chronic disease.

  • This system is analytical and slow, rational, rule-based and low in emotional investment.  It depends on context.
  • There is more uncertainty in this path, and when we’re uncertain we tend to go with our ‘gut’ – System 1.

Diagnostic failure reflects diagnostic uncertainty (which is more likely in undifferentiated presentations) and is therefore highest in general practice, emergency medicine and internal medicine. Our specialist colleagues (outside of GP) deal with much less uncertainty (excepting perhaps ICU, trauma, surgery and internal medicine). Good doctors need a combination of sound judgement, well-calibrated decision making and effective problem solving to reach a diagnosis. This requires an awareness of which ‘system’ we are using (i.e. whether we have our clinical blinkers on), because if there is an unrecognised change in context and we default to pattern recognition without thinking, we’ll miss something.

Good clinicians use System 1 and System 2 approaches and innately flick between them to make a diagnosis.

So how do these systems fit with overconfidence? Sometimes it is the cases that you are most sure about that you are most shocked by. That unexpected investigation result – is it because the history or examination was not thorough enough? Were you so confident in your diagnosis that you did not consider all of the relevant differentials? Or did you fall victim to the anchoring or availability biases discussed in previous blogs? This week’s clinical reasoning challenge will explore all of these biases. Feel free to engage and comment on the case on my Facebook page.

As doctors, overconfidence sits well with us. We like it because it leads to a decision. Overconfidence fits well with System 1 emotive thinking, as positive feelings increase confidence. It also fits well with confirmation bias – we make a diagnosis, gather information to support it, and then feel more confident! It is much easier mentally to be certain. If we are overconfident, we become biased in the way we gather information to support a hypothesis. Sadly, the culture of medicine does little to support less confident doctors, who are often seen as vulnerable – thus perpetuating the persona of the confident doctor. Other variables that impact upon confidence include ego bias, gender, culture, personal ability, level of task difficulty, availability bias, outcome predictability, and attribution bias.

BUT overconfidence is related to unconscious incompetence – what we don’t know we don’t know. So we need to develop strategies for DEBIASING – forcing ourselves to consider alternative diagnoses and opposing strategies. Unfortunately, one debiasing strategy the current generation has developed is online access (see the Akresh-Gonzales article in the references). This is again a trap for young players, as searching is itself often driven by confirmation bias.

So how do we debias?
1.  Recognise our delayed and missed diagnoses and consider what biases may have occurred.
2.  Consider the factors that contribute to bias at an individual and task level and how these factors might interact.
3.  Give the competing hypothesis as much attention as the presumed one.
4.  Take a look through Johari’s window – reflect on your decision-making.

Johari's window

‘Dr Johari’ is a wise and learned colleague who sits in the consultation room next door to you. They have a one-way window into your room and can observe and hear your consultations. Ask ‘Dr Johari’ what they see in the green window – your blind spot: the things about your practice that they can see but you cannot.

I challenge you this week to:

1.  Choose a seemingly ‘straightforward’ case – give the competing diagnosis as much attention as the presumed one. Ask yourself on what basis you used System 1 thinking.

2.  Ask a colleague to observe a consultation – what did they see through the green window?

3.  Consider your last unexpected diagnosis – why was it missed?

Next week we’ll be exploring risk tolerance – an important partner in crime to overconfidence!

References:

Akresh-Gonzales, J. (2016). Overconfidence Bias: Is Online Access Your Cognitive Prosthesis? NEJM Knowledge+.

Croskerry, P. (2015). 50 Cognitive and Affective Biases in Medicine. Dalhousie University: Critical Thinking Program.

Norman, G. R., Monteiro, S. D., Sherbino, J., Ilgen, J. S., Schmidt, H. G., & Mamede, S. (2017). The Causes of Errors in Clinical Reasoning: Cognitive Biases, Knowledge Deficits, and Dual Process Thinking. Academic Medicine.

Rubio, E. (2015). Increasing your self-awareness: The Johari Window. Retrieved from https://www.linkedin.com/pulse/increasing-your-self-awareness-johari-window-enrique
