Essay
This is an essay I wrote in order to gain a better understanding of decision-making processes in clinical practice. In the end, it taught me more than that: it also showed me how to change them.
Clinical decision-making: The value of knowing about the process and the difficulty of influencing it. The account of a final year osteopathic student.
1. An introduction: Why I think that over-thinking things may be the thing to do...
The following is a memo which emerged during reflection on the analytical process of my dissertation. It prompted me to include the aim to 'work on my decision-making processes' in my LPA contract.
29/09/2013
It seems to me that many of my notions about qualitative research and my own role in it have formed subconsciously, without my really being aware of it. Much of this has to do with my world view, and I will have to make that explicit at some point. Reflective memo writing is essential to this, and I am really getting into it, too. It forces you to ask yourself questions, and at the end of the day that's what you end up doing all day anyway: answering questions and justifying your decisions. This justification gets much easier when the process of decision-making was an active one: type 2 decision-making, based on cognitive evaluation rather than intuitive steps. Needless to say, learning this will be beneficial in many aspects of life. It occurred to me earlier that one of the main tools for controlling decision-making processes is active double-checking through reflection: Why would I have taken this decision if it hadn't occurred to me to think about it first? It is a bit like member-checking: the subconscious brain that has just sent a decision to consciousness has to step out of the shadows and file an explanatory account of how this decision came about. If that happens before the resulting action is actually taken, then the decision can still be corrected if it is found to be biased. That leads the discussion towards the heart of the problem: we are all biased. Recognising this is probably step one. Steps two, three, four and five are more difficult. You need to know which biases exist. You need to be able to observe yourself from a neutral outside perspective. You have to deal with your innate desires and things like pride, self-esteem and ambition. Practical issues such as lack of experience, naivety or simple lack of knowledge might also get in the way. And once you have recognised a bias, what then? How do you get rid of it?
One could argue that you would have to change your personality to de-bias your way of thinking. But I think that thought processes are learned (acquired) like many other things; habits, if you wish. Or am I biased there? If I'm right, then any reasonable human being can change them. OK, let's leave some space for passion and the like, but in a clinical or scientific context in particular, rationality is appropriate. The importance of empathy is discussed elsewhere. So much for a brainstorm introducing my perspective on decision-making processes.
2. Fundamental concepts and considerations
When I entered the fourth year, my de-biasing strategies were mainly the ones that the BSO supplies you with. Nobody calls them 'de-biasing strategies', but essentially teachers suggest some of them to you. First, there are so-called 'forcing strategies', which make you double-check your thought process: the 'osteopathic sieve' gets you to consciously consider every tissue type potentially involved in the presenting problem, and 'VINDICATER' is an acronym for a number of pathological processes that could account for the patient's symptoms. Secondly, clinical guidelines are probably the most common example of decision-making tools in standard health care. Guidelines provide algorithms for how to proceed under certain circumstances. The key thing here is that they are evidence-based: people have made the mistakes before and written them down so that you don't have to make them yourself; in other words, trial and error on a large and controlled scale, eventually informing practice. Thirdly, and not officially part of any syllabus, there is of course your clinic tutor's personal experience. Sentences like 'with a tight piriformis, I very often find these points to be tender as well...', or awareness of a tutor's particular preferences, most certainly lead students to give these points special consideration in their patient assessment or management plan.
All of the above have one thing in common: they influence your thinking. More accurately, they direct your thinking in a particular direction, 'guide' it, and eventually make you come to a conclusion which may or may not differ from the one you would have reached anyway. Here we see the essential feature of all of these tools: subconscious decision-making is overridden, checked by conscious reflection, with the aim of decreasing the likelihood of making mistakes.
This again leads us to the discussion of 'mistakes'. In clinical practice, is any action that is inferior in outcome to another possible action a mistake? Or is it only a mistake if the actual outcome is harmful in nature? Here the term 'best practice' deserves an introduction: it reflects the fact that today clinical approaches need to be tested before they can be implemented or recommended. Clinicians thus have access to research which ideally tells them that one approach or another is more effective in certain circumstances, and 'best practice' is what results from such 'evidence-based' decision-making. 'Bias', however, is not simply a 'mistake' dressed up in scientific language. Rather, a bias is what may eventually lead to a mistake being committed; it is a feature of the individual or system making the actual decision. If I am a farmer and I plant my potatoes the way my grandpa planted them, I am probably 'status quo' biased: I plant them that way just because that is the way it has always been done, even though today there are probably more effective ways of doing it, using different machinery or some kind of fertiliser (Samuelson and Zeckhauser, 1988). You get the point. There are many forms of bias, the most common ones in health care allegedly being confirmation, availability and representativeness (Bornstein and Emler, 2001).
Bias essentially prevents you from being rational, from doing what would make sense. The aim of a professional must thus be to decrease his or her 'biasedness', which of course requires prior identification of such liabilities (see below).
Now, does a very experienced doctor, osteopath or even a car mechanic consciously employ decision-making strategies when confronted with a problem, or are they more likely to just 'know'? Probably the latter. Why is that? Two options: either they never cared and somehow got away with it (chances are they are rubbish), or they used such strategies a lot when they were still less experienced and simply got good at it; they developed what is termed 'expertise'. Expertise is what provides you with the ability to rely strongly on type 1 decision-making processes, taken intuitively. Intuition is not just something that happens out of the blue, providing you with the correct answer like that, snap. It is the subconscious process of recognising a pattern in a given set of information and linking this pattern to previous experience. If that experience is well established and, importantly, unbiased, pattern recognition should provide you with the correct answer in most circumstances.
You can easily spot a number of pitfalls here, though. The pattern you think you have recognised may not actually be quite the same as the ones in the past. Or the same pattern may apply to numerous problems, one less common than the other, but each requiring a different problem-solving approach. Also, your entire experience may simply be wrong: you may have taken the wrong decision over and over again until it became your 'expertise'. The logical solution to this is twofold. One, the initial experience needs to be made carefully and correctly. Two, even with a high level of expertise, certain sensitive safety mechanisms need to be in place to check you when you are about to make an uninformed decision.
Since expertise is a long way off for me and not the point of this essay, the above discussion nonetheless provides an important point: gaining experience as a novice in anything, but in health care in particular, requires a careful approach, so that the 'right' decisions are made as often as possible and from the start, and the developing expertise is well-grounded. Mistakes are made, of course, but they need to be learned from effectively. Reflection is, over and over again, the key instrument for that.
3. Clinical practice – Osteopathy
With regard to clinical practice, bias research from mainstream medicine distinguishes between biases affecting the diagnosis and biases influencing the choice of treatment (Bornstein and Emler, 2001). Osteopathy is special in that most of what we do stands on a very fragile evidence base anyway. However, the process of making a clinical diagnosis, and in particular a differential diagnosis, can be seen as comparable. Decisions are also taken when it comes to justifying an osteopathic evaluation in an exam setting, and specific strategies can thus be employed here as well.
The strategies introduced above (cognitive forcing strategies, guidelines, and anecdotal evidence or individual guidance) all have their merits. In order to develop my own skill set further, I would have to a) practise using such techniques and b) individualise them a little. Based on feedback from clinic tutors and from a formative clinical assessment in Autumn, I identified specific areas of weakness and sought to address them using cognitive strategies.
4. Individual bias – promoting change
Identifying weaknesses is always a difficult task. One has to rely on honest feedback from others and be able to acknowledge and appreciate criticism. And even once a weakness is identified, achieving an actual change in behaviour is harder still.
'The simplest approach to improving doctors’ decision making is to educate them about the existence of the biases, on the assumption that an awareness of the biases will permit them to avoid being influenced by them.' (Arnoult and Anderson, 1988)
Just from reflecting upon it, I found myself liable to a number of biases. During case history taking, I often found myself jumping to conclusions too quickly. 'Confirmation bias' then had me selectively gather evidence in favour of my initial conclusion (Bornstein and Emler, 2001), whilst not keeping an open mind regarding other possibilities. I noticed this happening particularly at the start of my final year, and some tutors commented on it. Most of the time it did not lead me to an incorrect diagnosis; it just meant that my differentials could have been a little more elaborate. Similarly, clinical observation and examination often produced the results I had anticipated. I did wonder how liable I was to finding, say, a restricted joint if I had previously thought it might be involved. The latter is probably a very common flaw of subjective palpatory experience (see my mate Niklas' dissertation on that one); the former, however, I soon managed to control. I think that most of my confirmation bias is due to the fact that it is simply easier to stick with a likely diagnosis than to continuously explore, evaluate and reassess numerous potential diagnoses. So I had to make myself do it. Essentially, it took awareness of my liability to confirmation bias, use of the VINDICATER forcing strategy and a simple habit of note-taking during the case history to do the trick: as I take the case history, I now jot down potential differentials on a separate sheet of paper in a mind-map or brainstorming fashion. This allows me to come back to certain ideas whilst still exploring other areas, without having to worry about thoughts slipping my mind or distracting me. In this way I improved my hypothetico-deductive reasoning, the important clinical-reasoning approach of a novice clinician, eventually building up to 'expertise', as discussed above (Thomson et al., 2011). Getting my fundamental differentials under control also allowed me to spend time and energy on exploring the broader patient context and to include more of it in my osteopathic evaluation. Thus, after the Autumn term, this was no longer an area of criticism.
Similarly, I managed to become more 'osteopathic' in my thinking, something that had also been suggested to me before.
In preparation for my first clinical exam in March, I developed my own cognitive forcing tool, based on previous feedback. The following sentence won't make sense to anyone else, but each word in it gets me to think of a particular area I should have paid attention to in the evaluation of a patient: 'How much more do you need in order to justify and differentiate from osteopathic concepts and what else?' Sounds random, I know. But my exam feedback says that I did not fail to gather relevant information, justify my differentials appropriately or demonstrate integration of osteopathic principles.
At this stage, let me digress from the discussion of bias and clinical reasoning a little, in order to discuss a more fundamental principle. The key to getting better at doing anything is not just practice, but practice while learning from mistakes.
Therefore, I spent a long time reflecting on clinical encounters, especially difficult ones. Every time I found myself saying 'Oh, I probably should have done this differently', there it was: another mistake. I wrote these accounts down in a blog and on paper during my third year. At this point it is probably good to give an example. I once treated a patient who very quickly became a little overly confident and familiar with me, thereby totally undermining my clinical authority. That is not to say that patients should be scared of me, God forbid, but a little respect... It got to the point that the patient was swearing in the treatment room as if he were in a pub. Not at me, but in my presence, as if I were an old mate of his. On reflection, I realised that during our first encounter I had been excessively keen to make him feel comfortable and to gain his amiability rather than his respect. Ever since, I have introduced myself to new patients with my full name and judged carefully how much personal chit-chat I want in the first session. This leaves some professional distance between me and the patient, which I can always decrease but which is hard to regain once lost.
This example makes another important point: clinical decision-making should not only be about pathology and medication. In an individualised health-care approach such as osteopathy, it is as much about the patient-practitioner interaction as it is about 'classical' medical decisions.
5. Conclusion
If nothing else, this essay achieved one thing: it sharpened my awareness of the fact that bias exists and that we are all biased. It got me to put my thoughts into words, and it will contribute to promoting change in my actions. As we say in German: Einsicht ist der erste Schritt zur Besserung (literally: insight is the first step towards improvement).
References
Arnoult, L.H., Anderson, C.A., 1988. Identifying and reducing causal reasoning biases in clinical practice, in: Turk, D.C., Salovey, P. (Eds.), Reasoning, Inference, and Judgment in Clinical Psychology. Free Press, New York, NY, US, pp. 209–232.
Bornstein, B.H., Emler, A.C., 2001. Rationality in medical decision making: a review of the literature on doctors’ decision-making biases. J. Eval. Clin. Pract. 7, 97–107. doi:10.1046/j.1365-2753.2001.00284.x
Samuelson, W., Zeckhauser, R., 1988. Status quo bias in decision making. J. Risk Uncertain. 1, 7–59. doi:10.1007/BF00055564
Thomson, O.P., Petty, N.J., Moore, A.P., 2011. Clinical reasoning in osteopathy – More than just principles? Int. J. Osteopath. Med. 14, 71–76. doi:10.1016/j.ijosm.2010.11.003