AAOS Now

Published 6/1/2020

Patient Cognitive Error and Bias Can Impact Safety and Health

Editor’s note: This article is the first in a two-part series based on a roundtable discussion of cognitive error and bias. The second installment will appear in the July issue of AAOS Now.

David C. Ring, MD, PhD, FAAOS, led a roundtable discussion to address patient cognitive error and bias and how they can impact illness, such as by increasing the intensity of symptoms and the magnitude of limitations.

Dr. Ring: The mind is designed to take shortcuts. The ability to make snap judgments and fall back on our intuition is useful in times of danger. In summarizing their Nobel Prize-winning work, Kahneman and Tversky labeled this type of thinking system 1. It’s quick and easy. It takes no effort and expends no energy. It is also often incorrect. Mental shortcuts can lead to errors. Psychologists refer to mental shortcuts as cognitive biases and the resulting erroneous or unhelpful thoughts as cognitive errors. Pilots and surgeons complete checklists because our minds are prone to these types of errors. When I did a carpal tunnel release when I was meant to do a trigger finger release, it was because a series of events biased my mind toward the former, leading to the error. Now I sign my site, cut through my initials, and go through a checklist just prior to incision.

As part of a quality-improvement project, I reviewed my first 200 visits in my new setting and estimated that in about 85 percent of those visits, I could have guided people through at least one important misconception. A common one was that hurt means harm. Another one was that recovery was taking too long.

People who develop and study decision aids and other strategies for gentle correction of common misconceptions have started to frame these cognitive errors or cognitive biases as a quality and safety issue: They talk about the risk of misidentification of patient preferences.

The idea is that you can’t just listen to a person’s stated preference and assume it is consistent with what matters most to them (their values) because they may not have reflected on their values and may not be aware of the misconceptions that are coloring their stated preferences. That’s a matter of cognitive bias.

As an example, most people have a core value that they would like to avoid injury, the strategic injury of surgery included. Nevertheless, when experiencing pain, crepitation, deformity, and radiographic displacement after a clavicle shaft fracture, it may seem like surgery is required. But nonoperative treatment is a good option. Distal biceps rupture is similar. People often believe they need a biceps repair so they can flex their elbow, not knowing that they have a brachialis muscle, which performs that task well. The stated preference is to fix it, but their value is to avoid surgery if possible. If we act on their stated preferences without encouraging them to reflect on their values and without correcting common misconceptions, then we might misdiagnose their true preferences.

Nina R. Lightdale-Miric, MD, FAAOS: I notice this in my work with people who have congenital hand differences. The patient’s perspective may sometimes be, “I don’t want to have the surgery; I just don’t want to have this congenital hand difference.” Which can then lead to, “You told me you would fix it” when it does not look like a typical hand after surgery. The patient may be biased to see us as a fixer: “I asked you to fix it.”

Dr. Ring: The stated preference might be to not have this injury or disease, and that is not an option. The values that underlie that stated preference can create reasonable options. For instance, “I would like my right hand to look a little more like my left,” or “I would like this to be less noticeable,” or “I would like to be able to do this task a little bit easier or more comfortably.”

With this type of achievable goal established, we can focus on the probability that a certain intervention, exercise, injection, or surgery can achieve that goal. Or perhaps achieving those goals is possible through adaptation, resiliency, and reorientation of misconceptions. We often discount the impact of less stress in our lives, less distress, and healthier thoughts about our condition.

William Shaffer, MD, FAAOS: Part of the problem is that we as surgeons may feel pressured to make these decisions as if we were in an emergency situation.

Dr. Lightdale-Miric: Fix now; not just fix, but fix now. I don’t feel fixed; maybe another doctor would have done a better job fixing me.

Dr. Shaffer: When I was still doing spine surgery, I was intentional and strategic about getting to know my patients over time before I ever talked about discretionary surgery. There is no way I can get to know a patient in one meeting, and if you are going to take care of something electively, you need that relationship. When I was in practice, I had two styles: the “immediate decision” style for the quadriplegic with an unstable cervical spine and the “let us do this over time” style for elective surgeries.

Todd S. Kim, MD, FAAOS: I appreciate the distinction we are making between a patient’s initially voiced preference for treatment and their underlying values. I think the question is, “How do you get to those values?” We have communication tools to try to do that, but those are not perfect. Instead of asking, “Do you want surgery?” one could ask, “What’s most important to you?”

Amy Franta, MD, FAAOS: When people come in with a misconception or bias, where does it come from? Where are they forming these opinions? Is it something to do with culture? Is it social media and online blogs that tell them they need to have their clavicle fixed?

Dr. Ring: I think part of it is the normal function of the human mind. Under the influence of pain or injury, your mind may be programmed to think, “I need to act.” I would add that marketing can also add to these misconceptions. We often tell people that they need us, and we medicalize and obfuscate many things. People recovering from a fracture may ask, “Do I need physical therapy?” instead of “What can I do to improve the stiffness?” This may medicalize the recovery process, most of which occurs naturally over time as people get back into their routines. My concern is that medicalizing it is part of commoditizing it and making it something that you have to pay for. It is portrayed as complex and difficult—something that you’re not expert enough to do on your own—which is often untrue.

Joel Mayerson, MD, FAAOS: An example of cognitive bias in my orthopaedic oncologic surgery practice is rotationplasty for childhood lower-extremity sarcomas. The idea and look of rotationplasty may be disconcerting to parents. Once they see a video of a happy, active child after this surgery, they may change their minds.

Dr. Lightdale-Miric: Another example is fear of surgery, especially if you have a family member who died during surgery or had a complication from surgery.

Dr. Mayerson: We sometimes encounter the idea that if you have surgery, the cancer will spread. This idea may arise from the fact that, in the past, we didn’t have CT scans, so we didn’t know the extent of the disease before surgery. Prior to modern cross-sectional imaging, people often succumbed quickly after surgery because they already had advanced disease.

Dr. Franta: An assumption of shared decision-making is that the patient is going to fully comprehend their options and fully weigh the risks and benefits that you’re presenting to them, and then they will make a rational decision.

Dr. Shaffer: They may not be making an entirely rational decision. You don’t know the mindset of the patient.

Dr. Mayerson: I think there’s also a misconception that people adequately understand what we tell them. There are issues of literacy, health literacy, and aptitude. We are not always well trained to guide people through those aspects of the patient education process.

Dr. Kim: It can be useful to explore the patient’s understanding of the problem. If I’ve seen the X-ray before I walk into the room, I know what the diagnosis is, and it’s tempting to tell them what they have and what they need without asking, “What’s your understanding of what’s going on?”

Dr. Mayerson: Often there is an anchoring bias toward what a patient was told by the referring doctor. Trying to convince people that they need care that contradicts what their doctor of 30 years may have discussed with them can be incredibly challenging.

Dr. Ring: For those of us who take care of people who came through the emergency room (ER), there’s always some unwinding that you have to do: “They told me I need surgery.”

Julie Balch Samora, MD, PhD, MPH, FAAOS: Unwinding is a good word. I feel like I do quite a bit of that. It seems easier to talk someone into surgery than to talk them out of surgery. I had a patient with a laceration on the lateral aspect of the proximal interphalangeal joint, with a stable exam and no sensory issues. I explained that he would do quite well with a bit of immobilization, and he and his mother balked, stating: “This suture job was just temporary. You’re supposed to take the sutures out and fix me. That’s what the ER folks said.” It was shocking to me that they didn’t believe me when I told them he didn’t need surgery. You’d think they’d be so relieved to not need surgery.

Dr. Ring: That’s what psychologists call cognitive fusion: when a thought becomes a fact. When you say, “Looks good, doesn’t need surgery,” you are now basically arguing with the patient. You are dismissing their reality.

This was a good exploration of how cognitive bias affects a patient’s health and decisions. We also noticed how cognitive bias can harm the patient-surgeon relationship. I can imagine a checklist that patients use to be sure they are not being misled by common and expected misconceptions—something similar to the types of question prompt lists that are suggested for people considering surgery. Something to activate our system 2 (or analytical) thinking.

This roundtable addressed how cognitive bias in the patient affects health and safety. In part two, we’ll address the other humans in the equation and how cognitive bias in surgeons affects health and safety.

Shared decision-making takeaways from the roundtable

Karl C. Roberts, MD, FAAOS

Surgeons, by nature, are busy and have significant time constraints in a normal day. As such, we often take shortcuts in decision-making when treating patients with common and known diagnoses. We tend to cut to the diagnosis and treatment portion of the patient visit without taking the time to communicate and properly identify a patient’s true values, which is essential for arriving at the correct treatment method through shared decision-making. We risk performing surgery even though identifying a patient’s misconceptions about that surgery may have changed the person’s preferences for treatment. Or we risk performing surgery on a patient who has unrealistic expectations about the outcome, which could affect satisfaction with the procedure or the surgeon. Understanding the potential risks of using cognitive shortcuts and how they may affect our patients in the setting of unrecognized misconceptions is a quality and safety issue.

Karl C. Roberts, MD, FAAOS, is program director of Spectrum Health at Michigan State University Orthopaedic Surgery Residency Program, as well as an AAOS Now Editorial Board member.