Topic: Cognitive Bias in Critical and Creative Thinking (in EMT) by Mdm. Valentine, Narok Campus.
Objectives. By the end of the lesson, the student should be able to:
- Define cognitive bias and explain how it influences perception and decision-making in clinical practice.
- Identify and correct common myths and misconceptions about cognitive bias in human thinking.
- Analyze the impact of cognitive bias on critical thinking and clinical judgment.
- Apply effective de-biasing strategies to minimize errors and improve objectivity in decision-making.
Intro. Definition: Cognitive bias is a systematic error in thinking that affects how people perceive, interpret, and make decisions. It occurs when the brain relies on shortcuts or personal experiences instead of objective reasoning, leading to judgments that deviate from logical or factual accuracy. Cognitive biases can lead to faulty conclusions, misdiagnosis, or inappropriate interventions, especially in high-stress or time-sensitive situations.
Cont. Myths About Cognitive Bias Cognitive bias is a normal part of human thinking — but many misconceptions exist about what it is, who it affects, and how it can be controlled. Understanding these myths helps healthcare students and professionals recognize their own limitations and make better, more objective clinical decisions.
Cont. 1. Myth: Only unskilled or inexperienced people are biased. Many people believe that only those who lack knowledge or experience fall into bias. This is false because bias is not about intelligence or training—it’s a feature of how the human brain works.
Cont. Reality: -Everyone, regardless of education or skill level, is prone to cognitive bias. Even highly trained clinicians, doctors, and EMTs can make biased judgments, especially in high-pressure situations such as emergency care. -For example, an experienced EMT might assume that a patient with shortness of breath has asthma because most cases they’ve seen were asthma, overlooking other possible causes like heart failure or anxiety. -Professional experience helps decision-making but does not make one immune to bias. Awareness and reflection are key to managing it.
2. Myth: Cognitive biases can be completely eliminated. -Some believe that through training or experience, one can fully get rid of cognitive biases. Reality: -Cognitive biases are natural mental shortcuts (heuristics) that the human brain uses to process information quickly. Since they are built into our thinking patterns, they cannot be completely removed.
Cont. -However, they can be managed, minimized, or controlled through conscious awareness, structured reasoning, and evidence-based approaches. Even the most rational thinkers still experience subtle biases that influence perception and decision-making. -Instead of trying to eliminate bias, focus on recognizing it and applying strategies to reduce its negative effects.
Cont. 3. Myth: Being aware of cognitive bias automatically prevents it. -Many think that once they learn about cognitive biases, they will no longer fall victim to them.
Cont. Reality: -Awareness is only the first step. Knowing about bias doesn’t stop it from influencing decisions. Research shows that people continue to make biased choices even after being trained about bias. -To effectively reduce bias, one must actively practice de-biasing techniques, such as taking time to reconsider assumptions, seeking feedback, and using structured diagnostic checklists.
Cont. Example: An EMT may know about anchoring bias but still fixate on an initial diagnosis during a stressful emergency unless they deliberately pause and reassess the situation. Knowledge of bias is necessary but not sufficient — continuous self-monitoring and reflective practice are required.
Cont. 4. Myth: Cognitive biases are always bad. -It is commonly assumed that all biases lead to errors and must be avoided completely. Reality: Not all biases are harmful. In fact, some cognitive shortcuts are useful, especially in emergency care where quick decision-making is critical.
Cont. For example, using pattern recognition can help an EMT quickly identify cardiac arrest and start resuscitation without delay. This type of intuitive reasoning is a helpful form of cognitive bias. However, when these shortcuts are applied inappropriately or without verification, they can lead to serious diagnostic errors. Some biases can improve efficiency when used correctly, but clinicians must know when to switch from intuitive thinking to analytical thinking.
Cont. 5. Myth: Rational or intelligent people don’t experience bias. It’s often believed that people who are logical, analytical, or highly educated are free from bias. Reality: -Cognitive bias affects everyone, including intelligent and rational individuals. In fact, intelligent people may be better at rationalizing their biased decisions because they can use logic to defend their initial assumptions.
Cont. This makes it harder for them to see their mistakes or accept alternative viewpoints. -A highly skilled clinical officer might ignore a colleague’s suggestion because they are confident in their own reasoning, displaying overconfidence bias. -Intelligence does not protect anyone from bias. The best professionals remain humble, open-minded, and willing to re-evaluate their conclusions.
Cont. 6. Myth: Biases only affect personal opinions, not professional or clinical decisions. Some students or practitioners believe that bias is only seen in personal beliefs, not in clinical reasoning or medical practice. Reality: -Biases directly affect clinical judgment, diagnosis, and patient interaction. A clinician’s assumptions about a patient’s background, behavior, or symptoms can unconsciously influence diagnosis and treatment.
Cont. For instance, a healthcare worker may underestimate a patient’s pain because of stereotypes about gender, age, or social class. This can lead to poor care and ethical issues. Cognitive bias is not just a personal problem—it’s a clinical one that affects patient outcomes and must be actively addressed in healthcare training and practice.
Cont. 7. Myth: Biases are easy to detect in yourself. Many people assume they can easily identify when they are being biased. Reality: -Biases usually operate unconsciously; people are rarely aware when their thinking is being influenced, which is why self-detection alone is unreliable.
Cont. We can often recognize bias in others but fail to see it in ourselves, a phenomenon known as the “bias blind spot.” -Because self-awareness alone isn’t reliable, it’s important to use teamwork, feedback, and structured reflection to detect and correct biases.
Cont. Impact of Cognitive Bias on Critical Thinking 1. Impaired Objectivity Cognitive biases can cause healthcare students and practitioners to ignore factual evidence or overvalue personal experiences.
Cont. For example, an EMT might assume a patient with chest pain is having a heart attack because “most chest pain cases are cardiac,” even when signs point to another cause (like anxiety or indigestion). This prevents objective analysis and leads to premature closure in reasoning.
Cont. 2. Poor Decision-Making Critical thinking requires analyzing alternatives and weighing evidence. Biases can distort how options are perceived. Anchoring bias causes reliance on the first piece of information received (e.g., the patient’s initial report), leading to misdiagnosis. Confirmation bias makes a student seek only information that supports their first impression while ignoring contradictory data. These biases limit sound clinical judgment and reduce decision accuracy.
Cont. 3. Reduced Problem-Solving Ability Critical thinking involves flexibility and creativity in solving problems. Biases narrow one’s mental perspective. For instance, functional fixedness—seeing objects or procedures as usable only in traditional ways—can prevent innovative problem-solving during emergencies. This leads to rigid thinking, limiting adaptive responses during complex or unexpected clinical situations.
Cont. 4. Inaccurate Risk Assessment Cognitive biases distort how risks are perceived. Availability bias makes people overestimate the likelihood of events they have recently experienced or heard about. For example, after handling a severe case of poisoning, a clinician might overdiagnose poisoning in other patients with unrelated symptoms. This leads to misprioritization of patient needs and resource misuse.
Cont. 5. Compromised Clinical Reasoning and Judgment Biases interfere with the logical sequence of assessment, diagnosis, and management. Overconfidence bias can make a student disregard peer input or guidelines, assuming their judgment is always correct. This undermines the self-evaluation and reflection necessary for effective critical thinking.
Cont. 6. Emotional and Social Distortions Some biases are emotionally or socially driven—such as halo effect (judging based on one positive trait) or stereotyping. A student might treat certain patients more favorably or unfairly due to preconceived notions. This not only reduces fairness and empathy but also affects ethical and rational decision-making.
Cont. 7. Hindrance to Learning and Professional Growth Critical thinking thrives on open-mindedness, questioning, and evidence-based reflection. When biases go unchecked, students may resist feedback or avoid changing incorrect beliefs. This limits academic growth, clinical competence, and professional development.
Summary Table

Impact Area | Result of Cognitive Bias | Effect on Critical Thinking
Objectivity | Decisions based on assumptions | Reduced evidence-based reasoning
Decision-Making | Anchoring and confirmation errors | Faulty clinical judgment
Problem-Solving | Mental rigidity | Limited creativity
Risk Assessment | Over/underestimating risk | Misprioritization
Clinical Judgment | Overconfidence, shortcuts | Compromised reasoning
Emotional Influence | Stereotyping, favoritism | Unethical decisions
Learning | Resistance to correction | Stunted intellectual growth
Cont. De-biasing Strategies in Clinical and Critical Thinking De-biasing means applying deliberate methods to recognize and minimize bias in thinking and decision-making. These strategies promote logical, reflective, and evidence-based practice among healthcare students.
Cont. 1. Self-Awareness and Reflection Recognizing personal biases is the first step. Engage in metacognition (thinking about your thinking). Keep a reflective journal to record decisions and evaluate what influenced them.
Cont. Example: After misjudging a case due to assumptions, a student reflects on how that bias occurred and how to avoid it next time. Impact: Promotes mindfulness and objectivity in reasoning.
Cont. 2. Use of Checklists and Clinical Guidelines Structured approaches such as protocols or decision algorithms ensure all possibilities are considered systematically. Example: Using a trauma assessment checklist ensures every body system is examined rather than focusing on visible injuries only. Impact: Encourages evidence-based and consistent decision-making.
Cont. 3. Seeking Feedback and Peer Review Receiving feedback from supervisors or colleagues helps identify blind spots. Example: Presenting a case for peer review can reveal anchoring or confirmation bias. Impact: Enhances learning and accountability in clinical reasoning.
Cont. 4. Slowing Down the Thinking Process (Cognitive Forcing) When time allows, slow down and challenge first impressions. Example: Before confirming "appendicitis," a student asks, "Could this be a urinary tract infection or gastroenteritis?" Impact: Encourages thorough evaluation and reduces premature conclusions.
Cont. 5. Exposure to Diverse Clinical Experiences Working in varied clinical areas broadens knowledge and reduces stereotypes. Example: Rotations in pediatric, obstetric, and surgical wards expose learners to diverse patient presentations. Impact: Reduces overgeneralization and promotes balanced reasoning.
Cont. 6. Training in Critical Thinking and Evidence-Based Practice Formal education on reasoning, logic, and research analysis equips learners to challenge bias. Example: Reviewing literature on disease presentations helps students rely on scientific data rather than personal impressions. Impact: Strengthens analytical and evidence-based reasoning skills.
Cont. 7. Promoting Open Dialogue and Team Communication Encouraging respectful debate and questioning reduces groupthink and authority bias. Example: Students discussing differential diagnoses openly during ward rounds ensures multiple viewpoints are considered. Impact: Improves collaboration and decision accuracy.
Cont. 8. Simulation and Case-Based Learning Simulation exercises allow learners to experience biases in safe environments and reflect on them. Example: In a simulated cardiac arrest, students learn to avoid assuming “the patient is dead” before completing resuscitation steps. Impact: Builds real-world awareness of bias and supports practical reasoning.
Cont. 9. Mindfulness and Emotional Regulation Emotional control helps prevent rash or emotionally charged decisions. Example: Taking deep breaths before responding to a difficult patient prevents emotional bias. Impact: Enhances calmness, fairness, and objectivity.
Cont. 10. Continuous Learning and Mentorship Regular training, workshops, and mentorship support lifelong growth. Example: Mentors can help students reflect on difficult cases and identify areas of bias. Impact: Encourages continuous self-improvement and critical awareness.