
Decision Science · Cognitive Psychology

Why Humans Make Bad Decisions

A comprehensive taxonomy of decision errors — spanning knowledge gaps, cognitive biases, psychological forces, social pressures, and the systems we operate within.

Humans make bad decisions because they see reality incompletely (epistemic), interpret it wrongly (cognitive), distort it emotionally (psychological), are shaped by their environment (social), operate under misaligned incentives (structural), are constrained by time and stress (temporal), and rarely question their own thinking (meta-cognitive). No single cause explains everything — most poor decisions involve several of these layers at once.

A note on evidence quality: Psychology has undergone a significant replication crisis since ~2010. Some classic findings in this field — particularly around ego depletion, power poses, and certain priming effects — have not held up under rigorous pre-registered replication. Where entries in this guide touch on contested or partially replicated research, that is noted on the card. The majority of findings here (cognitive biases, social influence, epistemic failures) are well-replicated and drawn from mainstream cognitive and social psychology.

01 ● Epistemic
Knowledge Failures
Problems with the raw information we hold

Before any reasoning begins, the quality of our information sets hard limits on the quality of our decisions. Garbage in, garbage out — but the garbage often looks perfectly clean from the inside.

Factual Ignorance knowledge
Simply not knowing a relevant fact. The most basic failure — deciding without the information the decision requires.
Investing in a sector without understanding its regulatory environment.
Outdated Information knowledge
Acting on facts that were once true but have since changed. A particularly dangerous failure because it feels like competence.
Applying a negotiation strategy that worked five years ago in a market that has fundamentally shifted.
Information Asymmetry access
One party has access to information another doesn't. You cannot compensate for what you cannot see.
Buying a used car from a seller who knows the vehicle's full history while you don't.
See: George Akerlof, "The Market for Lemons" (1970)
Analysis Paralysis overload
Too much information triggers decision avoidance or indefinite delay. Paradoxically, more data can lead to worse or no decisions.
Unable to choose a health insurance plan after reading 200 pages of options; defaulting to the worst plan by inaction.
See: Barry Schwartz, "The Paradox of Choice"
Authority Bias in Sources trust
Trusting a source because of who it is rather than evaluating the quality of its claims. Credentials and confidence are not the same as correctness.
Following a financial influencer's advice without checking their track record or conflicts of interest.
Unknown Unknowns blind spots
The information you don't know you're missing. The hardest epistemic failure to defend against because you can't search for what you don't know exists.
Launching a product in a new country without knowing a local competitor already owns that market segment.
See: Nassim Taleb, "The Black Swan"
Selective Exposure filter bubbles
Systematically consuming only information that confirms existing beliefs, creating an increasingly distorted model of reality over time.
Only reading media outlets that align with your political views, making your beliefs increasingly extreme and unfounded.
02 ● Cognitive
Cognitive Biases & Reasoning Errors
How our minds systematically distort what we see

Even with perfect information, the human mind introduces systematic errors at the processing stage. These are not random mistakes — they are predictable, repeatable distortions built into how cognition works. Kahneman's System 1 vs System 2 framework is the canonical lens here.

Confirmation Bias bias
Seeking, interpreting, and remembering information in ways that confirm what you already believe. The mind becomes a lawyer, not a judge. One of the most robustly replicated findings in cognitive psychology — it appears across cultures, domains, and expertise levels.
A manager convinced a hire is talented ignores clear warning signs and over-weights every small success.
See: Kahneman, "Thinking, Fast and Slow"
Availability Heuristic heuristic
Judging the probability of something by how easily an example comes to mind. Vivid or recent events are dramatically overweighted.
Fearing plane crashes more than car accidents because crashes get media coverage, despite cars being statistically far more dangerous.
Anchoring Effect bias
The first number or piece of information seen disproportionately influences all subsequent judgments, even when the anchor is arbitrary.
A salary negotiation that starts at £40k anchors the entire conversation there, even if the role is worth £60k.
Loss Aversion bias
Losses feel more painful than equivalent gains feel pleasurable — Kahneman & Tversky estimated a roughly 2:1 ratio in their original work, though subsequent research shows this varies considerably by individual and context. The consistent finding is the asymmetry, not a fixed multiplier.
Holding a failing stock far too long to avoid "locking in" a loss, even when selling and reallocating would be clearly better.
See: Kahneman & Tversky, Prospect Theory (1979)
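The asymmetry can be made concrete with prospect theory's value function. A minimal Python sketch, using the median parameters Tversky & Kahneman estimated in their 1992 follow-up (α = β ≈ 0.88, λ ≈ 2.25); these are illustrative values, since the parameters vary considerably by individual:

```python
# Sketch of the prospect theory value function. Parameters are the
# median estimates from Tversky & Kahneman (1992); illustrative only.

def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Subjective value of a gain or loss of size x (relative to a reference point)."""
    if x >= 0:
        return x ** alpha             # gains: concave (diminishing sensitivity)
    return -lam * ((-x) ** beta)      # losses: convex and steeper (loss aversion)

gain = prospect_value(100)
loss = prospect_value(-100)
print(abs(loss) / gain)  # with alpha == beta, the ratio is exactly lam = 2.25
```

The point is not the exact multiplier but the shape: because the loss branch is scaled by λ > 1, a loss of any given size always looms larger than an equal gain.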
Overconfidence Bias bias
Systematically overestimating the accuracy of your own knowledge, predictions, and abilities. Experts are frequently more overconfident, not less.
A founder's financial projections are consistently 40% more optimistic than reality, every single year.
See: Philip Tetlock, "Superforecasting"
Sunk Cost Fallacy reasoning
Continuing an endeavor because of previously invested, unrecoverable resources (time, money, effort) rather than on the basis of future expected value.
Finishing a bad book, staying in a failing project, or running a product no one wants because "we've already invested so much."
Correlation ≠ Causation logic
Concluding that because two things happen together, one causes the other. One of the most common and consequential reasoning errors.
Concluding a product caused a sales increase when both were driven by a third factor (a viral moment) you hadn't noticed.
False Dilemma logic
Artificially reducing a complex situation to only two options when more exist. Creates unnecessary either/or pressure.
"We either fire half the team or shut down" — when refinancing, pivoting, or a staged restructure were viable alternatives.
Slippery Slope Fallacy logic
Assuming one step inevitably leads to extreme consequences without demonstrating the causal links in between.
Refusing a moderate policy because "it will inevitably lead to" an extreme outcome, without evidencing the mechanism.
Broken Mental Models frameworks
Using a mental model that was useful in one context in another where it doesn't apply. Maps that don't match the territory.
Applying a management style learned in a startup to a large corporation and wondering why nothing works.
Second-Order Blindness systems
Failing to anticipate the downstream consequences of consequences. Most unintended policy and strategic failures live here.
Reducing customer support costs without modeling what higher churn from poor support will cost in the long run.
See: Donella Meadows, "Thinking in Systems"
Narrow Framing frameworks
Defining the decision too narrowly so that the real range of options is never considered. The "whether or not" trap.
Asking "Should we hire this person?" instead of "What are all the ways we could solve this capacity problem?"
See: Heath & Heath, "Decisive"
Halo Effect bias
One positive trait (attractiveness, confidence, early success) causes you to assume other unrelated positive traits in someone or something.
Hiring a candidate because they went to a prestigious university without rigorously evaluating the actual skills required.
03 ● Psychological
Psychological & Emotional Factors
When internal states override rational judgment

Emotions are not the enemy of good decisions — they are an essential signal. But unregulated or unexamined emotional states systematically distort what we perceive as rational. Trauma, fear, ego, and bodily states all shape decisions in ways we rarely acknowledge in the moment.

Fear-Based Decisions emotion
Decisions driven primarily by avoiding a feared outcome rather than pursuing a valued one. Fear prioritizes the short-term and the visible.
Staying in a miserable job for years because the fear of failure during a career change feels overwhelming.
Ego & Pride ego
Protecting self-image overrides updating on evidence. Admitting a mistake, reversing a decision, or asking for help is experienced as a threat to identity.
A leader doubles down on a failed strategy publicly rather than reversing course, because reversing feels like admitting they were wrong.
Physiological State Effects physiology
Hunger, anger, fatigue, and loneliness measurably degrade judgment quality. The underlying physiology (sleep deprivation reducing prefrontal function, blood glucose effects on self-control) is well-documented. Note: Baumeister's broader "ego depletion" theory — that willpower is a depletable resource like a muscle — has faced significant replication failures since 2015 and remains contested; the specific mechanism is debated even if the general phenomenon is real. The HALT acronym (Hungry, Angry, Lonely, Tired) originates in addiction recovery frameworks but maps usefully onto decision science.
Sending an aggressive email at midnight after a stressful day that you would never have sent at 10am well-rested.
See: Harrison & Horne on sleep deprivation (2000); Hagger et al. pre-registered replication (2016) for the ego depletion debate
Trauma-Informed Reactivity trauma
Past traumatic experiences create automatic responses that are appropriate for the past situation but misapplied in the present.
Avoiding all confrontation in a new, healthy workplace because past environments made conflict dangerous.
Present Bias / Hyperbolic Discounting impulse
The present is dramatically overweighted relative to the future. Immediate rewards are systematically preferred over larger future rewards.
Eating the cake today despite committing yesterday to a health goal you still care about. Both desires are real; recency wins.
See: Thaler & Sunstein, "Nudge"
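The signature of hyperbolic (as opposed to exponential) discounting is preference reversal: the smaller-sooner reward wins when it is immediate, but the larger-later one wins when both are distant. A minimal sketch using the standard one-parameter hyperbolic model V = A / (1 + kD) (Mazur's formulation); the k and delta values here are illustrative, not empirical estimates:

```python
# Preference reversal under hyperbolic vs exponential discounting.
# k and delta below are illustrative values, not measured constants.

def hyperbolic(amount, delay_days, k=0.2):
    """Present value under hyperbolic discounting: V = A / (1 + k*D)."""
    return amount / (1 + k * delay_days)

def exponential(amount, delay_days, delta=0.95):
    """Present value under time-consistent exponential discounting."""
    return amount * delta ** delay_days

# $100 sooner vs $110 one day later, viewed from two vantage points.
for shift in (0, 30):
    hyp = hyperbolic(100, shift) < hyperbolic(110, shift + 1)
    exp = exponential(100, shift) < exponential(110, shift + 1)
    print(f"delay {shift:>2}d: hyperbolic prefers later={hyp}, exponential prefers later={exp}")
```

The hyperbolic agent takes $100 when it is available now but chooses $110 when both options are a month away: the same tradeoff, reversed by proximity. The exponential agent's preference never flips, which is why hyperbolic discounting is the standard model for "I'll start the diet tomorrow" behavior.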
Anxiety-Driven Avoidance emotion
Anxiety causes avoidance of the very situations needed to resolve the anxiety. Decisions are deferred or never made, compounding the problem.
Not opening financial statements because they cause anxiety, which allows financial problems to compound unchecked.
Dopamine-Driven Short-Termism reward
The brain's reward system prioritizes immediate dopamine hits over long-term wellbeing, creating predictable preference for novelty and immediacy.
Scrolling social media instead of working on a meaningful project, repeatedly, even when you genuinely prefer the project's outcome.
Emotional Reasoning emotion
Treating feelings as direct evidence about external reality. "I feel afraid, therefore this situation is dangerous."
Concluding a business idea is bad because it creates anxiety, when the anxiety comes from novelty rather than risk.
04 ● Structural
Structural & Incentive Failures
When the system itself generates bad decisions

Often the individual is not the problem. The incentive structure, measurement system, or organizational design produces predictably bad decisions regardless of who is making them. Fix the person all you want — the system will win.

Incentive Misalignment incentives
What is rewarded diverges from what is actually valuable. People rationally pursue the incentive, not the intended outcome.
Sales teams maximizing commissions by selling inappropriate products to customers, because that's what the compensation structure rewards.
See: Charlie Munger on incentive-caused bias
Goodhart's Law measurement
"When a measure becomes a target, it ceases to be a good measure." Optimizing for the metric destroys the goal the metric was supposed to represent.
Teaching to standardized tests instead of teaching genuine understanding, because test scores became the official measure of educational quality.
Short-Term Incentive Horizon time
Quarterly earnings pressure, annual bonuses, election cycles — structural forces that reward short-term thinking at the expense of long-term value.
Cutting R&D budget to hit this quarter's profit target, while destroying the innovation pipeline for the next five years.
Institutional Inertia systems
Large systems resist change because decision-makers are insulated from the consequences of bad decisions. "We've always done it this way" becomes policy.
A hospital system continuing an inefficient process for 20 years because no one person has both the authority and incentive to change it.
Resource Scarcity Thinking constraint
Operating under scarcity (time, money, cognitive bandwidth) narrows attention and causes poor tradeoffs. Scarcity itself reduces decision quality.
Payday loan borrowing: financial scarcity reduces cognitive bandwidth, leading to decisions that worsen financial scarcity.
See: Mullainathan & Shafir, "Scarcity"
Diffusion of Responsibility systems
When many people share responsibility, each assumes someone else is handling it. Group decision-making can paradoxically produce no real decision.
A critical product bug goes unfixed for months because every team assumed another team owned the resolution.
05 ● Social
Social & Cultural Influences
The invisible pressure of belonging and norms

Humans are intensely social animals. What our group believes, what our culture teaches, and what our community rewards shapes our decisions at a level far deeper than conscious reasoning. We underestimate this constantly.

Groupthink social
The desire for group harmony overrides realistic appraisal of alternatives. Dissenting voices are silenced; the group converges on a shared illusion of consensus.
A board unanimously approves an acquisition no individual member privately believed was wise, because no one wanted to be the dissenter.
See: Irving Janis, "Victims of Groupthink" (1972)
Authority Obedience social
People defer to authority figures — even when those authorities are wrong, harmful, or acting outside their domain of competence.
A team implements a strategy they all know is flawed because the CEO is visibly committed to it and no one will say so.
See: Stanley Milgram, "Behavioral Study of Obedience," Journal of Abnormal and Social Psychology (1963)
Social Proof Bias social
Using what others are doing as the primary guide for what to do, especially under uncertainty. "If everyone is doing it, it must be right."
Joining a speculative investment frenzy because "everyone" seems to be making money, without independently evaluating the fundamentals.
See: Cialdini, "Influence"
Social Approval Motivation social
Making decisions that maximize social approval rather than personal values or objective outcomes. Identity management replaces judgment.
Choosing a career path parents approve of rather than one that aligns with genuine skills and interests.
Cultural Conditioning culture
Deep assumptions inherited from culture — about risk, gender, hierarchy, time, success — shape decisions invisibly and are rarely examined.
A culture that equates busyness with virtue systematically overworks its members and devalues rest, recovery, and deep thinking.
See: Jonathan Haidt, "The Righteous Mind"
Peer Comparison Trap social
Using a reference group's choices to set your own goals rather than independently determining what you actually value.
Buying a house you can't comfortably afford because peers at the same career stage are buying houses.
Family & Childhood Conditioning culture
Patterns, beliefs, and behavioral scripts formed in childhood persist as defaults in adult decision-making, often without conscious awareness.
Consistently avoiding conflict in professional settings because conflict was dangerous in your childhood environment.
06 ● Temporal
Temporal Pressures
How time and stress degrade thinking

The conditions under which a decision is made matter as much as the decision itself. Time pressure, stress, and cognitive load systematically shift processing from careful deliberation to reactive shortcuts.

Time Pressure Shortcuts stress
Urgency compresses deliberation time and forces reliance on heuristics, the first option considered, or the loudest voice in the room.
A hiring decision made in 48 hours due to "business urgency" that would have been made very differently with a normal timeline.
Decision Fatigue depletion
The quality of decisions may deteriorate over a sequence of choices, with later decisions defaulting toward inaction or the easier option. This is plausible and widely observed, though the specific psychological mechanism (ego depletion) remains debated after replication failures. The broader principle — that cognitive resources are finite and depleted by use — is well-supported in applied settings.
The widely-cited Danziger et al. (2011) parole study found judges granted parole far more often after breaks than before them. However, later re-analyses suggested scheduling patterns (simpler cases after breaks) may partially explain the effect — making it a more complex finding than originally reported.
See: Danziger et al. (2011); Weinshall-Margel & Shapard (2011) for methodological critique
Planning Fallacy time
Systematically underestimating how long tasks will take and how much they will cost, even when you have direct evidence from past projects.
The Sydney Opera House was originally estimated to be complete in 6 years (by 1963); it took 16, finishing in 1973 at roughly 14× its original budget.
See: Kahneman & Tversky (1979)
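Kahneman & Tversky's proposed corrective is the "outside view": instead of trusting the inside estimate, scale it by the distribution of past overruns (later formalized as reference class forecasting). A sketch of the arithmetic, using made-up illustrative project records rather than real data:

```python
# Outside-view correction for the planning fallacy (reference class
# forecasting). The project history below is hypothetical illustrative
# data, not real records.
from statistics import median

past_projects = [  # (estimated_weeks, actual_weeks)
    (10, 16), (8, 13), (20, 29), (6, 11), (12, 17),
]

# How badly did comparable projects overrun their estimates?
overruns = [actual / est for est, actual in past_projects]
multiplier = median(overruns)

inside_estimate = 9  # weeks: what the team believes "this time"
outside_estimate = inside_estimate * multiplier
print(f"median overrun x{multiplier:.2f} -> plan for ~{outside_estimate:.0f} weeks")
```

The correction deliberately ignores the specifics of the current plan: the inside view already priced those in, and history shows the pricing was systematically optimistic.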
Urgency Bias attention
Urgent but less important tasks consistently crowd out important but non-urgent ones. The tyranny of the inbox over the strategic.
Spending all day responding to emails while the company's core strategy goes unexamined for another quarter.
See: Stephen Covey popularized this in "The 7 Habits" (1989); the underlying attention-allocation problem is documented in time-use and productivity research
Ignoring Long-Term Consequences time
Systematically discounting what happens beyond a certain time horizon. Distant negative consequences feel unreal while immediate benefits feel vivid.
Climate policy: short-term economic disruption feels real and present; long-term catastrophe feels abstract and deferrable.
07 ● Meta-Cognitive
Meta-Cognitive Failures
Not knowing how you think

The deepest layer. Even someone who has read everything above can still fail to examine their own thinking in practice. Metacognition — thinking about thinking — is the master skill that enables correcting all other errors. Its absence makes every other failure invisible.

Blind Spots About Blind Spots meta
The least self-aware people are consistently the most confident they are self-aware. The bias blind spot: knowing about biases does not make you less susceptible to them.
Someone who has read extensively about cognitive biases and believes this makes them immune to them — while exhibiting them consistently.
See: Pronin, Lin & Ross (2002)
Absence of Self-Questioning meta
Never asking "Why do I believe this?" or "What would it take for me to be wrong?" This is the habit that keeps every other error in place.
Building an entire strategy on an assumption that has never once been explicitly articulated or tested.
Rationalization Masquerading as Reasoning meta
The conclusion is chosen first (by emotion, identity, or desire), then reasons are constructed to justify it. This feels exactly like rational deliberation.
Deciding to hire someone you liked in the interview, then constructing a post-hoc case for why they're the most qualified candidate.
See: Jonathan Haidt, Social Intuitionist Model (2001) — originally developed for moral judgment, but the post-hoc rationalization pattern is broadly observed across decision domains
Identity-Protective Cognition meta
Beliefs that are tied to group identity become nearly impossible to update through evidence. Changing the belief would mean leaving the group.
A political belief held not because of evidence but because it defines membership in a tribe that provides meaning and belonging.
See: Dan Kahan, Yale Cultural Cognition Project
Fixed Mindset About Judgment meta
Believing that decision-making ability is fixed — that you either have good judgment or you don't — forecloses any deliberate effort to improve it. Note: Dweck's growth mindset research, while conceptually influential, has shown mixed results in large-scale replications; the specific intervention effects (e.g. in schools) are more modest than early studies suggested. The core insight about beliefs shaping effort and learning remains well-supported.
Never reviewing past decisions to extract lessons because the whole premise of "decision quality" as a learnable skill seems alien.
See: Carol Dweck, "Mindset"
Absence of Pre-Mortem Thinking meta
Never imagining, before committing, how a decision could fail. This forecloses identifying preventable failure modes while there's still time to act.
Launching a product without anyone on the team ever explicitly asking "How does this fail? What would have to be true for this to go wrong?"
See: Gary Klein, pre-mortem method
No Decision Hygiene Practice meta
Making important decisions without any structured process — no checklists, no devil's advocate, no cooling periods, no written pre-commitments.
A hiring committee with no rubric, no structured interview, and no calibration process — making consequential decisions entirely by feel.