Before any reasoning begins, the quality of our information sets hard limits on the quality of our decisions. Garbage in, garbage out — but the garbage often looks perfectly clean from the inside.
Factual Ignorance (knowledge)
Simply not knowing a relevant fact. The most basic failure — deciding without the information the decision requires.
Investing in a sector without understanding its regulatory environment.
Outdated Information (knowledge)
Acting on facts that were once true but have since changed. A particularly dangerous failure because it feels like competence.
Applying a negotiation strategy that worked five years ago in a market that has fundamentally shifted.
Information Asymmetry (access)
One party has access to information another doesn't. You cannot compensate for what you cannot see.
Buying a used car from a seller who knows the vehicle's full history while you don't.
See: George Akerlof, "The Market for Lemons" (1970)
Analysis Paralysis (overload)
Too much information triggers decision avoidance or indefinite delay. Paradoxically, more data can lead to worse or no decisions.
Unable to choose a health insurance plan after reading 200 pages of options; defaulting to the worst plan by inaction.
See: Barry Schwartz, "The Paradox of Choice"
Authority Bias in Sources (trust)
Trusting a source because of who it is rather than evaluating the quality of its claims. Credentials and confidence are not the same as correctness.
Following a financial influencer's advice without checking their track record or conflicts of interest.
Unknown Unknowns (blind spots)
The information you don't know you're missing. The hardest epistemic failure to defend against because you can't search for what you don't know exists.
Launching a product in a new country without knowing a local competitor already owns that market segment.
See: Nassim Taleb, "The Black Swan"
Selective Exposure (filter bubbles)
Systematically consuming only information that confirms existing beliefs, creating an increasingly distorted model of reality over time.
Only reading media outlets that align with your political views, making your beliefs increasingly extreme and unfounded.
Even with perfect information, the human mind introduces systematic errors at the processing stage. These are not random mistakes — they are predictable, repeatable distortions built into how cognition works. Kahneman's System 1 vs System 2 framework is the canonical lens here.
Confirmation Bias (bias)
Seeking, interpreting, and remembering information in ways that confirm what you already believe. The mind becomes a lawyer, not a judge. One of the most robustly replicated findings in cognitive psychology — it appears across cultures, domains, and expertise levels.
A manager convinced a hire is talented ignores clear warning signs and over-weights every small success.
See: Kahneman, "Thinking, Fast and Slow"
Availability Heuristic (heuristic)
Judging the probability of something by how easily an example comes to mind. Vivid or recent events are dramatically overweighted.
Fearing plane crashes more than car accidents because crashes get media coverage, despite cars being statistically far more dangerous.
Anchoring Effect (bias)
The first number or piece of information seen disproportionately influences all subsequent judgments, even when the anchor is arbitrary.
A salary negotiation that starts at £40k anchors the entire conversation there, even if the role is worth £60k.
Loss Aversion (bias)
Losses feel more painful than equivalent gains feel pleasurable — Kahneman & Tversky estimated a roughly 2:1 ratio in their original work, though subsequent research shows this varies considerably by individual and context. The consistent finding is the asymmetry, not a fixed multiplier.
Holding a failing stock far too long to avoid "locking in" a loss, even when selling and reallocating would be clearly better.
See: Kahneman & Tversky, Prospect Theory (1979)
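The asymmetry can be sketched with the prospect-theory value function, using the median parameter fits from Tversky & Kahneman's 1992 follow-up study (λ ≈ 2.25, α = β ≈ 0.88). Treat the specific numbers as illustrative: as noted above, estimates vary considerably across studies.

```python
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value function (Tversky & Kahneman 1992 median fits).

    Gains are valued as x**alpha; losses as -lam * (-x)**beta, so a loss
    of a given size looms roughly twice as large as an equal gain.
    """
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** beta

gain = value(100)    # subjective value of winning 100
loss = value(-100)   # subjective value of losing 100
print(gain, loss, abs(loss) / gain)  # asymmetry ratio equals lam here
```

Because the gain and loss exponents are equal in this parameterization, the ratio of felt loss to felt gain comes out to exactly λ for any stake size; the qualitative point is the asymmetry, not the number.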
Overconfidence Bias (bias)
Systematically overestimating the accuracy of your own knowledge, predictions, and abilities. Experts are frequently more overconfident, not less.
A founder's financial projections are consistently 40% more optimistic than reality, every single year.
See: Philip Tetlock, "Superforecasting"
Sunk Cost Fallacy (reasoning)
Continuing an endeavor because of previously invested resources (time, money, effort) that cannot be recovered, rather than because of its future expected value.
Finishing a bad book, staying in a failing project, or running a product no one wants because "we've already invested so much."
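The corrective is mechanical: rank options by expected future value only, with sunk spending excluded from the comparison. A minimal sketch, with hypothetical figures:

```python
def best_option(options):
    """Rank options by expected FUTURE value only. Any money already spent
    is sunk: it is the same under every option, so it cannot change the
    ranking and is deliberately left out of the comparison.
    """
    return max(options, key=lambda o: o["future_value"] - o["future_cost"])

# Hypothetical choice after 120k has already gone into the current project.
options = [
    {"name": "finish current project", "future_value": 50_000, "future_cost": 40_000},
    {"name": "pivot to new project", "future_value": 90_000, "future_cost": 30_000},
]
print(best_option(options)["name"])  # the sunk 120k never enters the math
```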
Correlation ≠ Causation (logic)
Concluding that because two things happen together, one causes the other. One of the most common and consequential reasoning errors.
Concluding a product caused a sales increase when both were driven by a third factor (a viral moment) you hadn't noticed.
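The viral-moment example can be made concrete with a toy simulation: a hidden common cause drives two variables that never influence each other, yet their correlation comes out strongly positive. All numbers are made up for illustration.

```python
import random

random.seed(0)

# Hidden confounder: a "viral moment" drives both ad spend and sales.
# Neither variable causes the other, yet they correlate strongly.
n = 1000
viral = [random.gauss(0, 1) for _ in range(n)]
ad_spend = [v + random.gauss(0, 0.5) for v in viral]
sales = [v + random.gauss(0, 0.5) for v in viral]

def corr(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

print(corr(ad_spend, sales))  # strongly positive despite no causal link
```

Intervening on ad spend in this world would do nothing to sales; only acting on the confounder would.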
False Dilemma (logic)
Artificially reducing a complex situation to only two options when more exist. Creates unnecessary either/or pressure.
"We either fire half the team or shut down" — when refinancing, pivoting, or a staged restructure were viable alternatives.
Slippery Slope Fallacy (logic)
Assuming one step inevitably leads to extreme consequences without demonstrating the causal links in between.
Refusing a moderate policy because "it will inevitably lead to" an extreme outcome, without evidencing the mechanism.
Broken Mental Models (frameworks)
Using a mental model that was useful in one context in another where it doesn't apply. Maps that don't match the territory.
Applying a management style learned in a startup to a large corporation and wondering why nothing works.
Second-Order Blindness (systems)
Failing to anticipate the downstream consequences of consequences. Most unintended policy and strategic failures live here.
Reducing customer support costs without modeling what higher churn from poor support will cost in the long run.
See: Donella Meadows, "Thinking in Systems"
Narrow Framing (frameworks)
Defining the decision too narrowly so that the real range of options is never considered. The "whether or not" trap.
Asking "Should we hire this person?" instead of "What are all the ways we could solve this capacity problem?"
See: Heath & Heath, "Decisive"
Halo Effect (bias)
One positive trait (attractiveness, confidence, early success) causes you to assume other unrelated positive traits in someone or something.
Hiring a candidate because they went to a prestigious university without rigorously evaluating the actual skills required.
Emotions are not the enemy of good decisions — they are an essential signal. But unregulated or unexamined emotional states systematically distort what we perceive as rational. Trauma, fear, ego, and bodily states all shape decisions in ways we rarely acknowledge in the moment.
Fear-Based Decisions (emotion)
Decisions driven primarily by avoiding a feared outcome rather than pursuing a valued one. Fear prioritizes the short-term and the visible.
Staying in a miserable job for years because the fear of failure during a career change feels overwhelming.
Ego & Pride (ego)
Protecting self-image overrides updating on evidence. Admitting a mistake, reversing a decision, or asking for help is experienced as a threat to identity.
A leader doubles down on a failed strategy publicly rather than reversing course, because reversing feels like admitting they were wrong.
Physiological State Effects (physiology)
Hunger, anger, fatigue, and loneliness measurably degrade judgment quality. The underlying physiology (sleep deprivation reducing prefrontal function, blood glucose effects on self-control) is well-documented. Note: Baumeister's broader "ego depletion" theory — that willpower is a depletable resource like a muscle — has faced significant replication failures since 2015 and remains contested; the specific mechanism is debated even if the general phenomenon is real. The HALT acronym (Hungry, Angry, Lonely, Tired) originates in addiction recovery frameworks but maps usefully onto decision science.
Sending an aggressive email at midnight after a stressful day that you would never have sent at 10am well-rested.
See: Harrison & Horne on sleep deprivation (2000); Hagger et al. pre-registered replication (2016) for the ego depletion debate
Trauma-Informed Reactivity (trauma)
Past traumatic experiences create automatic responses that are appropriate for the past situation but misapplied in the present.
Avoiding all confrontation in a new, healthy workplace because past environments made conflict dangerous.
Present Bias / Hyperbolic Discounting (impulse)
The present is dramatically overweighted relative to the future. Immediate rewards are systematically preferred over larger future rewards.
Eating the cake today despite committing yesterday to a health goal you still care about. Both desires are real; recency wins.
See: Thaler & Sunstein, "Nudge"
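One standard way to model this is quasi-hyperbolic (beta-delta) discounting, in the spirit of Laibson's formulation: every reward that is not available *now* gets an extra one-off penalty β on top of ordinary exponential discounting. The parameter values below are chosen for illustration, not empirical estimates.

```python
def present_value(reward, delay_days, beta=0.7, delta=0.99):
    """Quasi-hyperbolic (beta-delta) discounting: 'today' is special.

    delay_days == 0 pays full value; any delay at all incurs the extra
    present-bias penalty beta, plus ordinary per-day discounting delta.
    """
    if delay_days == 0:
        return reward
    return beta * (delta ** delay_days) * reward

# Offered today: 100 now vs 110 tomorrow -> the immediate reward wins.
print(present_value(100, 0) > present_value(110, 1))    # True

# Same choice shifted 30 days out: the larger, later reward now wins.
print(present_value(100, 30) < present_value(110, 31))  # True
```

The reversal between the two prints is the signature of present bias: the preference flips as the choice moves closer, even though nothing about the options has changed.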
Anxiety-Driven Avoidance (emotion)
Anxiety causes avoidance of the very situations needed to resolve the anxiety. Decisions are deferred or never made, compounding the problem.
Not opening financial statements because they cause anxiety, which allows financial problems to compound unchecked.
Dopamine-Driven Short-Termism (reward)
The brain's reward system prioritizes immediate dopamine hits over long-term wellbeing, creating predictable preference for novelty and immediacy.
Scrolling social media instead of working on a meaningful project, repeatedly, even when you genuinely prefer the project's outcome.
Emotional Reasoning (emotion)
Treating feelings as direct evidence about external reality. "I feel afraid, therefore this situation is dangerous."
Concluding a business idea is bad because it creates anxiety, when the anxiety comes from novelty rather than risk.
Often the individual is not the problem. The incentive structure, measurement system, or organizational design produces predictably bad decisions regardless of who is making them. Fix the person all you want — the system will win.
Incentive Misalignment (incentives)
What is rewarded diverges from what is actually valuable. People rationally pursue the incentive, not the intended outcome.
Sales teams maximizing commissions by selling inappropriate products to customers, because that's what the compensation structure rewards.
See: Charlie Munger on incentive-caused bias
Goodhart's Law (measurement)
"When a measure becomes a target, it ceases to be a good measure" (the popular wording is Marilyn Strathern's summary of Goodhart's original observation). Optimizing for the metric destroys the goal the metric was supposed to represent.
Teaching to standardized tests instead of teaching genuine understanding, because test scores became the official measure of educational quality.
Short-Term Incentive Horizon (time)
Quarterly earnings pressure, annual bonuses, election cycles — structural forces that reward short-term thinking at the expense of long-term value.
Cutting R&D budget to hit this quarter's profit target, while destroying the innovation pipeline for the next five years.
Institutional Inertia (systems)
Large systems resist change because decision-makers are insulated from the consequences of bad decisions. "We've always done it this way" becomes policy.
A hospital system continuing an inefficient process for 20 years because no one person has both the authority and incentive to change it.
Resource Scarcity Thinking (constraint)
Operating under scarcity (time, money, cognitive bandwidth) narrows attention and causes poor tradeoffs. Scarcity itself reduces decision quality.
Payday loan borrowing: financial scarcity reduces cognitive bandwidth, leading to decisions that worsen financial scarcity.
See: Mullainathan & Shafir, "Scarcity"
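The payday-loan example is, at bottom, simple arithmetic. The sketch below uses a fee structure often cited as typical in the US ($15 per $100 borrowed, repaid in 14 days); actual terms vary by lender and jurisdiction.

```python
# Illustrative annualization of a typical payday-loan fee structure:
# a $15 fee per $100 borrowed, repaid in 14 days.
fee, principal, term_days = 15, 100, 14

period_rate = fee / principal          # 15% per 14-day term
apr = period_rate * (365 / term_days)  # simple annualization, no compounding
print(f"{apr:.0%}")                    # roughly 391% APR
```

Under scarcity, the salient number is the $15 fee, not the triple-digit annualized rate, which is precisely the bandwidth effect the entry describes.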
Diffusion of Responsibility (systems)
When many people share responsibility, each assumes someone else is handling it. Group decision-making can paradoxically produce no real decision.
A critical product bug goes unfixed for months because every team assumed another team owned the resolution.
Humans are intensely social animals. What our group believes, what our culture teaches, and what our community rewards shapes our decisions at a level far deeper than conscious reasoning. We underestimate this constantly.
Groupthink (social)
The desire for group harmony overrides realistic appraisal of alternatives. Dissenting voices are silenced; the group converges on a shared illusion of consensus.
A board unanimously approves an acquisition no individual member privately believed was wise, because no one wanted to be the dissenter.
See: Irving Janis, "Victims of Groupthink" (1972)
Authority Obedience (social)
People defer to authority figures — even when those authorities are wrong, harmful, or acting outside their domain of competence.
A team implements a strategy they all know is flawed because the CEO is visibly committed to it and no one will say so.
See: Stanley Milgram, "Behavioral Study of Obedience," Journal of Abnormal and Social Psychology (1963)
Social Proof Bias (social)
Using what others are doing as the primary guide for what to do, especially under uncertainty. "If everyone is doing it, it must be right."
Joining a speculative investment frenzy because "everyone" seems to be making money, without independently evaluating the fundamentals.
See: Cialdini, "Influence"
Social Approval Motivation (social)
Making decisions that maximize social approval rather than personal values or objective outcomes. Identity management replaces judgment.
Choosing a career path parents approve of rather than one that aligns with genuine skills and interests.
Cultural Conditioning (culture)
Deep assumptions inherited from culture — about risk, gender, hierarchy, time, success — shape decisions invisibly and are rarely examined.
A culture that equates busyness with virtue systematically overworks its members and devalues rest, recovery, and deep thinking.
See: Jonathan Haidt, "The Righteous Mind"
Peer Comparison Trap (social)
Using a reference group's choices to set your own goals rather than independently determining what you actually value.
Buying a house you can't comfortably afford because peers at the same career stage are buying houses.
Family & Childhood Conditioning (culture)
Patterns, beliefs, and behavioral scripts formed in childhood persist as defaults in adult decision-making, often without conscious awareness.
Consistently avoiding conflict in professional settings because conflict was dangerous in your childhood environment.
The conditions under which a decision is made matter as much as the decision itself. Time pressure, stress, and cognitive load systematically shift processing from careful deliberation to reactive shortcuts.
Time Pressure Shortcuts (stress)
Urgency compresses deliberation time and forces reliance on heuristics, the first option considered, or the loudest voice in the room.
A hiring decision made in 48 hours due to "business urgency" that would have been made very differently with a normal timeline.
Decision Fatigue (depletion)
The quality of decisions may deteriorate over a sequence of choices, with later decisions defaulting toward inaction or the easier option. This is plausible and widely observed, though the specific psychological mechanism (ego depletion) remains debated after replication failures. The broader principle — that cognitive resources are finite and depleted by use — is well-supported in applied settings.
The widely-cited Danziger et al. (2011) parole study found judges granted parole far more often after breaks than before them. However, later re-analyses suggested scheduling patterns (simpler cases after breaks) may partially explain the effect — making it a more complex finding than originally reported.
See: Danziger et al. (2011); Weinshall-Margel & Shapard (2011) for methodological critique
Planning Fallacy (time)
Systematically underestimating how long tasks will take and how much they will cost, even when you have direct evidence from past projects.
The Sydney Opera House was originally estimated to be complete in 6 years (by 1963); it took 16, finishing in 1973 at roughly 14× its original budget.
See: Kahneman & Tversky, "Intuitive Prediction: Biases and Corrective Procedures" (1979), which introduced the term
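A standard corrective is reference-class forecasting, associated with Flyvbjerg's work on megaprojects: instead of trusting the inside-view estimate, scale it by how similar past projects actually turned out. The function and the track-record data below are a hypothetical sketch of that idea.

```python
import statistics

def outside_view(inside_estimate, past_actuals, past_estimates):
    """Reference-class forecast: scale the 'inside view' estimate by the
    median overrun ratio of comparable past projects. All data here is
    hypothetical and only illustrates the mechanics.
    """
    ratios = [actual / est for actual, est in zip(past_actuals, past_estimates)]
    return inside_estimate * statistics.median(ratios)

# Hypothetical track record: estimated vs actual durations, in months.
estimates = [6, 4, 12, 8]
actuals = [9, 7, 20, 12]
print(outside_view(10, actuals, estimates))  # about 15.8 months, not 10
```

The point of the outside view is that it sidesteps the optimistic inside story entirely: the correction comes from base rates, not from re-examining this project's plan.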
Urgency Bias (attention)
Urgent but less important tasks consistently crowd out important but non-urgent ones. The tyranny of the inbox over the strategic.
Spending all day responding to emails while the company's core strategy goes unexamined for another quarter.
See: Stephen Covey, "The 7 Habits of Highly Effective People" (1989), which popularized the urgent/important distinction; the underlying attention-allocation problem is documented in time-use and productivity research
Ignoring Long-Term Consequences (time)
Systematically discounting what happens beyond a certain time horizon. Distant negative consequences feel unreal while immediate benefits feel vivid.
Climate policy: short-term economic disruption feels real and present; long-term catastrophe feels abstract and deferrable.