Glossary
A diagnostic vocabulary for the interpretive choices that precede data analysis.
These patterns draw on cognitive science, the philosophy of measurement, and organizational behavior — but the terms are ours, developed for the specific context of how people interpret information before they reach conclusions.
Aggregation Trap
Framing: When combining data points erases the very differences that matter most. Averages that hide bimodal distributions, totals that mask regional variation. The summary becomes a disguise.
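A minimal sketch of the trap, using hypothetical satisfaction scores (the numbers are invented for illustration):

```python
# Hypothetical survey scores: half the respondents love the product, half hate it.
scores = [1, 1, 2, 2, 9, 9, 10, 10]

mean = sum(scores) / len(scores)
print(mean)  # 5.5, a score no actual respondent gave
```

The average lands in the empty middle of a bimodal distribution, describing nobody.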
Anchoring Drift
Framing: How an initial data point or estimate silently sets the range for all subsequent analysis. The first number you see becomes the gravitational center — everything after is adjustment, not independent thought.
Attribution Games
Judgment: The pattern of deciding causation based on narrative convenience rather than evidence. How we assign credit or blame reveals more about our assumptions than about what actually happened.
Baseline Shopping
Judgment: Choosing your starting point to guarantee the conclusion you want. Every trend line begins somewhere — and that somewhere is rarely an accident.
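The effect fits in a few lines. The revenue figures below are invented for illustration:

```python
# The same hypothetical quarterly series, read against two different baselines.
revenue = [120, 95, 100, 105, 110]

vs_peak = (revenue[-1] - revenue[0]) / revenue[0]    # baseline: the Q1 peak
vs_trough = (revenue[-1] - revenue[1]) / revenue[1]  # baseline: the Q2 trough

print(f"{vs_peak:+.0%}")    # -8%: "revenue is declining"
print(f"{vs_trough:+.0%}")  # +16%: "revenue is growing"
```

Same data, opposite stories; the only analytical decision that changed was where the trend line starts.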
Certainty Debt
Humility: The accumulated organizational cost of presenting uncertain findings as settled facts. Each confident assertion that skips the caveats borrows against future credibility.
Comparison Bias
Judgment: Every "compared to what?" shapes the narrative. The baseline you choose determines whether something looks like growth or decline, success or failure. The comparison is the argument.
Confidence Inflation
Humility: How certainty compounds through organizational layers. An analyst's tentative finding becomes a manager's solid insight becomes an executive's known fact. Each retelling strips uncertainty.
Confirmation Lock
Humility: When no possible evidence could change your conclusion. A state where the analytical framework has hardened into ideology — the question has been replaced by its answer.
Context Stripping
Framing: Removing the conditions under which data was collected, making findings appear universal when they were situational. The number travels; the footnotes don't.
Correlation Theater
Signal: Compelling patterns that don't survive basic skepticism. Two variables moving together presented as a meaningful relationship, when coincidence or a hidden third variable explains everything.
Dashboard Insulation
Signal: Metrics chosen to protect rather than inform. When the purpose of a dashboard shifts from revealing truth to confirming the decisions already made.
Decorative Analytics
Signal: Analysis that exists to appear rigorous rather than to inform decisions. Charts in presentations that nobody references, reports that are generated but never read. The opposite of decision-forcing analytics.
Denominator Neglect
Signal: Reacting to the numerator while ignoring the base. A 50% increase sounds dramatic until you realize it went from 2 to 3. The denominator is where the story actually lives.
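The 2-to-3 example in numbers (the counts and user base are hypothetical):

```python
incidents_before, incidents_after = 2, 3
total_users = 100_000  # hypothetical base

relative_change = (incidents_after - incidents_before) / incidents_before
rate_after = incidents_after / total_users

print(f"{relative_change:.0%} increase")       # 50% increase
print(f"{rate_after:.5f} incidents per user")  # 0.00003 incidents per user
```

The headline number and the base rate describe the same event; only one of them belongs in a decision.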
Ethical Myopia
Judgment: Optimizing a model or metric without examining who it harms. When the people affected by a measurement have no voice in how it was designed or deployed.
Evidence Sequencing
Framing: How the order in which you encounter information shapes the conclusion you reach. The same facts, rearranged, tell a different story. Sequence is a silent argument.
Explainability Gap
Humility: The distance between what a model does and what its users understand about why. When complexity is confused with competence, and opacity is mistaken for sophistication.
False Dichotomy
Framing: A vs. B framing when the real question is "should we be asking this at all?" Reducing complex situations to binary choices that obscure the actual decision space.
False Precision Syndrome
Humility: Reporting metrics to three decimal places when you're uncertain about the order of magnitude. Precision theater that creates confidence where none is warranted.
Frame Dependence
Framing: The phenomenon where the same underlying data supports different conclusions depending on how the question was originally posed. The frame isn't neutral — it's the first analytical decision.
Garbage In, Gospel Out
Signal: When flawed input data is laundered through sophisticated models until the output is treated as truth. The complexity of the method becomes a credibility substitute for the quality of the inputs.
Interpretive Labor
Judgment: The invisible work of turning raw data into meaning. Someone chose the axis, set the scale, picked the color. That labor is always present and never neutral.
Legibility Bias
Signal: Favoring data that is easy to capture and display over data that would actually inform the decision. The preference for clean numbers over messy but important truths.
Metric Advocacy
Judgment: Asking who benefits from a framing reveals who chose the metrics. Every metric selection is an act of emphasis — highlighting some truths while rendering others invisible.
Null Result Disappearance
Signal: The systematic invisibility of findings that show no effect. Studies that find nothing don't get published, analyses that confirm the status quo don't get presented. Absence of evidence quietly becomes evidence of absence.
Objectivity Illusion
Judgment: The belief that your dashboards, analyses, or reports are neutral. Every visualization is an argument. Every aggregation is a choice. "Letting the data speak" is impossible — someone always holds the microphone.
Outsourced Judgment
Judgment: Deferring decisions to models or algorithms to avoid accountability. The algorithm didn't decide — someone decided to let the algorithm decide, and that's the judgment that matters.
Precision Theater
Humility: Organizations performing certainty they don't have. Forecasts with false exactness, confidence intervals nobody believes, projections treated as plans. The cost is not the wrong number — it's the inability to say "we don't know."
Proxy Collapse
Framing: When a measurement becomes a target and stops being a useful measurement. The metric that once represented something real becomes the thing people optimize for directly, severing its connection to the original intent.
Question Debt
Framing: The accumulated cost of skipping clarification before building dashboards, models, or analyses. Like technical debt, but in the framing layer — you move fast now and pay later when the wrong question produces a confident wrong answer.
Recency Weighting
Framing: Treating the latest data point as the most representative, regardless of whether it is. Yesterday's number drowns out last year's pattern. The new replaces the true.
Scope Creep (Analytical)
Framing: When an analysis designed to answer one question quietly expands to imply answers to many. The original frame stretches until it covers territory it was never built to map.
Stated vs. Measured Gap
Framing: When "customer satisfaction" becomes "NPS score" without examining whether they measure the same thing. The distance between the concept someone wants to understand and the proxy that actually gets tracked.
Streetlight Effect
Signal: Measuring what's easy rather than what matters. Named after the joke about searching for lost keys under the streetlight — not because that's where they fell, but because the light is better there.
Survivorship Lens
Signal: Drawing conclusions only from what made it through a filter — the companies that survived, the patients who recovered, the strategies that worked — while the failures that would reshape the analysis remain invisible.
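A toy simulation of the filter at work (the distribution and survival cutoff are arbitrary choices for illustration):

```python
import random

random.seed(0)
# Hypothetical outcomes for 10,000 ventures, centered below the "survival" bar.
outcomes = [random.gauss(0.8, 1.0) for _ in range(10_000)]

survivors = [x for x in outcomes if x > 1.0]  # only these get studied

overall_mean = sum(outcomes) / len(outcomes)
survivor_mean = sum(survivors) / len(survivors)
# survivor_mean sits well above overall_mean: the filter, not the strategy,
# explains much of the apparent success.
```

Any analysis run only on `survivors` inherits the cutoff as an invisible assumption.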
Threshold Arbitrage
Judgment: Gaming the cutoff points that determine success or failure. When people optimize to cross the threshold rather than improve the underlying thing being measured.
Uncertainty Laundering
Humility: The process by which doubt is systematically removed as data moves through an organization. A range becomes a point estimate. A point estimate becomes a table. A table becomes a citation. Each step strips the uncertainty that was the most honest part of the original finding.
Unknowable vs. Unknown
Humility: The critical distinction between what you haven't looked for (fixable with effort) and what you fundamentally cannot know (requiring a different approach entirely). Confusing the two leads to either false confidence or premature surrender.
Vanity Metrics Theater
Signal: Tracking things that feel important but inform nothing. Metrics that go up and to the right without connecting to any decision anyone would actually make differently.
Witness Effect
Judgment: Data changes when people know it's being collected. Engagement metrics spike during reviews. Activity logs look different when leadership is watching. The measurement and the thing being measured are not as separate as the analysis assumes.