Glossary

A diagnostic vocabulary for the interpretive choices that precede data analysis.

These patterns draw on cognitive science, the philosophy of measurement, and organizational behavior — but the terms are ours, developed for the specific context of how people interpret information before they reach conclusions.

A

Aggregation Trap

Framing

When combining data points erases the very differences that matter most. Averages that hide bimodal distributions, totals that mask regional variation. The summary becomes a disguise.
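A toy Python sketch (invented scores, not from the glossary) of how a single average can erase a bimodal split:

```python
# Hypothetical satisfaction scores (out of 10) for two segments.
group_a = [1, 1, 2, 2]    # an unhappy segment
group_b = [9, 9, 10, 10]  # a delighted segment

combined = group_a + group_b
mean = sum(combined) / len(combined)
print(mean)  # 5.5 — a "moderately satisfied" score that nobody actually gave
```

The mean of 5.5 describes no real customer; the distribution, not the summary, carries the story.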

Anchoring Drift

Framing

How an initial data point or estimate silently sets the range for all subsequent analysis. The first number you see becomes the gravitational center — everything after is adjustment, not independent thought.

Attribution Games

Judgment

The pattern of deciding causation based on narrative convenience rather than evidence. How we assign credit or blame reveals more about our assumptions than about what actually happened.

B

Baseline Shopping

Judgment

Choosing your starting point to guarantee the conclusion you want. Every trend line begins somewhere — and that somewhere is rarely an accident.

C

Certainty Debt

Humility

The accumulated organizational cost of presenting uncertain findings as settled facts. Each confident assertion that skips the caveats borrows against future credibility.

Comparison Bias

Judgment

Every "compared to what?" shapes the narrative. The baseline you choose determines whether something looks like growth or decline, success or failure. The comparison is the argument.

Confidence Inflation

Humility

How certainty compounds through organizational layers. An analyst's tentative finding becomes a manager's solid insight becomes an executive's known fact. Each retelling strips uncertainty.

Confirmation Lock

Humility

When no possible evidence could change your conclusion. A state where the analytical framework has hardened into ideology — the question has been replaced by its answer.

Context Stripping

Framing

Removing the conditions under which data was collected, making findings appear universal when they were situational. The number travels; the footnotes don't.

Correlation Theater

Signal

Compelling patterns that don't survive basic skepticism. Two variables moving together presented as a meaningful relationship, when coincidence or a hidden third variable explains everything.

D

Dashboard Insulation

Signal

Metrics chosen to protect rather than inform. When the purpose of a dashboard shifts from revealing truth to confirming the decisions already made.

Decorative Analytics

Signal

Analysis that exists to appear rigorous rather than to inform decisions. Charts in presentations that nobody references, reports that are generated but never read. The opposite of decision-forcing analytics.

Denominator Neglect

Signal

Reacting to the numerator while ignoring the base. A 50% increase sounds dramatic until you realize it went from 2 to 3. The denominator is where the story actually lives.
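The 2-to-3 example from the entry, as a minimal sketch showing how relative and absolute change tell different stories:

```python
# Hypothetical incident counts for two quarters.
before, after = 2, 3

relative_change = (after - before) / before   # 0.5 → "incidents up 50%!"
absolute_change = after - before              # 1  → one extra incident

print(f"{relative_change:.0%} increase, {absolute_change} more incident(s)")
```

The headline percentage is technically true; only the denominator reveals how little happened.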

E

Ethical Myopia

Judgment

Optimizing a model or metric without examining who it harms. When the people affected by a measurement have no voice in how it was designed or deployed.

Evidence Sequencing

Framing

How the order in which you encounter information shapes the conclusion you reach. The same facts, rearranged, tell a different story. Sequence is a silent argument.

Explainability Gap

Humility

The distance between what a model does and what its users understand about why. When complexity is confused with competence, and opacity is mistaken for sophistication.

F

False Dichotomy

Framing

An "A vs. B" framing offered when the real question is "should we be asking this at all?" Reducing complex situations to binary choices that obscure the actual decision space.

False Precision Syndrome

Humility

Reporting metrics to three decimal places when you're uncertain about the order of magnitude. Precision theater that creates confidence where none is warranted.
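A small sketch (invented forecast, assuming the honest uncertainty spans roughly 100k–150k) of how formatting alone manufactures confidence:

```python
# A hypothetical forecast whose honest uncertainty spans roughly 100k-150k.
point_estimate = 127_483.291

precise = f"{point_estimate:,.3f}"   # "127,483.291" — three decimals of theater
honest = "roughly 100k-150k"         # wide, but matches what is actually known
print(precise, "vs", honest)
```

The three decimal places cost nothing to produce and imply a measurement resolution the underlying data never had.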

Frame Dependence

Framing

The phenomenon where the same underlying data supports different conclusions depending on how the question was originally posed. The frame isn't neutral — it's the first analytical decision.

G

Garbage In, Gospel Out

Signal

When flawed input data is laundered through sophisticated models until the output is treated as truth. The complexity of the method becomes a credibility substitute for the quality of the inputs.

I

Interpretive Labor

Judgment

The invisible work of turning raw data into meaning. Someone chose the axis, set the scale, picked the color. That labor is always present and never neutral.

L

Legibility Bias

Signal

Favoring data that is easy to capture and display over data that would actually inform the decision. The preference for clean numbers over messy but important truths.

M

Metric Advocacy

Judgment

Asking who benefits from a framing reveals who chose the metrics. Every metric selection is an act of emphasis — highlighting some truths while rendering others invisible.

N

Null Result Disappearance

Signal

The systematic invisibility of findings that show no effect. Studies that find nothing don't get published, analyses that confirm the status quo don't get presented. Absence of evidence quietly becomes evidence of absence.

O

Objectivity Illusion

Judgment

The belief that your dashboards, analyses, or reports are neutral. Every visualization is an argument. Every aggregation is a choice. "Letting the data speak" is impossible — someone always holds the microphone.

Outsourced Judgment

Judgment

Deferring decisions to models or algorithms to avoid accountability. The algorithm didn't decide — someone decided to let the algorithm decide, and that's the judgment that matters.

P

Precision Theater

Humility

Organizations performing certainty they don't have. Forecasts with false exactness, confidence intervals nobody believes, projections treated as plans. The cost is not the wrong number — it's the inability to say "we don't know."

Proxy Collapse

Framing

When a measurement becomes a target and stops being a useful measurement. The metric that once represented something real becomes the thing people optimize for directly, severing its connection to the original intent.

Q

Question Debt

Framing

The accumulated cost of skipping clarification before building dashboards, models, or analyses. Like technical debt, but in the framing layer — you move fast now and pay later when the wrong question produces a confident wrong answer.

R

Recency Weighting

Framing

Treating the latest data point as the most representative, regardless of whether it is. Yesterday's number drowns out last year's pattern. The new replaces the true.

S

Scope Creep (Analytical)

Framing

When an analysis designed to answer one question quietly expands to imply answers to many. The original frame stretches until it covers territory it was never built to map.

Stated vs. Measured Gap

Framing

When "customer satisfaction" becomes "NPS score" without examining whether they measure the same thing. The distance between the concept someone wants to understand and the proxy that actually gets tracked.

Streetlight Effect

Signal

Measuring what's easy rather than what matters. Named after the joke about searching for lost keys under the streetlight — not because that's where they fell, but because the light is better there.

Survivorship Lens

Signal

Drawing conclusions only from what made it through a filter — the companies that survived, the patients who recovered, the strategies that worked — while the failures that would reshape the analysis remain invisible.
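A toy sketch (hypothetical fund returns) of how averaging only over survivors flips the conclusion:

```python
# Hypothetical fund returns; only "surviving" funds appear in today's database.
funds = [
    ("Fund A",  0.08, True),   # survived
    ("Fund B",  0.12, True),   # survived
    ("Fund C", -0.40, False),  # closed — invisible in a survivors-only view
]

survivors = [r for _, r, alive in funds if alive]
survivor_avg = sum(survivors) / len(survivors)
true_avg = sum(r for _, r, _ in funds) / len(funds)
print(f"survivors: {survivor_avg:.1%}, everyone: {true_avg:.1%}")
```

The survivors-only average looks like a winning strategy; including the fund that failed turns it negative.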

T

Threshold Arbitrage

Judgment

Gaming the cutoff points that determine success or failure. When people optimize to cross the threshold rather than improve the underlying thing being measured.

U

Uncertainty Laundering

Humility

The process by which doubt is systematically removed as data moves through an organization. A range becomes a point estimate. A point estimate becomes a table. A table becomes a citation. Each step strips the uncertainty that was the most honest part of the original finding.
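The hops described above can be sketched with an invented finding, showing the information lost at each step:

```python
# A hypothetical finding losing its uncertainty at each organizational hop.
analyst = {"estimate": 0.12, "ci_low": 0.02, "ci_high": 0.30}  # the honest version
manager = analyst["estimate"]     # the interval is dropped: 0.12
executive = f"{manager:.0%}"      # a headline number: "12%"
print(analyst, "->", manager, "->", executive)
```

Nothing in the final "12%" records that the original interval ran from 2% to 30% — the range was the most honest part, and it is the first thing stripped.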

Unknowable vs. Unknown

Humility

The critical distinction between what you haven't looked for (fixable with effort) and what you fundamentally cannot know (requiring a different approach entirely). Confusing the two leads to either false confidence or premature surrender.

V

Vanity Metrics Theater

Signal

Tracking things that feel important but inform nothing. Metrics that go up and to the right without connecting to any decision anyone would actually make differently.

W

Witness Effect

Judgment

Data changes when people know it's being collected. Engagement metrics spike during reviews. Activity logs look different when leadership is watching. The measurement and the thing being measured are not as separate as the analysis assumes.

Go deeper

See these patterns in context

Each concept is explored through real examples in our essays.