The interpretation layer

Ethos

The intellectual foundations behind this work -- and the specific layer of reasoning it examines.

In the 1970s, psychologists Daniel Kahneman and Amos Tversky -- pioneers in the study of systematic cognitive bias and judgment under risk -- ran a series of experiments that changed how scientists understood human judgment.

Their findings, summarized in the landmark 1974 paper "Judgment Under Uncertainty: Heuristics and Biases," revealed that the mind doesn't reason carefully from evidence to conclusion.

"This article shows that people rely on a limited number of heuristic principles which reduce the complex tasks of assessing probabilities and predicting values to simpler judgmental operations. In general, these heuristics are quite useful, but sometimes they lead to severe and systematic errors."

Tversky & Kahneman, 1974

Organizational theorist Karl Weick observed something related about how people make sense of complex situations. In Sensemaking in Organizations (1995), he argued that meaning is mostly retrospective: people understand what they think by observing what they've already done, which means conclusions frequently arrive before the reasoning that appears to support them.

"In matters of sensemaking, believing is seeing. To believe is to notice selectively."

Weick, 1995, p. 133

To hold a belief is to notice what confirms it and to discount what doesn't. The people who get things wrong with information are rarely bad at reasoning. More often, understanding arrives first -- as a feeling, a suspicion, an existing belief -- and information then gets gathered around it, not to test that understanding but to support it.

People make invisible choices about which question to ask, what to compare, what counts as evidence -- and those choices do most of the work. The conclusion was already forming before any evidence was examined.

Later, sociologist Joel Best spent years tracing how statistics move through culture in Damned Lies and Statistics (2001) -- how a number, repeated enough times, becomes a fact regardless of what it originally measured or whether it still does.

Best's concern was not with deliberate deception but with something more ordinary: the process by which an estimate loses its origins.

"The real trouble begins when people begin treating the guess as a fact, repeating the figure, forgetting how it came into being, embellishing it, developing an emotional stake in its promotion and survival."

Best, 2001, p. 38

What's less examined is what all of that looks like in practice -- in the specific moment when a question gets framed, a comparison gets chosen, a metric gets selected, or certainty gets performed. Not as a cognitive failure, but as a pattern. One that repeats across contexts, situations, and types of information.

In Plain Byte examines that layer.

The intellectual traditions are established. The specific language for what happens in those moments is developed here.

It examines what reasoning looks like when the interpretation layer is brought into view -- what was chosen, what was excluded, what certainty was claimed beyond what the information actually supported.

The goal isn't to make readers doubt everything. It's to make the invisible choices visible.

What happens after that is up to the reader.

References

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124-1131. https://doi.org/10.1126/science.185.4157.1124

Weick, K. E. (1995). Sensemaking in organizations. Sage Publications. ISBN: 978-0803971776

Best, J. (2001). Damned lies and statistics: Untangling numbers from the media, politicians, and activists. University of California Press. ISBN: 978-0520219786