Causal Reductionism Misses the Forest for the Trees
The modern habit of explaining complex phenomena through single causes (genes, incentives, trauma, class) produces confident explanations that are technically precise and fundamentally misleading.
"The main idea behind complex systems is that the ensemble behaves in ways not predicted by the components. The interactions matter more than the nature of the units. Studying individual ants will never give us an idea on how the ant colony operates." Nassim Nicholas Taleb
Causal reductionism is the intellectual habit of attributing a complex outcome to a single cause, or a small set of causes, when the real explanation lies in the interactions between many factors operating at different scales. It is the epistemic equivalent of looking at a single tree and declaring you understand the forest. The habit is deeply embedded in modern thought because it maps well onto scientific methodology (isolate a variable, measure its effect), but it fails catastrophically when applied to systems where the interactions between variables matter more than the variables themselves.
Taleb's distinction between complicated and complex systems is essential here. A car engine is complicated: it has many parts, but their interactions are predictable. Remove the alternator and you know what will happen. A society is complex: its parts interact in non-linear ways, small inputs can produce outsized outputs, and the behavior of the whole cannot be predicted from the behavior of the parts. Yet our dominant modes of explanation in social science, journalism, and policy analysis treat complex systems as if they were merely complicated. We look for "the cause" of poverty, radicalization, institutional failure, or cultural change, when these phenomena emerge from webs of interaction that resist single-variable explanation.
This connects to Scott's critique of high-modernist planning and Postman's analysis of Technopoly: both describe the same underlying error. The state that replaces diverse forests with monocultures, the culture that replaces wisdom with data, and the analyst who replaces systemic understanding with a single causal arrow are all making the same mistake. They are substituting legibility for accuracy, trading genuine (but messy) understanding for the false clarity of a clean causal story.
Takeaway: When someone offers you a confident single-cause explanation for a complex social phenomenon, you are almost certainly looking at an oversimplification that reveals more about the explainer's framework than about reality.
See also: Legibility Kills What It Tries to Measure | Epistemic Legibility: Not Everything Can Be Made Explicit | The Most Intolerant Minority Wins | The Narrative Fallacy Turns Correlation Into Causation