Causal Opacity

Causal Opacity is the idea that there is a barrier to understanding a system once it passes some threshold of complexity. Because of causal opacity, you can neither make accurate predictions nor even really figure out how the system got into its current state. Some other ways of saying it:
  • You cannot explain why things are the way they are or how they will change
  • As complexity increases, predictability decreases (this is my favorite formulation of the concept)
Because of the shroud of causal opacity:
  • The playing field is leveled between the intelligent and the unintelligent, because no matter how much intelligence you have, you still cannot make accurate predictions.
  • Plans are unlikely to work (but planning, especially nonpredictive planning, is still useful).
  • Figuring out how history will unfold is impossible.
  • Figuring out how history unfolded is just as impossible.

Obviousness of the Observation

I could write a bunch about what makes systems complex or simple, about emergence, or about the number of edges in a graph, but fundamentally predictions are quite the crapshoot and everyone knows it. Nobody can pick stocks, figure out who will win the next election, or know which conflicts will turn into wars, and on and on. Even at a smaller scale, like whether a company will succeed or a relationship will break up or flourish, nobody really knows.
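
To put a toy number on the edges-in-a-graph point: assume, purely for illustration, that a system is a complete graph where any of n components can interact with any other. The count of possible pairwise interactions then grows quadratically, and the space of distinct interaction patterns grows exponentially on top of that. Here is a minimal Python sketch (the complete-graph model and the helper name are my own assumptions, not a formal claim):

    # Toy model (an assumption for illustration): a system as a complete
    # graph, where any of n components can interact with any other.
    def pairwise_interactions(n: int) -> int:
        # Edges in a complete graph on n nodes: n * (n - 1) / 2
        return n * (n - 1) // 2

    for n in (5, 10, 50, 100):
        edges = pairwise_interactions(n)
        # Every subset of edges is a distinct interaction pattern, so the
        # pattern space blows up as 2**edges, far faster than n itself.
        print(f"{n:>3} components -> {edges:>4} interactions -> 2**{edges} patterns")

At 100 components you are already looking at 4,950 possible interactions and 2^4950 possible patterns, which is the quantitative version of the shroud.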