The Hidden Connection Between Information and Consciousness

Two concepts haunt modern science like unsolved riddles. Information—the currency of the digital age, the stuff of genes and neurons, perhaps the fabric of reality itself. And consciousness—the inner light of experience, the fact that there's something it's like to be you reading these words.

For decades, these mysteries have been treated as separate problems, but that’s beginning to change. The connection becomes clearer once we ask what information actually is—and recognize that it comes in two distinct but intimately connected forms.

The first definition: Information as order, measured as how far a system's statistical state sits from the distribution that characterizes thermodynamic equilibrium. This is information in something.

The ordered state is a low-entropy state, and entropy measures the system's proximity to its most probable (equilibrium) state: the lower the entropy, the farther the system sits from equilibrium.
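To make this concrete, here is a minimal sketch in Python. The helper names and the four-state toy distributions are invented for illustration; the uniform distribution stands in for equilibrium, and the Kullback-Leibler divergence from it quantifies how much "information in" a distribution carries.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits of a discrete distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # 0 * log(0) -> 0 by convention
    return -np.sum(p * np.log2(p))

def distance_from_equilibrium(p):
    """KL divergence D(p || uniform): how far p sits from the
    maximum-entropy (equilibrium) distribution over the same states."""
    n = len(p)
    return np.log2(n) - shannon_entropy(p)   # equals D_KL(p || uniform)

ordered    = [0.85, 0.05, 0.05, 0.05]   # one state dominates: low entropy
disordered = [0.25, 0.25, 0.25, 0.25]   # equilibrium: maximum entropy

print(distance_from_equilibrium(ordered))     # ~1.15 bits of "information in"
print(distance_from_equilibrium(disordered))  # 0.0 -> no internal order
```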

Therefore, a system is "far from equilibrium" if its components are statistically correlated, because correlation among components is a form of order: it pushes the joint entropy below what fully independent parts would have. When parts are correlated rather than independent, you have structure. The system occupies a state that's improbable relative to chance. You can predict something about one part by knowing about another.

The opposite—maximum entropy—is defined by complete statistical independence among components. This is the “molecular chaos” assumption underlying Boltzmann’s H-theorem: at equilibrium, each particle’s state is statistically independent of every other’s. No pattern, no structure, no organization. Just randomness.

So, information IN something is internal statistical correlation—the degree to which a system’s components hang together rather than behave independently.
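As a toy illustration (not a measurement of any real physical system), the snippet below estimates the "total correlation" of two binary components: the sum of the parts' entropies minus the entropy of the whole. The function names and the 5% noise level are arbitrary choices; the point is that the measure is zero for independent parts, the molecular-chaos case, and positive when the parts hang together.

```python
import numpy as np
from collections import Counter

def entropy_bits(samples):
    """Empirical Shannon entropy (bits) of a sequence of hashable states."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * np.log2(c / n) for c in counts.values())

def total_correlation(columns):
    """Multi-information: sum of marginal entropies minus joint entropy.
    Zero exactly when the components are statistically independent."""
    joint = list(zip(*columns))
    return sum(entropy_bits(col) for col in columns) - entropy_bits(joint)

rng = np.random.default_rng(0)
a = rng.integers(0, 2, 10_000)

independent = [a, rng.integers(0, 2, 10_000)]        # "molecular chaos"
correlated  = [a, a ^ (rng.random(10_000) < 0.05)]   # b mostly copies a

print(total_correlation(independent))  # ~0.0 bits: no internal structure
print(total_correlation(correlated))   # ~0.7 bits: the parts hang together
```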

We have a formal measure for this: integrated information, Φ (phi), developed by the neuroscientist Giulio Tononi and colleagues. Φ quantifies how much a system is “more than the sum of its parts”—the information generated by the whole that can’t be reduced to its parts. A system with high Φ has deeply integrated components. Knowing about one part tells you about other parts. Φ measures the information IN something: the internal order, the departure from molecular chaos.
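Computing Φ as integrated information theory defines it is notoriously involved, so the sketch below uses only a crude stand-in: how much the whole system's past predicts its whole future beyond what each part predicts about its own future. The two-bit dynamics and the phi_proxy function are invented for illustration; they capture the whole-versus-parts spirit, not Tononi's formalism.

```python
import numpy as np
from collections import Counter

def entropy_bits(samples):
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * np.log2(c / n) for c in counts.values())

def mutual_info(xs, ys):
    """I(X;Y) in bits, estimated from paired samples."""
    return entropy_bits(xs) + entropy_bits(ys) - entropy_bits(list(zip(xs, ys)))

def phi_proxy(state_t, state_t1):
    """Toy 'whole minus sum of parts' integration: prediction the whole
    makes about its own future that the isolated parts cannot make."""
    whole = mutual_info(list(zip(*state_t)), list(zip(*state_t1)))
    parts = sum(mutual_info(p, f) for p, f in zip(state_t, state_t1))
    return whole - parts

rng = np.random.default_rng(1)
x1 = rng.integers(0, 2, 20_000)
x2 = rng.integers(0, 2, 20_000)

# Coupled update: each unit's future depends on the other unit as well.
coupled_next = [x2, x1 ^ x2]
# Independent update: each unit simply copies itself.
independent_next = [x1, x2]

print(phi_proxy([x1, x2], coupled_next))      # ~2 bits: integrated
print(phi_proxy([x1, x2], independent_next))  # ~0 bits: reducible to its parts
```

For the coupled pair, the whole system's past pins down its whole future even though neither part's past says anything about that part's own future; that gap is what the proxy reports.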

The second definition: Information as predictive data—information as knowledge. This is information about something.

Here the information encoded in a system has utility—a functional role in keeping the system far from equilibrium. The internal configuration is in some way isomorphic to relevant structure in the environment. It’s a model.

This is also about statistical correlation, but a different kind. Not correlation among a system’s internal components, but correlation between the system and something external—correlation with the environment. For example, a dolphin’s form is correlated with hydrodynamics, and an eagle’s wing is correlated with aerodynamics. The internal configuration mirrors relevant environmental structure. This correlation is information the system uses to maintain its ordered state.

We have a formal measure for this too: semantic information, developed by David Wolpert and Artemy Kolchinsky of the Santa Fe Institute, with a closely related proposal from the theoretical physicist Carlo Rovelli. Semantic information is the mutual information between a system and its environment that is causally necessary for the system's continued existence. Concretely: knock out a gene that encodes such environmental knowledge, and you get a dysfunctional organism, one that can no longer maintain its far-from-equilibrium state.

Semantic information measures the information ABOUT something: external correlation that serves survival.
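In the spirit of the Kolchinsky-Wolpert definition (though far simpler than their formalism), the toy below compares an agent whose internal state tracks its environment with a counterfactual twin whose sensor has been scrambled. The run_agent function, the payoffs, and the energy numbers are invented for illustration; the point is that destroying the system-environment correlation destroys viability, and that is what marks the correlation as semantic.

```python
import numpy as np

def run_agent(scramble_sensor, steps=5_000, seed=2):
    """Toy agent: it gains energy when its internal state matches the
    environment and pays a small maintenance cost either way.  Scrambling
    the sensor destroys the system-environment mutual information while
    leaving everything else unchanged (a counterfactual intervention)."""
    rng = np.random.default_rng(seed)
    energy = 100.0
    env = rng.integers(0, 4)
    for _ in range(steps):
        env = (env + rng.integers(0, 2)) % 4         # slowly drifting environment
        reading = rng.integers(0, 4) if scramble_sensor else env
        internal = reading                           # the agent's model is its reading
        energy += 1.0 if internal == env else 0.0    # payoff for being correlated
        energy -= 0.6                                # metabolic cost every step
        if energy <= 0:
            return 0.0                               # the far-from-equilibrium state collapses
    return energy

print(run_agent(scramble_sensor=False))  # stays viable: the correlation does causal work
print(run_agent(scramble_sensor=True))   # collapses: the scrambled correlation was semantic
```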

These two kinds of information are genuinely distinct. You could in principle have a highly integrated system (high Φ) that isn’t correlated with anything external—just an arbitrary pattern of internal dependencies. And you could have a system with environmental correlation but low integration—parts that track the environment independently without communicating with each other.

But here’s the key insight: in any stable system that persists over time, these two are causally coupled. Internal order (high Φ) requires anticipating and counteracting perturbations from the environment. To maintain the information IN, you need information ABOUT.

A system with high Φ but no semantic information is possible but unstable. The systems that persist—that maintain high Φ over time in noisy environments—are precisely those whose internal integration serves predictive purposes. The integration isn’t arbitrary; it’s about something. The internal correlations encode external regularities.
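A last toy sketch makes the coupling visible (again, the parameters and function name are purely illustrative): an agent tries to hold an ordered internal pattern while the environment keeps perturbing it. The agent that knows which bit was disturbed, information ABOUT, keeps its order, information IN; the agent that repairs blindly drifts toward randomness.

```python
import numpy as np

def run(sighted, n_bits=6, steps=20_000, seed=3):
    """Each step the environment flips one random bit of the internal state.
    A 'sighted' agent knows which bit was hit and flips it back; a 'blind'
    agent flips a randomly chosen bit instead.  We track the average number
    of bits out of place, i.e. how much internal order survives."""
    rng = np.random.default_rng(seed)
    state = np.zeros(n_bits, dtype=int)            # the ordered, low-entropy pattern
    disorder = 0
    for _ in range(steps):
        hit = rng.integers(n_bits)
        state[hit] ^= 1                            # environmental perturbation
        repair = hit if sighted else rng.integers(n_bits)
        state[repair] ^= 1                         # the agent's corrective action
        disorder += state.sum()                    # bits out of place after the repair
    return disorder / steps

print(run(sighted=True))   # ~0.0: information ABOUT the perturbation preserves the order
print(run(sighted=False))  # ~3.0: without it, the state drifts as far off as random noise
```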

Sustainable Φ requires semantic information.
