Your Brain Is More Conscious When It's More Unpredictable
A Perfectly Ordered Brain Is an Unconscious Brain
Here's a question that sounds like philosophy but turns out to be measurable physics: what does it look like when consciousness disappears?
Not metaphorically. Not poetically. What literally happens to the electrical signals in your brain when you go from awake and aware to fully unconscious under anesthesia?
Anesthesiologists have been watching this happen on monitors for decades. And the pattern is always the same. As a patient slips from wakefulness into unconsciousness, the chaotic, complex, seemingly random fluctuations in their brain's electrical activity gradually flatten out. The signals become simpler. More repetitive. More predictable. The brain starts to look, from an information theory perspective, like a radio stuck on one station instead of scanning across hundreds.
The technical word for that complexity, that unpredictability, is entropy. And the discovery that entropy tracks consciousness so reliably that you can literally build a clinical monitor around it might be one of the most underappreciated findings in modern neuroscience.
It raises a deeply strange possibility. What if consciousness isn't about which brain regions are active, or which chemicals are flowing, or which networks are connected? What if the thing that makes you you, the felt experience of being aware, is fundamentally about how complex and unpredictable your brain's activity is from one moment to the next?
Entropy: Not Disorder, but Richness
Before we go further, we need to untangle what entropy actually means. Because the common understanding of entropy, that it's "disorder" or "chaos," is misleading in a way that matters here.
In information theory, entropy measures uncertainty. Specifically, it measures how much information is contained in a signal. A signal with high entropy is one where knowing the current state tells you very little about the next state. It's unpredictable, varied, and rich with possibilities. A signal with low entropy is one where the pattern repeats, where the next state is easy to guess from the current one.
Think about it this way. Imagine two pieces of music. The first is a metronome: click, click, click, click. After hearing two clicks, you can predict the entire rest of the piece. That's low entropy. The second is a jazz improvisation by Thelonious Monk: unexpected intervals, shifting rhythms, surprising harmonic choices. You genuinely cannot predict what note comes next. That's high entropy.
Now here's the key. The jazz isn't random noise. Random noise has high entropy too, but it carries no meaning. Monk's improvisation has high entropy and structure. The notes relate to each other in ways that create coherent musical phrases. It's complex without being chaotic.
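You can make the metronome-versus-jazz contrast concrete in a few lines of Python. This is a deliberately minimal sketch: it measures only the Shannon entropy of the note distribution and ignores sequence structure, but it shows why repetition drives entropy toward zero.

```python
from collections import Counter
import math

def shannon_entropy(symbols):
    """Shannon entropy (in bits) of the symbol distribution in a sequence."""
    counts = Counter(symbols)
    n = len(symbols)
    return 0.0 - sum((c / n) * math.log2(c / n) for c in counts.values())

metronome = ["click"] * 16                          # one symbol, forever
monk = ["C", "Eb", "F#", "B", "D", "Ab", "E", "G"]  # varied, hard to predict

print(shannon_entropy(metronome))  # 0.0 bits: the next click is certain
print(shannon_entropy(monk))       # 3.0 bits: eight equally likely notes
```

A real Monk solo has temporal structure that a note-frequency count can't see; capturing that structure is what sequence-aware measures like sample entropy and Lempel-Ziv complexity, discussed below, are for.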
This distinction matters enormously for the brain. A healthy conscious brain isn't random. It's complex. Its signals are unpredictable enough to carry rich information but structured enough to support coherent thought and experience. Scientists sometimes call this the "edge of criticality," a sweet spot between rigid order and total randomness where information processing capacity is maximized.
Your brain, when you're fully conscious and engaged with the world, lives right on that edge.
The Entropic Brain Hypothesis: Robin Carhart-Harris's Big Idea
In 2014, a British neuroscientist named Robin Carhart-Harris published a paper that connected these dots in a way nobody had quite done before. Working at Imperial College London, Carhart-Harris had been studying what happens to the brain under the influence of psychedelics, specifically psilocybin (the active compound in magic mushrooms). And the data he was seeing told a very specific story.
Under psilocybin, brain entropy went up. Not a little. A lot. Across multiple brain regions, the complexity and unpredictability of neural signals increased significantly compared to normal waking consciousness. Functional connectivity patterns that were usually stable became fluid and dynamic. Brain networks that normally didn't talk to each other started exchanging information.
And here was the thing that made Carhart-Harris's paper so provocative: the subjects weren't just showing increased entropy. They were reporting the most profound, meaningful, awareness-expanding experiences of their lives.
Carhart-Harris proposed what he called the entropic brain hypothesis. The core claim is deceptively simple: the quality and intensity of conscious experience maps directly onto the entropy of brain activity. More entropy means more consciousness. Not "more" in the sense of being more alert (you can be very alert and have moderate entropy), but "more" in the sense of having a richer, more varied, more expansive quality of experience.
He laid out a spectrum. At the bottom sits deep unconsciousness: coma, general anesthesia, dreamless sleep, the states with the lowest brain entropy and the most ordered, predictable signals. Ordinary waking consciousness, the state you're in right now, occupies a middle range. And at the top, where brain signals are most complex and unpredictable, sit psychedelic states, certain types of meditation, mystical experiences of the kind described for millennia, and the early stages of psychosis. The healthy brain navigates this spectrum constantly, spending most of its time in the middle range while briefly visiting higher- and lower-entropy states throughout the day.
What Imaging Actually Shows: Measuring Entropy Across Brain States
The entropic brain hypothesis would just be a clever idea if you couldn't test it. But you can. And researchers have, across dozens of studies using every major brain imaging modality.
Psychedelics: Entropy Goes Up
This is the most dramatic and well-replicated finding. Studies using fMRI, EEG, and MEG have shown that psychedelics including psilocybin, LSD, DMT, and ayahuasca all increase brain entropy compared to normal waking states.
A landmark 2019 study by Carhart-Harris and colleagues used fMRI to measure entropy across the entire cortex during psilocybin sessions. The increases weren't localized to one region. They were global. And critically, the degree of entropy increase correlated with the intensity of subjective experience. People who reported the most vivid imagery, the strongest sense of ego dissolution, and the deepest sense of meaning also showed the largest entropy increases.
EEG studies have confirmed this pattern using Lempel-Ziv complexity (LZc), a measure of how compressible a signal is. A highly compressible signal has low complexity (low entropy). A signal that resists compression has high complexity (high entropy). Under LSD, Lempel-Ziv complexity increases across virtually all EEG channels, and the increase correlates with subjective reports of altered consciousness.
Anesthesia: Entropy Drops
The opposite end of the spectrum is equally consistent. As patients go under general anesthesia, every measure of brain entropy drops. Sample entropy, permutation entropy, Lempel-Ziv complexity: all decline as consciousness fades.
This finding is so strong that it's been commercialized. GE Healthcare's Entropy Module, used in operating rooms worldwide, computes two entropy measures from frontal EEG, state entropy (SE) and response entropy (RE), to monitor anesthetic depth in real time. When entropy drops below a certain threshold, the anesthesiologist knows the patient is adequately sedated. When it starts to climb, the patient might be waking up.
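The clinical modules involve more machinery than this, but the quantity they build on, spectral entropy, is simple: Shannon entropy computed over the distribution of power across frequencies. Here's a toy numpy sketch with synthetic stand-in signals; the real monitors use specific frequency bands and time windows that aren't reproduced here.

```python
import numpy as np

def spectral_entropy(sig):
    """Normalized Shannon entropy of the power spectrum: near 0 when one
    rhythm dominates, near 1 when power is spread across all frequencies."""
    psd = np.abs(np.fft.rfft(sig)) ** 2
    psd = psd / psd.sum()
    nonzero = psd[psd > 0]
    h = -np.sum(nonzero * np.log2(nonzero))
    return h / np.log2(len(psd))

fs = 256
t = np.arange(4 * fs) / fs               # 4 seconds at 256 Hz
slow_wave = np.sin(2 * np.pi * 2 * t)    # anesthesia-like 2 Hz delta rhythm
rng = np.random.default_rng(0)
broadband = rng.standard_normal(t.size)  # waking-like broadband activity

print(round(spectral_entropy(slow_wave), 2))  # ~0.0: one dominant rhythm
print(round(spectral_entropy(broadband), 2))  # ~0.9: power spread everywhere
```

The same asymmetry drives the clinical use case: as anesthetic depth increases and slow rhythms take over the EEG, this number falls.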
| Conscious State | Brain Entropy Level | Signal Characteristics | Imaging Evidence |
|---|---|---|---|
| Deep sleep / Coma | Very low | Repetitive slow waves, highly predictable | EEG, fMRI show decreased complexity across cortex |
| General anesthesia | Low | Ordered, synchronous patterns, burst suppression | Frontal EEG entropy monitored clinically |
| Normal waking | Moderate | Complex, structured, balanced between order and randomness | Baseline for all entropy comparisons |
| Focused meditation | Moderate to high | Increased frontal complexity, reduced default mode rigidity | EEG shows region-specific entropy increases |
| REM dreaming | Moderate to high | Similar complexity to waking, with different network dynamics | fMRI entropy approaches waking levels |
| Psychedelic states | High | Globally increased complexity, novel connectivity patterns | fMRI and EEG show broad entropy increases |
Sleep: A Nightly Entropy Cycle
Every night, your brain runs through its own entropy roller coaster. During the transition from wakefulness to NREM sleep, entropy drops progressively. Stage 1 sleep shows a mild decrease. Stage 2 shows more. By the time you hit deep slow-wave sleep (stage 3), brain entropy is at its lowest point of the daily cycle. The signals become dominated by large, slow, highly predictable delta brainwaves.
Then something remarkable happens. During REM sleep, entropy bounces back up, approaching levels close to waking consciousness. This makes sense when you think about it. REM sleep is when you dream, and dreams, whatever their purpose, are vividly conscious experiences. The brain during REM is almost as complex as the brain during waking, which is consistent with the subjective experience of being "somewhere" and doing "something" in a dream, even if that something involves flying over a city made of cheese.
Disorders of Consciousness: Entropy as a Diagnostic Tool
Perhaps the most clinically significant application of brain entropy measures involves patients with disorders of consciousness. People in vegetative states, minimally conscious states, or coma present a deeply difficult diagnostic challenge. Some patients who appear unresponsive may have more residual awareness than their behavior suggests. Distinguishing between a truly vegetative state and a minimally conscious state matters enormously for treatment decisions and prognosis.
EEG entropy measures have shown promise here. A 2013 study by Casali and colleagues introduced a metric called the perturbational complexity index (PCI), which combines transcranial magnetic stimulation (TMS) with EEG to measure how the brain responds to perturbation. Conscious brains respond to TMS with complex, differentiated patterns. Unconscious brains respond with simple, stereotyped patterns.
The results were striking. PCI reliably distinguished between conscious and unconscious states with an accuracy that surpassed behavioral assessments. Some patients who were behaviorally classified as vegetative showed PCI values consistent with minimal consciousness, suggesting they had more awareness than anyone realized.
The "I Had No Idea" Moment: Your Brain's Entropy Changes Within Seconds
Most people assume that brain entropy is a slow, stable property, something that characterizes your overall state of consciousness like a thermostat setting. It isn't. EEG studies show that entropy fluctuates on a timescale of seconds, not minutes or hours.
When you shift your attention from a boring task to something genuinely engaging, entropy in frontal and parietal regions increases within seconds. When you close your eyes and relax, occipital entropy drops almost immediately as alpha brainwaves take over and the signal becomes more predictable. When a sudden loud noise startles you, entropy spikes briefly across the whole cortex as your brain scrambles to process the unexpected input.
Your conscious experience isn't sitting at one entropy level. It's constantly fluctuating, riding a wave of complexity that rises and falls with every shift in attention, every new thought, every sensory surprise. The richness of your moment-to-moment experience is literally a moving target, and EEG can track those shifts in real time.
This rapid fluctuation is part of what makes EEG so valuable for entropy research. fMRI has better spatial resolution, so it can tell you where in the brain entropy is changing. But it's slow, with a time resolution of about one to two seconds at best. EEG captures changes at the millisecond level, which means it can track the kind of rapid entropy fluctuations that characterize the moment-to-moment texture of conscious experience.

How Scientists Actually Compute Brain Entropy
If you're wondering how you take a squiggly EEG line and turn it into a number that quantifies "how complex" it is, that's a reasonable question. Several algorithms exist, each capturing a slightly different aspect of signal complexity.
Sample entropy (SampEn). This measures the probability that similar patterns in the data will remain similar at the next time step. A signal where short patterns keep repeating in predictable ways has low sample entropy. A signal where similar-looking segments diverge unpredictably has high sample entropy. SampEn is one of the most widely used measures in EEG consciousness research because it's relatively robust to noise and works well with the signal lengths typical of EEG recordings.
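Sample entropy falls straight out of that definition. The following is a brute-force sketch (quadratic in signal length, so fine for demos but too slow for long recordings), using the common defaults of m = 2 and r = 0.2 times the signal's standard deviation; both parameter choices are conventions, not requirements.

```python
import math
import random

def sample_entropy(x, m=2, r=None):
    """SampEn = -ln(A / B): B counts pairs of length-m templates that match
    within tolerance r; A counts the same pairs extended to length m + 1."""
    if r is None:
        mean = sum(x) / len(x)
        r = 0.2 * math.sqrt(sum((v - mean) ** 2 for v in x) / len(x))

    def matching_pairs(length):
        n = len(x)
        count = 0
        for i in range(n - length):
            for j in range(i + 1, n - length):
                if all(abs(x[i + k] - x[j + k]) <= r for k in range(length)):
                    count += 1
        return count

    b, a = matching_pairs(m), matching_pairs(m + 1)
    if a == 0 or b == 0:
        return float("inf")  # no repeating templates at all
    return -math.log(a / b)

regular = [0, 1] * 40                          # a metronome-like signal
rng = random.Random(42)
irregular = [rng.random() for _ in range(80)]  # unpredictable samples

print(sample_entropy(regular) < sample_entropy(irregular))  # True
```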
Permutation entropy (PE). This approach converts the time series into a sequence of ordinal patterns (whether each data point is higher or lower than its neighbors), then calculates the entropy of that pattern sequence. If the same ordinal patterns keep recurring, PE is low. If the patterns are varied and unpredictable, PE is high. Permutation entropy is computationally efficient and has been validated across multiple consciousness studies.
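Permutation entropy is short enough to implement in full. This sketch uses order-3 ordinal patterns and normalizes by log2(3!) so the result lands in [0, 1]; for simplicity it ignores the tie-breaking refinements that published implementations add.

```python
import math
import random
from collections import Counter

def permutation_entropy(x, order=3):
    """Normalized entropy of the distribution of ordinal (rank) patterns
    across sliding windows of length `order`."""
    patterns = Counter()
    for i in range(len(x) - order + 1):
        window = x[i:i + order]
        patterns[tuple(sorted(range(order), key=window.__getitem__))] += 1
    total = sum(patterns.values())
    h = 0.0 - sum((c / total) * math.log2(c / total) for c in patterns.values())
    return h / math.log2(math.factorial(order))  # scale into [0, 1]

ramp = list(range(100))                     # monotonic: one ordinal pattern
rng = random.Random(1)
noisy = [rng.random() for _ in range(100)]  # all six patterns show up

print(permutation_entropy(ramp))             # 0.0: perfectly predictable
print(round(permutation_entropy(noisy), 2))  # close to 1.0
```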
Lempel-Ziv complexity (LZc). Originally developed for data compression, Lempel-Ziv measures how many distinct patterns exist in a binary sequence. To apply it to EEG, you first convert the signal to binary by thresholding it at the median. Then the algorithm counts how many unique subsequences appear. A repetitive signal can be described with few subsequences (low LZc). A complex signal requires many (high LZc). This is the metric that showed such dramatic increases under psychedelics.
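The phrase-counting core of the LZ76 scheme is only a few lines. In this sketch the "EEG" samples are toy numbers, and the raw phrase count is left unnormalized; real analyses divide by a factor of the signal length so recordings of different durations are comparable.

```python
import statistics

def lz_complexity(bits):
    """Count phrases in the Lempel-Ziv (1976) parsing of a binary string:
    each phrase is the shortest chunk not yet seen as a substring of
    everything parsed before it."""
    n, i, phrases = len(bits), 0, 0
    while i < n:
        length = 1
        # grow the candidate phrase while it still appears earlier on
        while i + length <= n and bits[i:i + length] in bits[:i + length - 1]:
            length += 1
        phrases += 1
        i += length
    return phrases

print(lz_complexity("01" * 8))            # 3: repetitive, highly compressible
print(lz_complexity("0001101001000101"))  # 6: many distinct patterns

# Binarize a toy signal at its median, as described above, then count phrases.
signal = [0.3, 1.2, -0.5, 0.8, -1.1, 0.9, 0.1, -0.7]
bits = "".join("1" if v > statistics.median(signal) else "0" for v in signal)
print(bits, lz_complexity(bits))          # 11010100 3
```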
Perturbational complexity index (PCI). The most sophisticated approach, PCI combines a TMS pulse (to perturb the brain) with EEG recording (to measure the response). It then computes the Lempel-Ziv complexity of the spatiotemporal pattern that follows the pulse. PCI captures not just the complexity of spontaneous activity but the brain's capacity to generate complex responses, which may be an even better proxy for consciousness.
| Entropy Metric | What It Measures | Best For | Limitation |
|---|---|---|---|
| Sample entropy | Predictability of pattern continuation | General EEG entropy analysis, real-time monitoring | Sensitive to parameter choices |
| Permutation entropy | Diversity of ordinal patterns | Fast computation, robust to noise | Loses amplitude information |
| Lempel-Ziv complexity | Signal compressibility | Psychedelic and anesthesia research | Requires binarization of signal |
| Perturbational complexity index | Complexity of brain's response to stimulation | Clinical consciousness assessment | Requires TMS equipment |
Why Entropy, Not Just Activity?
You might be wondering: why go through all this complexity math? Can't you just look at which brain regions are active?
Here's why that doesn't work. During certain types of seizures, brain activity is extremely high. Neurons are firing like crazy. But consciousness is absent. The person is not "there." Similarly, during REM sleep, many brain regions show activity levels comparable to waking, but the quality of consciousness is different, fragmented and dreamlike rather than coherent and reflective.
Activity alone can't distinguish between these states. Entropy can.
The reason has to do with something called differentiation and integration, two concepts that Giulio Tononi incorporated into his Integrated Information Theory (IIT) of consciousness. A conscious brain needs to be both differentiated (many different possible states) and integrated (those states are coordinated into a unified experience). Entropy captures differentiation: how many distinct patterns the brain can produce. And comparing entropy across channels captures something about the balance between the two: whether different brain regions are coordinating their activity or each doing their own thing.
When you're under anesthesia, brain activity isn't gone. It's still there, but it's become homogeneous, the same patterns repeating everywhere. The brain has lost its differentiation. When you're in a seizure, activity is high but it's also homogeneous, all neurons firing in lockstep. Again, differentiation collapses.
Consciousness, it turns out, requires not just an active brain but a complex brain. One that's doing many different things simultaneously while still holding it all together.
What This Means for Measuring Your Own Brain
Here's where things get practical.
You don't need a million-dollar fMRI scanner or a TMS lab to get meaningful entropy measurements from your brain. Multi-channel EEG is enough. The computational methods (sample entropy, permutation entropy, Lempel-Ziv complexity) all work on raw EEG data. And the differences between states are large enough to detect with consumer-grade hardware, provided you have enough channels and a high enough sampling rate.
This is why channel count matters so much for entropy analysis. A single-channel EEG device can give you a temporal entropy measure at one location. That's useful but limited. With 8 channels distributed across both hemispheres and all major lobes, you get a spatial map of complexity. You can see whether frontal entropy differs from parietal entropy. You can track how entropy changes propagate across the cortex. You can compute cross-channel measures that capture something about how independently different brain regions are behaving.
The Neurosity Crown's 8 EEG channels, positioned at CP3, C3, F5, PO3, PO4, F6, C4, and CP4, cover frontal, central, parietal, and parieto-occipital regions across both hemispheres. That distribution is well-suited for entropy analysis because it captures activity from all major cortical lobes. The raw signal data at 256Hz provides the temporal resolution needed for sample entropy and permutation entropy calculations, both of which require dense time-series data to produce reliable estimates.
And because the Crown's SDKs (JavaScript and Python) give you direct access to raw EEG data, you can pipe those signals into your own entropy analysis pipeline. Libraries like MNE-Python and Antropy (a Python library specifically designed for entropy and complexity analysis of neural signals) make the computation straightforward.
The Edge of Criticality: Where Consciousness Lives
There's one more piece of this story that pulls everything together, and it's the part that keeps physicists and neuroscientists up at night.
Complex systems in nature, from weather patterns to ecosystems to stock markets, tend to organize themselves near a state called criticality. This is the boundary between order and disorder, the narrow zone where a system is structured enough to maintain coherent patterns but flexible enough to generate novel ones. Systems at criticality show characteristic signatures: long-range correlations, power-law distributions, and maximal sensitivity to perturbation.
The healthy conscious brain shows every one of these signatures.
Power-law scaling in the size of neural avalanches. Long-range temporal correlations in EEG signals. Maximum sensitivity to sensory input. The conscious brain appears to be a system that has tuned itself to the edge of criticality, the exact spot where entropy is high enough to support rich information processing but not so high that everything dissolves into noise.
And here's what makes this profound. When consciousness is reduced, whether by anesthesia, sleep, or brain injury, the brain moves away from criticality. Under anesthesia, neural dynamics become subcritical: too ordered, too predictable, too locked into repetitive patterns. In certain seizure states, dynamics become supercritical: too random, too uncorrelated, too disorganized.
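You can see the subcritical/supercritical divide in a toy branching-process model of neural avalanches (a standard physics idealization, not an EEG analysis): one seed spike triggers successors with a mean count equal to the branching ratio, and the avalanche statistics change character as that ratio approaches 1.

```python
import random

def avalanche_size(branching_ratio, rng, cap=10_000):
    """Total spikes set off by one seed spike when each spike excites
    0, 1, or 2 successors with mean equal to the branching ratio."""
    active, total = 1, 1
    p = branching_ratio / 2  # per-successor firing probability
    while active and total < cap:
        fired = sum((rng.random() < p) + (rng.random() < p)
                    for _ in range(active))
        total += fired
        active = fired
    return total

rng = random.Random(0)
subcritical = [avalanche_size(0.50, rng) for _ in range(2000)]    # dies out fast
near_critical = [avalanche_size(0.95, rng) for _ in range(2000)]  # huge spread

print(sum(subcritical) / len(subcritical))      # ~2, i.e. 1 / (1 - 0.5)
print(sum(near_critical) / len(near_critical))  # ~20, with a far heavier tail
```

Push the branching ratio above 1 and the process explodes (hence the cap), mirroring the runaway, seizure-like supercritical regime.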
Consciousness, in this framework, isn't just correlated with a particular entropy level. Consciousness is what it feels like to be a complex adaptive system operating at criticality. The richness of your experience, the vividness of your perceptions, the depth of your thoughts, all of it tracks with how close your brain is to that edge.
And that's measurable. With EEG. In real time.
What Comes Next
The field of brain entropy research is still young. Carhart-Harris's original entropic brain hypothesis paper was published just over a decade ago, and the tools for computing entropy from neural signals are improving rapidly. But the implications are already enormous.
For clinical neuroscience, entropy-based measures offer a way to track consciousness in patients who can't report their own experience. This matters for anesthesia monitoring, for assessing patients with disorders of consciousness, and for tracking the neural effects of interventions in psychiatric conditions. Depression, for instance, has been associated with reduced brain entropy, a finding consistent with the subjective experience of depression as a narrowing of mental life, a loss of cognitive flexibility and experiential richness.
For meditation and contemplative practice, entropy provides an objective metric for something practitioners have described subjectively for centuries: the expansion of awareness. Experienced meditators consistently show altered entropy profiles compared to novices, and these changes correlate with self-reported depth of meditative experience.
For anyone interested in their own cognitive performance, entropy is a window into the quality of your conscious state that goes beyond simple metrics like "focused" or "relaxed." A high-entropy brain state during creative work might look very different from a high-entropy state during analytical problem-solving, and both look different from the elevated entropy of mind-wandering. Tracking these patterns over time could reveal things about your cognitive life that no mood survey or productivity app could capture.
We are still early in understanding what brain entropy means for individual human beings going about their daily lives. But the basic science is clear. The complexity of your brain's signals isn't just a mathematical curiosity. It's a fingerprint of your conscious experience. And for the first time in history, you can track that fingerprint outside of a lab, in your own home, while doing your own work.
That's not a small thing. The question of what consciousness is has haunted philosophy for millennia. The discovery that it leaves a measurable, quantifiable trace in the electrical activity of your brain, one that can be captured by sensors sitting on your scalp, is the kind of finding that deserves to keep you up at night. In a good way.

