Sample Entropy in EEG
The Most Important Thing About Your Brain Signals That Nobody Talks About
Here's something that should bother you about most EEG analysis.
When you put on an EEG headset, the standard approach is to break the signal into frequency bands. How much alpha? How much beta? What's your theta-to-beta ratio? This is useful. It tells you a lot about your brain state. But it misses something fundamental.
It misses the structure of the signal itself.
Think about it this way. Imagine two people's EEG recordings that have the exact same amount of alpha, beta, theta, and gamma power. Identical power spectra. A traditional frequency analysis would call these two brain states equivalent. But one person is wide awake, making complex decisions, and the other is slipping into early-stage dementia. Their brains look the same through the lens of frequency, but they're profoundly different through the lens of something else.
That something else is complexity. And the tool that measures it is called sample entropy.
Sample entropy is a single number that captures how unpredictable your brain signal is. Not how fast or slow the oscillations are. Not how powerful they are. How irregular they are. How surprising the next data point is given everything that came before it.
And this one number can tell the difference between consciousness and unconsciousness, between a young brain and an aging brain, between healthy cognition and the early stages of neurodegeneration. It does things that power spectral analysis simply cannot do.
Yet if you've been reading about EEG, there's a good chance you've never heard of it.
Why Regularity Matters: The Heartbeat Thought Experiment
Before we get into the math, let's build some intuition.
Think about your heartbeat. A healthy heart doesn't beat like a metronome. It varies. The interval between one beat and the next differs slightly from the interval that follows it. This variability, called heart rate variability (HRV), is actually a sign of health. A heart that beats with rigid, clockwork precision is often a sick heart.
The same principle applies to your brain.
A healthy, awake brain produces signals that are complex, variable, and hard to predict. The voltage at one moment gives you limited information about what the voltage will be 10 milliseconds later. There's structure, but it's layered, intricate, and constantly shifting.
Now consider a brain under general anesthesia. The signal becomes rhythmic, repetitive, almost monotonous. Patterns start repeating. If you see a particular sequence of voltages, you can make a pretty good guess about what comes next. The signal has become predictable.
This is the core insight behind entropy analysis: complexity reflects the richness of information processing. A brain doing complex things produces complex signals. A brain doing less, whether because of sleep, anesthesia, or disease, produces simpler, more repetitive ones.
Sample entropy is how we put a number on that.
Pattern Matching: How Sample Entropy Actually Works
The math behind sample entropy is more intuitive than you might expect. Here's the logic, step by step.
You start with a raw EEG signal: a long sequence of voltage values, sampled at some rate (256 times per second for the Neurosity Crown, for example). Your job is to figure out how repetitive this sequence is.
Step 1: Pick a pattern length. You choose a value called m, the embedding dimension. This is the length of the short patterns you'll be comparing. For EEG, m is almost always set to 2. That means you'll be looking at patterns that are 2 data points long.
Step 2: Pick a tolerance. You choose a value called r, the tolerance threshold. This defines how close two data points need to be to count as "matching." For EEG, r is typically set to 0.1 to 0.25 times the standard deviation of the signal. This accounts for the fact that biological signals are noisy, so you don't require an exact match, just a close one.
Step 3: Count the matches at length m. Slide a window of length m along the entire signal. For each position, you now have a short template pattern. Compare that template to every other template of the same length. Count how many of them match (meaning every data point in the pattern is within the tolerance r of the corresponding point in the template). Call this count B.
Step 4: Count the matches at length m+1. Now do the same thing, but with patterns that are one data point longer (length m+1, which is 3 if m is 2). This is the critical step. You're asking: of all the pairs that matched at length 2, how many of them still match when you extend the pattern by one more point?
Call this count A.
Step 5: Calculate sample entropy. The formula is:
SampEn = -ln(A / B)
That's the negative natural logarithm of the ratio of matches at length m+1 to matches at length m.
If A is close to B (most patterns that match at length 2 also match at length 3), then A/B is close to 1, ln(1) is 0, and sample entropy is low. The signal is predictable. Knowing the current pattern tells you what comes next.
If A is much smaller than B (patterns that match at length 2 often diverge at length 3), then A/B is small, ln(small number) is a large negative number, and the negative sign makes sample entropy high. The signal is unpredictable. Even when you see a familiar pattern, you can't predict what follows.
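To make that concrete: if B = 100 template pairs match at length 2 and A = 80 of them still match at length 3, then SampEn = -ln(80/100) ≈ 0.22, a fairly regular signal. If only 10 of those pairs survive the extension, SampEn = -ln(10/100) ≈ 2.3, a highly irregular one.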
That's the whole algorithm. It's beautiful in its simplicity. You're measuring one thing: how often does knowing the present help you predict the future? In a complex, information-rich signal, the answer is "not very often." In a simple, repetitive signal, the answer is "almost always."
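If you'd like to see the algorithm in code, here is a minimal from-scratch sketch in Python with NumPy. It follows the steps above; the function name, parameter defaults, and toy test signals are illustrative choices, and it's written for clarity rather than speed (real analyses would normally use an optimized library, as discussed later).

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.15):
    """Sample entropy of a 1-D signal: template length m,
    tolerance r = r_factor * standard deviation of the signal."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    r = r_factor * np.std(x)

    def count_matching_pairs(length):
        # Use the same set of starting indices (0 .. n-m-1) at both lengths,
        # as in the Richman-Moorman definition, and count each pair (i, j),
        # i < j, once; self-matches are excluded by construction.
        templates = np.array([x[i:i + length] for i in range(n - m)])
        count = 0
        for i in range(len(templates) - 1):
            # Chebyshev distance: a pair matches if every point is within r.
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= r)
        return count

    b = count_matching_pairs(m)      # B: matching pairs at length m
    a = count_matching_pairs(m + 1)  # A: pairs that still match at length m + 1
    if a == 0 or b == 0:
        return np.inf                # no matches: entropy is undefined / infinite
    return -np.log(a / b)

# Quick sanity check: a barely-noisy sine should score much lower than white noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 4, 1024)
print(sample_entropy(np.sin(2 * np.pi * 10 * t) + 0.05 * rng.standard_normal(1024)))
print(sample_entropy(rng.standard_normal(1024)))
```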
The Parameters Matter: Choosing m and r
One thing that trips people up is that sample entropy isn't a single, absolute value like temperature. It depends on the parameters you choose, particularly m and r.
| Parameter | Typical EEG Value | What It Controls |
|---|---|---|
| m (embedding dimension) | 2 | The length of template patterns. Higher m looks at longer patterns but needs more data to be reliable. |
| r (tolerance) | 0.1 to 0.25 times SD | How close values must be to count as a match. Too low and nothing matches. Too high and everything matches. |
| N (data length) | 256 to 1024 points | The number of data points in the epoch. More data gives more stable entropy estimates. |
The convention in EEG research is m=2 and r=0.15 times the standard deviation of the signal. These values work well for EEG because the signal has enough structure at length-2 patterns to be informative, and the tolerance scales with the signal's amplitude, making results comparable across different participants and sessions.
Here's the important part: you must keep these parameters identical across any comparisons. Comparing sample entropy values computed with different m and r values is meaningless. It's like comparing temperatures measured in Celsius and Fahrenheit without converting.
Sample entropy was introduced by Richman and Moorman in 2000 as a refinement of an earlier measure called approximate entropy (ApEn), developed by Pincus in 1991. The problem with approximate entropy is that it counts self-matches. Every pattern is compared to itself, which always counts as a match, and this introduces a bias that makes the result dependent on the length of the data. Sample entropy fixes this by excluding self-matches. The result is a measure that's more consistent across different data lengths and more reliable for short time series, which is exactly what you need for EEG analysis where epochs are often just a few seconds long.
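If you want to see that length-dependence for yourself, here is a small sketch using the antropy package (mentioned later in this article), which provides both measures; the white-noise test signal and the specific lengths are arbitrary illustrative choices.

```python
import numpy as np
import antropy as ant

# Approximate entropy counts self-matches, so its value drifts with the amount
# of data; sample entropy excludes them and is far less length-dependent.
rng = np.random.default_rng(1)
signal = rng.standard_normal(4096)   # white noise as a stand-in test signal

for n in (256, 512, 1024, 2048, 4096):
    segment = signal[:n]
    apen = ant.app_entropy(segment, order=2, metric="chebyshev")
    sampen = ant.sample_entropy(segment, order=2, metric="chebyshev")
    print(f"N={n:5d}  ApEn={apen:.3f}  SampEn={sampen:.3f}")

# Expectation (hedged): ApEn tends to creep upward as N grows, because the
# relative weight of self-matches shrinks; SampEn should vary much less.
```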
What High and Low Entropy Tell You About the Brain
Now for the payoff. What does sample entropy actually reveal about brain function?
High Sample Entropy: A Brain That's Working Hard
When sample entropy is high, the EEG signal is complex and irregular. This typically means the brain is engaged in rich, distributed processing. Neurons across different regions are firing in complex, non-repetitive patterns. Information is flowing along diverse pathways.
You see high sample entropy during:
- Alert wakefulness, especially during cognitively demanding tasks
- Active problem-solving and decision-making
- Healthy brain function in young and middle-aged adults
- Creative thinking, when the brain is making novel associations
Low Sample Entropy: A Brain That's Repeating Itself
When sample entropy is low, the EEG signal is regular and predictable. Large populations of neurons have fallen into lockstep, producing repetitive patterns. The diversity of information processing has narrowed.
You see low sample entropy during:
- Deep sleep, when slow, rhythmic delta brainwaves dominate
- General anesthesia, when consciousness is chemically suppressed
- Epileptic seizures, when neural populations synchronize abnormally
- Advanced neurodegenerative disease, when neural networks have been damaged
The pattern is clear: complexity tracks with consciousness and cognitive capacity. More complex signals mean more information processing. More regular signals mean less.
Anesthesia Monitoring: Where Entropy Saved Lives
One of the most powerful applications of sample entropy is in the operating room. And the story of how it got there is a reminder that mathematical abstractions can have profoundly practical consequences.
General anesthesia is one of the most common medical procedures on the planet. Millions of surgeries happen every year, and for each one, an anesthesiologist must keep the patient in a precise window: unconscious enough to feel no pain, but not so deeply sedated that their vital functions are compromised.
Here's the terrifying part: for decades, anesthesiologists had no reliable, real-time measure of consciousness. They monitored heart rate, blood pressure, and muscle relaxation. But these are indirect indicators. It's possible for a patient to be hemodynamically stable (normal heart rate and blood pressure) while being partially aware during surgery. This is called intraoperative awareness, and it happens in roughly 1 to 2 cases per 1,000 surgeries. For the patients it happens to, it can be deeply traumatic.
Entropy-based EEG monitoring changed this. Devices like the GE Datex-Ohmeda Entropy Module compute entropy values from frontal EEG in real time (the commercial module uses a spectral-entropy variant rather than sample entropy itself, but the principle is the same), giving the anesthesiologist a continuous readout of brain signal complexity. When entropy drops below a threshold, the brain has transitioned into unconsciousness. When it starts climbing, the patient may be getting lighter.

The key discovery that made this possible: sample entropy of frontal EEG reliably decreases as anesthetic depth increases, regardless of which anesthetic drug is used. Propofol, sevoflurane, desflurane, all of them produce the same pattern. Consciousness goes down, entropy goes down. This universality is what made entropy a viable clinical tool. It doesn't measure the drug. It measures the brain's state.
Aging and the Complexity Cliff
Here's an "I had no idea" fact that genuinely changed how I think about the brain.
Starting around age 60, the sample entropy of resting-state EEG begins to decline. Not dramatically, not suddenly, but measurably. Year by year, the brain's signals become slightly more regular, slightly more predictable, slightly less complex.
This is not just a curiosity. It tracks with real cognitive changes.
A 2005 study by McIntosh and colleagues in Neuron demonstrated that brain signal complexity (measured using a related metric called multiscale entropy) correlates with behavioral variability and cognitive performance. Older adults with more complex brain signals performed more consistently and accurately on cognitive tasks. Those with less complex signals showed more variable, less accurate performance.
The interpretation is striking. A young, healthy brain maintains a high-entropy state because its neural networks are intact, richly connected, and capable of producing a wide repertoire of activation patterns. As aging, neurodegeneration, or disease damages these networks, the repertoire shrinks. The brain falls into lower-dimensional patterns. Fewer pathways, fewer possible states, less complexity, lower entropy.
Several conditions are associated with reduced EEG sample entropy compared to healthy, age-matched controls:
- Alzheimer's disease: One of the earliest and strongest findings. EEG complexity declines before clinical symptoms become apparent, making it a potential early biomarker.
- Vascular dementia: Similar complexity reductions, particularly over temporal and parietal regions.
- Traumatic brain injury: Acute and chronic reductions in entropy, correlating with injury severity.
- Epilepsy: Dramatic entropy drops during seizures, with potential for pre-seizure detection based on entropy decline.
- Major depression: Modest but significant complexity reductions, particularly in frontal regions.
This idea, that brain complexity is a kind of "cognitive reserve" indicator, is still being actively researched. But the data is accumulating fast. Sample entropy and its relatives may eventually serve as early warning systems for cognitive decline, catching changes years before traditional cognitive tests reveal a problem.
Beyond the Frequency Spectrum: What Entropy Captures That Power Can't
If you're familiar with EEG frequency analysis, you might be wondering: why do we need entropy at all? Can't we just look at power spectral density?
The short answer is no. And the reason is important.
Power spectral analysis decomposes a signal into its frequency components. It tells you how much energy exists at each frequency. This is incredibly useful. But it makes an assumption: it treats the signal as if it were a sum of sine waves. It captures the linear structure of the signal.
Sample entropy captures something different. It measures the nonlinear structure, the patterns and regularities that exist in the raw time series regardless of which frequencies are present.
| Feature | Power Spectral Analysis | Sample Entropy |
|---|---|---|
| What it measures | Energy at each frequency | Regularity of time-series patterns |
| Signal model | Sum of sine waves (linear) | Pattern matching (nonlinear) |
| Sensitive to | Oscillatory power changes | Changes in signal complexity and regularity |
| Misses | Nonlinear dynamics, coupling, complexity | Frequency-specific information |
| Best for | Identifying dominant brain rhythms | Tracking consciousness, detecting pathology, measuring cognitive capacity |
| Typical EEG use | Alpha/beta/gamma power, band ratios | Anesthesia depth, aging biomarker, seizure detection |
Think of it this way. A shuffled deck of cards and a sorted deck of cards contain the same cards (same "power spectrum"). But their arrangement is completely different. Entropy captures arrangement. Power spectral analysis captures content.
In practice, the most informative EEG analyses combine both approaches. Frequency analysis tells you which oscillations are present. Entropy analysis tells you how those oscillations are organized and how complex the overall brain dynamics are. Together, they give you a far richer picture than either one alone.
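Here is a small sketch of that idea in Python (assuming NumPy and the antropy package, which the next section mentions). It builds a phase-randomized surrogate of a chaotic signal: the surrogate keeps the original's amplitude spectrum, so its power spectrum is identical, but the arrangement of the samples, and therefore the sample entropy, changes.

```python
import numpy as np
import antropy as ant

# A deterministic but broadband "structured" signal: the chaotic logistic map.
n = 2048
x = np.empty(n)
x[0] = 0.4
for i in range(1, n):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])

# Phase-randomized surrogate: keep the FFT magnitudes, scramble the phases
# (leaving the DC and Nyquist bins alone so the result stays real-valued).
rng = np.random.default_rng(42)
spectrum = np.fft.rfft(x)
phases = rng.uniform(0, 2 * np.pi, len(spectrum))
phases[0] = 0.0
phases[-1] = 0.0
surrogate = np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n=n)

# The power spectra are (numerically) identical...
print(np.max(np.abs(np.abs(np.fft.rfft(surrogate)) - np.abs(spectrum))))

# ...but the regularity is not: the deterministic original is typically more
# predictable (lower SampEn) than its noise-like, linearized surrogate.
print("SampEn, logistic map:", ant.sample_entropy(x, order=2))
print("SampEn, surrogate:   ", ant.sample_entropy(surrogate, order=2))
```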
Computing Sample Entropy From Real EEG Data
If you're a developer or researcher who wants to compute sample entropy from actual EEG recordings, the process is surprisingly straightforward.
Here's the general workflow:
- Acquire raw EEG data. You need unprocessed voltage time series, not power spectral density, not frequency band averages, and not focus scores. You need the raw numbers. The Neurosity Crown gives you access to raw EEG at 256Hz across all 8 channels through the JavaScript and Python SDKs.
- Preprocess the signal. Apply a bandpass filter (typically 0.5 to 45 Hz for general entropy analysis) to remove DC drift and high-frequency noise. Remove obvious artifacts like eye blinks or large muscle movements. Segment the continuous recording into epochs, typically 1 to 4 seconds long.
- Set your parameters. Choose m=2 and r=0.15 times the standard deviation of each epoch. Stick with these values across all your analyses for consistency.
- Compute sample entropy. Use a library. In Python, the EntroPy package by Raphael Vallat provides a clean implementation: `entropy.sample_entropy(signal, order=2, metric='chebyshev')`. The `nolds` library and `antropy` are also popular choices.
- Interpret the results. Typical EEG sample entropy values (with m=2, r=0.15*SD) range from about 0.2 (very regular, as in deep anesthesia or seizure) to about 2.0 or higher (very complex, as in alert wakefulness during a demanding task). Values around 1.0 to 1.5 are common for relaxed wakefulness.
Sample entropy needs enough data points to find patterns. With m=2, you generally want at least 200 data points for a stable estimate. At 256Hz, that's roughly one second of data. For more reliable results, 2 to 4 seconds (512 to 1024 data points) is better. If your epochs are too short, entropy estimates become noisy and unreliable. If they're too long, you risk mixing different brain states within a single epoch.
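As a concrete illustration of that workflow, here is a minimal single-channel sketch that assumes the raw samples are already in a NumPy array (device I/O, channel selection, and artifact rejection are left out). It uses SciPy for filtering and the antropy implementation mentioned above; the function names, epoch length, and synthetic stand-in signal are illustrative choices, not part of any SDK.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt
import antropy as ant

FS = 256          # sampling rate in Hz (e.g., a raw consumer-EEG stream)
EPOCH_SEC = 2     # 2-second epochs = 512 samples, comfortably above ~200 points

def bandpass(signal, low=0.5, high=45.0, fs=FS, order=4):
    """Zero-phase Butterworth bandpass (0.5-45 Hz by default)."""
    sos = butter(order, [low, high], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, signal)

def epoch_sample_entropies(raw, fs=FS, epoch_sec=EPOCH_SEC):
    """Filter a raw single-channel recording, cut it into fixed-length epochs,
    and return one sample entropy value per epoch."""
    filtered = bandpass(raw, fs=fs)
    step = int(fs * epoch_sec)
    values = []
    for start in range(0, len(filtered) - step + 1, step):
        epoch = filtered[start:start + step]
        # order=2 corresponds to m = 2; antropy sets the tolerance relative to
        # the epoch's standard deviation internally (check its docs if you
        # need r = 0.15 * SD exactly).
        values.append(ant.sample_entropy(epoch, order=2, metric="chebyshev"))
    return np.array(values)

# Stand-in for a real recording: 60 seconds of synthetic noise.
rng = np.random.default_rng(7)
fake_raw = rng.standard_normal(60 * FS)
print(epoch_sample_entropies(fake_raw))
```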
The beauty of working with raw EEG is that you're not limited to the metrics that a device manufacturer chose to pre-compute. You can compute sample entropy, approximate entropy, multiscale entropy, permutation entropy, or any other complexity measure you want. All you need is the raw signal and some basic signal processing knowledge.
Multiscale Entropy: The Zoom Lens
Once you understand sample entropy, a natural question arises: at what timescale is the brain complex?
Standard sample entropy computes regularity at the timescale of your sampling rate. If you're sampling at 256Hz, you're looking at patterns that play out over a few milliseconds. But the brain operates at many timescales simultaneously. Some processes unfold over milliseconds, others over hundreds of milliseconds, others over seconds.
Multiscale entropy (MSE), introduced by Costa, Goldberger, and Peng in 2002, addresses this. The idea is to "coarse-grain" the signal at progressively larger scales before computing sample entropy at each scale.
At scale 1, you compute sample entropy on the original signal. At scale 2, you average every two consecutive data points together, creating a new signal at half the temporal resolution, then compute sample entropy. At scale 3, you average every three points. And so on.
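In code, the coarse-graining step is only a few lines. Here is a minimal sketch, again assuming the antropy package for the single-scale sample entropy (any SampEn implementation would do, and the scale range is an arbitrary choice).

```python
import numpy as np
import antropy as ant

def coarse_grain(x, scale):
    """Average non-overlapping blocks of `scale` consecutive samples."""
    n = (len(x) // scale) * scale            # drop any leftover tail
    return x[:n].reshape(-1, scale).mean(axis=1)

def multiscale_entropy(x, max_scale=10, order=2):
    """Sample entropy of the coarse-grained signal at scales 1..max_scale."""
    return np.array([
        ant.sample_entropy(coarse_grain(x, s), order=order)
        for s in range(1, max_scale + 1)
    ])

# Classic sanity check from Costa et al.: white noise loses complexity as the
# scale grows, whereas 1/f-like signals hold their entropy across scales.
rng = np.random.default_rng(3)
print(multiscale_entropy(rng.standard_normal(8192), max_scale=5))
```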
The result is an entropy curve: sample entropy plotted as a function of timescale. And these curves have fascinating signatures.
Healthy young brains show high entropy across many scales. The signal is complex whether you look at fast dynamics or slow dynamics. Aging brains and pathological brains often show reduced entropy at larger scales, meaning they lose complexity specifically in their slower dynamics. This suggests that age-related complexity loss isn't about fine-grained neural firing becoming simpler. It's about the large-scale coordination of brain networks breaking down.
In several studies, multiscale entropy curves have distinguished healthy aging from Alzheimer's disease with accuracy that rivals conventional neuropsychological testing. The technique is still primarily a research tool, but it's approaching clinical utility.
The Philosophy of Complexity: What Does Entropy Really Mean?
Here's where this gets genuinely mind-bending.
There's a concept in consciousness research called the "complexity theory of consciousness." It proposes that consciousness isn't about any specific brain region or frequency band. It's about the overall complexity of brain dynamics. A brain is conscious to the degree that it generates information that is simultaneously differentiated (many different possible states) and integrated (all those states are part of one unified system).
Sample entropy, while not a direct measure of consciousness in the philosophical sense, captures something closely related. It measures how many different patterns the brain can produce and how unpredictable those patterns are. A brain that can produce many different states (high entropy) is, by this framework, a brain that is processing a lot of information. A brain stuck in repetitive loops (low entropy) is processing less.
This connection between complexity and consciousness is why entropy measures work so well for anesthesia monitoring. They're not measuring any specific neural mechanism of anesthesia. They're measuring the consequence of losing consciousness itself: the collapse of complex neural dynamics into simpler, more repetitive patterns.
And this has implications far beyond the operating room. If brain signal complexity is a biomarker for consciousness and cognitive capacity, then tracking it over time could tell you something profound about your own brain health. Not whether you're focused or relaxed (frequency analysis handles that), but whether your brain is maintaining its full repertoire of possible states. Whether the orchestra is still playing all its instruments, or whether sections are going quiet.
Where the Field Is Heading
Sample entropy and its relatives are moving from research curiosities to practical tools. Here's what to watch.
Real-time entropy biofeedback. Just as neurofeedback uses frequency-band power to train brain states, researchers are exploring whether entropy-based feedback could train the brain to maintain higher complexity. Early results are preliminary, but the concept is compelling: instead of training "more alpha" or "less theta," you train "more complexity."
Early detection of neurodegeneration. Multiple research groups are working on EEG complexity metrics as screening tools for Alzheimer's disease and mild cognitive impairment. The appeal is enormous: a 5-minute EEG recording and an entropy calculation, versus expensive PET scans or invasive cerebrospinal fluid sampling.
Sleep quality assessment. Entropy measures track sleep stages differently than traditional frequency analysis. They may capture aspects of sleep quality, specifically the richness and complexity of neural processing during sleep, that power spectral analysis misses.
Brain-computer interfaces. Adding entropy features to BCI classification algorithms can improve performance, because entropy captures information about the user's brain state that frequency features alone do not.
The tools to explore all of this are available right now. A consumer EEG device with raw data access, a Python environment, and a few lines of code. That's the barrier to entry. It has never been lower.
Every Signal Has a Story. Complexity Is How You Read It.
The frequency content of your EEG tells you what your brain is doing. The entropy of your EEG tells you how well your brain is doing it.
That distinction matters. A brain generating high alpha power might be relaxed, or it might be disengaged. A brain generating lots of beta might be focused, or it might be anxious. Frequency alone can't always tell the difference. But when you combine frequency analysis with entropy analysis, the picture gets dramatically clearer. You're not just measuring the instruments in the orchestra. You're measuring how coordinated and complex the performance is.
Sample entropy is one of those tools that, once you understand it, you can't unsee its relevance. Every EEG signal you look at, you'll start wondering: how complex is this? How predictable? What is the entropy telling me that the power spectrum isn't?
Your brain is generating a signal right now that contains far more information than any frequency decomposition can capture. The tools to measure that information exist. The math is a few dozen lines of code. And the things it reveals about consciousness, about aging, about the fundamental nature of what it means for a brain to be working well, are only beginning to be understood.
The most interesting question in neuroscience might not be which frequency your brain runs on. It might be how unpredictable it dares to be.

