The Hilbert Transform: EEG's Hidden Dimension
Your Brainwaves Have a Secret: They're Not Just Loud or Quiet. They're Somewhere in a Cycle.
Picture an ocean wave. At any given moment, you can describe two things about it. How tall it is (amplitude) and where it is in its cycle: rising, cresting, falling, or bottoming out in the trough (phase).
Now picture an EEG oscillation. Same deal. Your alpha rhythm isn't just "strong" or "weak." At every instant, it has an amplitude and a phase. The alpha rhythm over your parietal cortex is somewhere in its cycle right now, rising and falling 10 times per second with a regularity that would make a Swiss watchmaker jealous.
Here's the problem. The most common tool for analyzing EEG, the Fast Fourier Transform, can't tell you any of this. FFT works by chopping your signal into windows, typically half a second to several seconds long, and averaging across the entire window. It tells you how much alpha power you had over that stretch of time. That's useful. But it's like describing an ocean by saying "the average wave height over the last five seconds was 2 meters." You've lost the moment-by-moment story. You can't tell when the wave crested. You can't tell when it was in its trough. You've averaged away the dynamics.
The Hilbert transform gets them back.
It's a mathematical operation that, when applied to a narrowband EEG signal, gives you the instantaneous amplitude and the instantaneous phase at every single sample point. Not averaged over a window. At every sample. At 256Hz, that's 256 times per second.
And this distinction, between knowing the average power in a time window and knowing the exact amplitude and phase at each moment, turns out to be the difference between "I can see that someone has alpha activity" and "I can see exactly when their alpha wave peaks, track how its strength fluctuates cycle by cycle, and detect whether fast gamma oscillations are locking onto specific phases of that alpha rhythm."
That last part is called phase-amplitude coupling. It's one of the most important discoveries in modern neuroscience. And without instantaneous amplitude and phase, which the Hilbert transform provides, you cannot compute it.
The Analytic Signal: A Beautiful Mathematical Trick
To understand what the Hilbert transform actually does, you need one concept: the analytic signal. Don't let the name intimidate you. The intuition is surprisingly visual.
Take a pure sine wave. Plot it on paper. It goes up, down, up, down, tracing the familiar S-curve. Now imagine lifting that 2D wave off the paper and wrapping it around a cylinder, like a helix (think: a Slinky stretched out along a rod). Looking at the Slinky from the side, you'd see the original sine wave. But the Slinky itself lives in 3D. At any point along it, you can describe its position by two numbers: how high it is (the sine wave you started with) and how far around the cylinder it's gone.
The analytic signal does something equivalent, but in the world of complex numbers instead of physical 3D space. You take your original real-valued EEG signal and pair it with a 90-degree shifted copy of itself. The original signal becomes the real part. The shifted copy becomes the imaginary part. Together, they trace a spiral in the complex plane.
That 90-degree shifted copy is what the Hilbert transform computes.
Once you have this complex-valued analytic signal, extracting amplitude and phase is trivially easy. The amplitude at any moment is just the distance from the center of the spiral (the modulus of the complex number). The phase is the angle around the spiral (the argument of the complex number). No windows. No averaging. Just geometry.
Given a real-valued signal x(t), the analytic signal z(t) is:
z(t) = x(t) + i * H[x(t)]
where H[x(t)] is the Hilbert transform of x(t).
Instantaneous amplitude (envelope): A(t) = |z(t)| = sqrt(x(t)^2 + H[x(t)]^2)
Instantaneous phase: phi(t) = atan2(H[x(t)], x(t))
Instantaneous frequency: f(t) = (1 / 2pi) * d(phi)/dt
In Python with SciPy, the entire operation is one line: analytic = hilbert(filtered_signal)
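Here is that one line in context, as a runnable sketch. The synthetic signal is an assumption for illustration: a 10 Hz "alpha" sine whose amplitude drifts slowly, standing in for an already-filtered EEG segment.

```python
import numpy as np
from scipy.signal import hilbert

fs = 256                                   # sampling rate in Hz
t = np.arange(0, 2, 1 / fs)
# Synthetic narrowband stand-in for filtered alpha: a 10 Hz sine
# whose amplitude drifts slowly at 0.5 Hz
x = (1 + 0.5 * np.sin(2 * np.pi * 0.5 * t)) * np.sin(2 * np.pi * 10 * t)

analytic = hilbert(x)                      # complex analytic signal z(t)
amplitude = np.abs(analytic)               # A(t): instantaneous envelope
phase = np.angle(analytic)                 # phi(t), in radians, in [-pi, pi]
# f(t) = (1/2pi) * d(phi)/dt, computed on the unwrapped phase
inst_freq = np.diff(np.unwrap(phase)) * fs / (2 * np.pi)
```

For this signal, `amplitude` recovers the slow 0.5 Hz drift and `inst_freq` sits at 10 Hz, sample by sample, with no windows and no averaging.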
Here's the "I had no idea" moment. The Hilbert transform doesn't add new information. It reorganizes the information that was already in your signal into a form where amplitude and phase are directly accessible. It's like the difference between having a pile of Lego bricks and having them assembled into a recognizable structure. Same bricks. Wildly different utility.
Why You Must Bandpass Filter First (This Is Not Optional)
There's a critical prerequisite that trips up nearly everyone who first tries the Hilbert transform on EEG data. You must bandpass filter the signal before applying the transform.
Here's why. Raw EEG is a mixture of oscillations at many different frequencies, all superimposed on each other. Delta, theta, alpha, beta, gamma, plus noise, plus muscle artifacts, plus 50/60 Hz line noise. The concept of "instantaneous phase" only makes physical sense when your signal has one dominant oscillation. If your signal contains alpha at 10 Hz and beta at 20 Hz simultaneously, what does "the phase" even mean? Phase of what?
The Hilbert transform doesn't know which oscillation you care about. Applied to raw, broadband EEG, it produces an amplitude envelope and phase that are a muddled combination of everything. The amplitude envelope will track the overall fluctuations of the mixed signal. The phase will jump erratically as different frequency components dominate from moment to moment.
The solution is simple: filter first, Hilbert second. If you want instantaneous alpha amplitude and phase, bandpass filter your signal to 8-13 Hz, then apply the Hilbert transform. If you want theta phase, filter to 4-8 Hz first. Each frequency band gets its own filter-then-Hilbert pipeline.
The choice of bandpass filter directly affects the quality of your Hilbert transform output. Use a zero-phase FIR filter (like scipy.signal.firwin applied with filtfilt) to avoid introducing phase distortions. IIR filters like Butterworth can work but may shift the phase of your signal, which defeats the purpose if you're trying to measure instantaneous phase precisely. For most EEG applications, a FIR filter with an order of 3x the sampling rate divided by the low cutoff frequency provides a good balance between frequency selectivity and temporal resolution.
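A minimal sketch of that filter-then-Hilbert pipeline, using the tap-count rule of thumb above. The synthetic "raw" signal and the specific parameter choices are illustrative assumptions, not a prescription.

```python
import numpy as np
from scipy.signal import firwin, filtfilt, hilbert

fs = 256
low, high = 8.0, 13.0                       # alpha band
numtaps = int(3 * fs / low) | 1             # ~3x fs / low cutoff, forced odd
b = firwin(numtaps, [low, high], pass_zero=False, fs=fs)

# Synthetic "raw" EEG: alpha plus a beta contaminant plus noise
rng = np.random.default_rng(0)
t = np.arange(0, 4, 1 / fs)
raw = (np.sin(2 * np.pi * 10 * t)           # 10 Hz alpha
       + 0.8 * np.sin(2 * np.pi * 20 * t)   # 20 Hz beta contaminant
       + 0.3 * rng.standard_normal(t.size)) # broadband noise

alpha = filtfilt(b, [1.0], raw)             # zero-phase bandpass
phase = np.angle(hilbert(alpha))            # instantaneous alpha phase
```

Because `filtfilt` runs the FIR filter forward and backward, the filtered signal's phase is not shifted, which is exactly what you need before handing it to the Hilbert transform.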
This filter-then-Hilbert pipeline is so standard that it has a name in the EEG literature: the analytic signal approach to narrowband amplitude and phase extraction. Virtually every paper that discusses phase-amplitude coupling, phase-locking value, or phase-based BCI uses this exact method.
Three Applications That Depend on Instantaneous Phase
Now that you understand what the Hilbert transform does, let's look at why anyone should care. The three most important applications in modern EEG analysis all depend on having access to instantaneous amplitude and phase. None of them are possible with FFT alone.
Phase-Amplitude Coupling: The Brain's Cross-Frequency Conversation
This is the big one. Phase-amplitude coupling (PAC) is a phenomenon where the amplitude of a fast oscillation is modulated by the phase of a slow oscillation. The classic example: gamma amplitude (30-100 Hz) locked to theta phase (4-8 Hz) in the hippocampus during memory encoding.
What does that actually look like? Imagine theta as a slow ocean swell, rising and falling 6 times per second. Now imagine that bursts of fast gamma activity ride on top of that swell, but only at specific points. Gamma fires strongest when the theta wave is rising through a particular phase angle, and it goes quiet during other phases. The fast rhythm is nested inside the slow rhythm.
This isn't a curiosity. It appears to be one of the fundamental mechanisms the brain uses to organize information. The prevailing theory is that the slow theta rhythm provides a temporal framework, dividing time into discrete windows, and gamma bursts within each window carry distinct packets of information. Different memories, different features of a scene, different items being held in working memory. Each one assigned to a different phase of the theta cycle.
To detect PAC, you need the instantaneous phase of the slow oscillation and the instantaneous amplitude of the fast oscillation. Both extracted via the Hilbert transform. You then test whether the amplitude values are non-uniformly distributed with respect to the phase values. The most common metric is the modulation index (Tort et al., 2010), which uses Kullback-Leibler divergence to quantify how strongly amplitude is coupled to phase.
| PAC Metric | What It Measures | Strengths | Limitations |
|---|---|---|---|
| Modulation Index (MI) | KL divergence of amplitude distribution across phase bins | Statistically principled, widely used | Sensitive to bin count, needs long data segments |
| Mean Vector Length (MVL) | Average of amplitude-weighted phase vectors | Intuitive geometric interpretation | Can be biased by amplitude outliers |
| Phase-Locking Value (PLV) for PAC | Consistency of phase difference between slow phase and fast amplitude envelope phase | Less sensitive to amplitude magnitude | Harder to interpret biologically |
| General Linear Model (GLM) | Regression of amplitude on phase using sinusoidal predictors | Handles covariates, flexible | Assumes linear relationship |
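As a sketch, here is the modulation index from the table computed on synthetic data with built-in coupling. The 6 Hz theta / 40 Hz gamma construction is an assumption for illustration; real data would be bandpass filtered first, and the function name is ours.

```python
import numpy as np
from scipy.signal import hilbert

def modulation_index(slow_phase, fast_amp, n_bins=18):
    """Tort et al. (2010) MI: KL divergence of the phase-binned
    amplitude distribution from uniform, normalized by log(n_bins)."""
    edges = np.linspace(-np.pi, np.pi, n_bins + 1)
    idx = np.digitize(slow_phase, edges) - 1
    mean_amp = np.array([fast_amp[idx == k].mean() for k in range(n_bins)])
    p = mean_amp / mean_amp.sum()           # amplitude distribution over phase bins
    return np.sum(p * np.log(p * n_bins)) / np.log(n_bins)

fs = 256
t = np.arange(0, 10, 1 / fs)
theta = np.sin(2 * np.pi * 6 * t)
# Gamma whose amplitude waxes and wanes with the theta cycle
gamma = (1 - np.cos(2 * np.pi * 6 * t)) * np.sin(2 * np.pi * 40 * t)

slow_phase = np.angle(hilbert(theta))       # phase of the slow oscillation
fast_amp = np.abs(hilbert(gamma))           # envelope of the fast oscillation
mi = modulation_index(slow_phase, fast_amp)
```

With coupling this strong, `mi` lands around 0.1; feeding in a flat amplitude instead returns essentially 0, since the amplitude distribution across phase bins is then uniform.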
Phase-Based Brain-Computer Interfaces: Timing Is Everything
The second major application is using real-time phase information to build smarter BCIs.
Here's the insight that makes this powerful. Your brain's response to a stimulus depends on the phase of its ongoing oscillations at the moment the stimulus arrives. This isn't speculation. It's been demonstrated repeatedly since the early 2000s. A visual stimulus delivered at the peak of an occipital alpha oscillation is perceived differently than the same stimulus delivered at the trough. The perceptual threshold literally shifts depending on oscillatory phase.

For BCI designers, this opens up something remarkable: phase-triggered stimulation. Instead of delivering neurofeedback, audio, or visual stimuli at random or fixed intervals, you monitor the real-time phase of a target oscillation and trigger your stimulus at the optimal phase for the desired effect.
Researchers have already used this approach to enhance memory consolidation by delivering auditory tones synchronized to the up-state of slow oscillations during sleep (Ngo et al., 2013). Others have used real-time alpha phase tracking to optimize the timing of TMS pulses for maximal cortical excitability. The principle is the same everywhere: the brain has natural rhythmic windows of high and low excitability, and if you can track those windows in real time, you can deliver your intervention at exactly the right moment.
The computational pipeline for this is straightforward. Stream raw EEG, bandpass filter to your target band, apply the Hilbert transform to the most recent buffer of samples, extract the current phase, and trigger your stimulus when the phase crosses a threshold. The entire pipeline needs to run with low latency, typically under 20 milliseconds to stay within a useful phase window at alpha frequencies.
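The core of that loop, sketched for a single buffer. Streaming I/O, buffer management, and the stimulus call are stubbed out, and the edge margin and tolerance values are assumptions: near the buffer's trailing edge the Hilbert phase is unreliable, so this sketch reads slightly back from the end, whereas a real closed-loop system would forward-predict the phase across that lag.

```python
import numpy as np
from scipy.signal import firwin, filtfilt, hilbert

fs = 256
b = firwin(129, [8.0, 13.0], pass_zero=False, fs=fs)   # alpha bandpass

def estimate_phase(buffer, margin=0.1):
    """Phase near the end of a running buffer. The final samples are
    edge-contaminated, so read back ~margin seconds from the edge."""
    alpha = filtfilt(b, [1.0], buffer)      # zero-phase filtering
    phase = np.angle(hilbert(alpha))
    return phase[-int(margin * fs)]

def should_trigger(buffer, target_phase=0.0, tol=0.3):
    """True when the estimated phase is within tol radians of target."""
    delta = estimate_phase(buffer) - target_phase
    return bool(abs(np.angle(np.exp(1j * delta))) < tol)
```

Wrapping the phase difference through `np.angle(np.exp(1j * delta))` keeps the comparison circular, so a target near pi doesn't break at the -pi/+pi boundary.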
Functional Connectivity: Are Two Brain Regions Talking?
The third application is measuring how brain regions communicate. Specifically, whether two regions are oscillating in sync.
There are several ways to quantify this, and most of them depend on instantaneous phase. The most popular is the phase-locking value (PLV), introduced by Lachaux et al. in 1999. PLV measures how consistent the phase relationship between two signals is over time. If electrode A and electrode B always have a 45-degree phase difference at 10 Hz, their PLV is close to 1 (perfect phase-locking). If the phase difference wanders randomly, PLV is close to 0.
PLV is computed as follows: extract the instantaneous phase of the narrowband signal from each electrode (using the Hilbert transform), compute the phase difference at each time point, and then measure how concentrated those phase differences are. Mathematically, it's the magnitude of the mean of the unit-length complex vectors formed from the phase differences.
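That description translates almost directly into code. Here is a sketch with synthetic signals standing in for two electrodes; the 45-degree example mirrors the text, and the drifting third signal shows the uncoupled case.

```python
import numpy as np
from scipy.signal import hilbert

def plv(x, y):
    """Phase-locking value: magnitude of the mean unit-length complex
    vector of the sample-by-sample phase differences."""
    dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return float(np.abs(np.mean(np.exp(1j * dphi))))

fs = 256
t = np.arange(0, 4, 1 / fs)
a = np.sin(2 * np.pi * 10 * t)                 # "electrode A"
b = np.sin(2 * np.pi * 10 * t + np.pi / 4)     # "electrode B": constant 45-degree lag
c = np.sin(2 * np.pi * 11 * t)                 # off-frequency: phase difference drifts
```

`plv(a, b)` comes out at 1.0, because the 45-degree phase difference never changes; `plv(a, c)` collapses toward 0, because the 1 Hz frequency offset makes the phase difference sweep uniformly around the circle.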
The closely related weighted Phase Lag Index (wPLI) adds robustness by ignoring phase differences near 0 and pi, which are most susceptible to volume conduction artifacts (the tendency for a single neural source to appear at multiple electrodes, creating spurious connectivity).
With 8 channels like the Neurosity Crown provides, you get 28 unique electrode pairs. That's 28 connectivity measurements per frequency band, covering frontal-parietal, frontal-occipital, left-right hemispheric, and other functionally meaningful connections. Enough to build a meaningful picture of how your brain's networks are coordinating.
| Connectivity Metric | Requires Hilbert | Handles Volume Conduction | Computational Cost |
|---|---|---|---|
| Coherence | No (FFT-based) | No | Low |
| Phase-Locking Value (PLV) | Yes | No | Low |
| Imaginary Coherence | No (FFT-based) | Yes | Low |
| Weighted Phase Lag Index (wPLI) | Yes | Yes | Moderate |
| Phase Transfer Entropy | Yes | Yes (directional) | High |
Implementing the Hilbert Transform: From Theory to Code
Let's make this concrete. The Hilbert transform is implemented in every major scientific computing library. In Python, SciPy's scipy.signal.hilbert function does the heavy lifting. Despite the function name, it actually returns the analytic signal, not the Hilbert transform itself. The instantaneous amplitude and phase are one line of NumPy away.
The typical pipeline for EEG analysis looks like this:
- Load raw EEG data from your recording device or file
- Bandpass filter to isolate the frequency band of interest
- Apply scipy.signal.hilbert to get the analytic signal
- Extract amplitude using np.abs(analytic_signal)
- Extract phase using np.angle(analytic_signal)
- Compute your metric (PAC, PLV, phase-triggered events, etc.)
For real-time applications, you work with overlapping buffers instead of a full recording. Each time a new buffer of samples arrives from the EEG stream, you bandpass filter it, apply the Hilbert transform, and extract the phase of the most recent samples. The key consideration is that the Hilbert transform needs some "padding" at the edges of each buffer to avoid boundary artifacts. A common approach is to keep a running buffer that's longer than your analysis window, apply the transform to the full buffer, and only use the phase values from the center portion.
At 256Hz, each sample represents about 3.9 milliseconds. An alpha oscillation at 10 Hz has a full cycle of 100 ms. To trigger a stimulus within a specific quarter of the alpha cycle (a 25 ms window), your total pipeline latency, from EEG sample acquisition through filtering, Hilbert transform, phase extraction, and stimulus delivery, needs to be well under 25 ms.
The Hilbert transform itself is fast (under 1 ms for typical buffer sizes). The bottleneck is usually the bandpass filter (FIR filters add group delay proportional to filter order) and the hardware output latency. Plan your filter order and buffer strategy around your latency budget, not the other way around.
The Crown's Raw EEG: Your Starting Point for Hilbert Analysis
Everything described above starts with one requirement: access to raw, sample-level EEG data. You can't compute a Hilbert transform on pre-processed power band data or averaged metrics. You need the actual voltage time series, sampled fast enough to capture the oscillations you care about.
The Neurosity Crown streams raw EEG at 256Hz from 8 channels positioned at CP3, C3, F5, PO3, PO4, F6, C4, and CP4. That's four channels per hemisphere, spanning frontal, central, centroparietal, and parieto-occipital sites. The raw data is accessible through both the JavaScript SDK (for Node.js and browser applications) and the Python SDK.
For Hilbert-based analysis, this electrode montage is particularly useful because it provides inter-hemispheric pairs (F5/F6, C3/C4, CP3/CP4, PO3/PO4) for lateralized connectivity analysis, and frontal-posterior pairs (F5/PO3, F6/PO4) for measuring long-range phase synchrony between attention and sensory networks.
A 256Hz sampling rate is comfortably sufficient for Hilbert analysis of all standard EEG frequency bands. By the Nyquist theorem, you can accurately represent oscillations up to 128 Hz, which covers the full gamma range. For phase-amplitude coupling between theta (4-8 Hz) and gamma (30-80 Hz), 256Hz gives you roughly 3 to 8 samples per gamma cycle, which is adequate for phase estimation.
The on-device N3 chipset handles all signal acquisition and digitization, streaming the data over Bluetooth to your application. From there, you apply your bandpass filter and Hilbert transform in your language of choice. BrainFlow and Lab Streaming Layer (LSL) integrations give you additional pipeline flexibility if you prefer working within those ecosystems.
What Instantaneous Phase Reveals About Cognition
The reason neuroscientists care so much about instantaneous phase isn't just mathematical elegance. It's that phase carries information about cognition that amplitude alone cannot reveal.
Consider memory encoding. A series of studies starting with Rutishauser et al. (2010) has shown that items presented at certain phases of the hippocampal theta oscillation are more likely to be remembered later. The theta phase at the moment of encoding predicts subsequent memory performance. Not theta power, not alpha suppression, not any spectral metric. The phase.
Or consider perception. Mathewson et al. (2009) demonstrated that the detection of faint visual stimuli depends on the phase of posterior alpha at the moment the stimulus appears. Targets presented at the trough of alpha (when cortical excitability is highest) are detected more often than targets presented at the peak. This means your brain's ongoing oscillatory phase is literally gating what you can and cannot perceive. And the only way to study this gating is through the Hilbert transform.
Even brain-to-brain synchrony, measured in hyperscanning studies where two people interact while both wearing EEG, relies on Hilbert-derived phase metrics. When two people are engaged in conversation, their theta and alpha phases tend to synchronize. The degree of inter-brain phase coupling predicts mutual understanding and rapport better than any amplitude-based metric.
The Hilbert transform isn't just a signal processing technique. It's a lens that reveals how oscillatory timing coordinates the brain's computational architecture.
Common Pitfalls (And How to Avoid Them)
The Hilbert transform is deceptively simple to compute but easy to misapply. Here are the mistakes that catch even experienced researchers.
Pitfall 1: Applying the Hilbert transform to broadband data. This is the most common error. If your signal contains multiple frequency components, the instantaneous amplitude and phase will be a meaningless mixture. Always bandpass filter first. Always.
Pitfall 2: Using too narrow a filter bandwidth. On the other end of the spectrum, filtering too narrowly creates a different problem. A very narrow bandpass filter introduces temporal smearing, where the filtered signal's amplitude envelope changes slowly and artificially. The filter imposes its own dynamics on the envelope. A reasonable rule of thumb: the bandwidth of your filter should be at least 2 Hz wide for oscillations below 15 Hz, and at least 30-50% of the center frequency for higher bands.
Pitfall 3: Ignoring edge artifacts. The Hilbert transform, like any convolution-based operation, produces unreliable values at the beginning and end of a data segment. Discard at least one full cycle of the target frequency from each edge. For a 10 Hz alpha analysis, that means trimming at least 100 ms from each side.
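You can see this pitfall directly. In the sketch below, the segment length is deliberately chosen so the sine does not tile the analysis window, which is what produces the edge artifacts; the envelope of a pure sine should be exactly 1.0 everywhere.

```python
import numpy as np
from scipy.signal import hilbert

fs, f0 = 256, 10.0
n = 500                                   # ~1.95 s: a non-integer number of cycles
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f0 * t)
env = np.abs(hilbert(x))                  # true envelope is exactly 1.0

trim = int(fs / f0)                       # one full cycle of the target frequency
interior_err = np.max(np.abs(env[trim:-trim] - 1))
edge_err = np.max(np.abs(np.r_[env[:trim], env[-trim:]] - 1))
```

The envelope error is concentrated at the segment boundaries and decays toward the interior, which is exactly why trimming at least one full cycle from each edge is the standard precaution.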
Pitfall 4: Confusing instantaneous frequency with spectral frequency. The instantaneous frequency derived from the Hilbert transform (the time derivative of the instantaneous phase) can fluctuate wildly and even become negative if the signal isn't properly narrowband. This doesn't mean the brain is oscillating at negative frequencies. It means your filter wasn't selective enough. Instantaneous frequency is a useful concept but a fragile measurement.
Pitfall 5: Not accounting for volume conduction in connectivity. When computing PLV between nearby electrodes, volume conduction can create artificially high phase-locking at zero lag. Use metrics like wPLI or imaginary coherence that are insensitive to zero-lag coupling, or verify that your significant connectivity occurs at a non-zero phase lag.
Where This Is All Going
Phase-based neuroscience is still young, and the implications of what we're learning are stacking up fast.
Closed-loop neurostimulation, where interventions are timed to specific oscillatory phases in real time, is moving from research labs into clinical trials. If phase-triggered stimulation can enhance memory consolidation (it can), improve motor learning (early results say yes), and optimize the timing of therapeutic brain stimulation (multiple groups are working on this), then the ability to track instantaneous phase in real time becomes not just a research curiosity but a clinical tool.
And it doesn't require a hospital. The entire computational pipeline, bandpass filtering plus Hilbert transform plus phase extraction, runs on a laptop. The EEG acquisition runs on a device you can wear while working at a desk. The gap between "neuroscience research technique" and "something a developer can build into an application" has never been smaller.
What would you build if you could see your brain's oscillatory phase in real time? A meditation app that delivers tones synchronized to your alpha rhythm? A study tool that presents flashcards at the theta phase most conducive to memory encoding? A neurofeedback system that doesn't just reward power in a frequency band but rewards consistent phase-coupling between brain regions?
These aren't hypothetical. The math works. The hardware exists. The raw data streams at 256 samples per second, every second, from eight channels spanning both hemispheres. The Hilbert transform turns that stream into a real-time readout of amplitude and phase across every frequency band that matters.
Your brain is always oscillating. Now you can track exactly where in the cycle it is.

