When Brain Regions Oscillate in Lockstep
Your Brain Has a Timing Problem, and It Solved It Brilliantly
Here's something that should bother you if you think about it long enough.
Your visual cortex processes what you see. Your auditory cortex processes what you hear. Your prefrontal cortex decides what to pay attention to. Your hippocampus files everything into memory. These regions are spread across your skull, connected by long axonal fibers that introduce real transmission delays. A signal traveling from your occipital lobe to your frontal cortex takes somewhere around 50 to 80 milliseconds to arrive.
And yet, when you watch someone pluck a guitar string, you perceive the sight and sound as a single unified event happening at a single moment. Your brain stitches together information from scattered regions, processed at different speeds, arriving at different times, into one smooth experience.
How?
For decades, this was one of neuroscience's most persistent puzzles. Researchers called it the "binding problem," and the leading solutions were all frustratingly vague. There had to be some mechanism that linked distributed brain activity into a coherent whole. Something that told your brain, "These signals from these different regions belong together. Process them as one thing."
That mechanism, it turns out, is timing. Not the content of neural signals, but precisely when they fire relative to each other. Two brain regions that oscillate with a consistent timing relationship are, in a very real sense, coupled. Linked. Talking to each other. And the measurement that captures this relationship has a name: phase synchrony.
What Phase Actually Means (And Why It's the Key to Everything)
Before phase synchrony makes sense, you need to understand what "phase" means for a brain oscillation. It's simpler than it sounds.
Every oscillation is a repeating cycle. Up, down, up, down. Like a pendulum, or a sine wave, or the second hand on a clock. At any given instant, that oscillation is somewhere in its cycle. It might be at the peak, the trough, halfway up, or anywhere in between. The phase is just a number that tells you where in the cycle the oscillation is right now.
Scientists measure phase in degrees (0 to 360) or radians (0 to 2π). Take 0 degrees as the start of a sine wave's cycle, the upward zero crossing. Then 90 degrees is the peak, 180 degrees is the downward zero crossing, and 270 degrees is the trough, the exact opposite point of the peak.
Now imagine you're recording EEG from two different electrodes, one over the left frontal lobe and one over the right parietal lobe. Both regions are oscillating at 10 Hz (alpha rhythm). At each moment in time, each signal has a phase. You can compute the difference between those two phases.
Here's where it gets interesting.
If that phase difference stays constant over time, say, the frontal signal is always exactly 45 degrees ahead of the parietal signal, then the two regions are phase-synchronized. They're oscillating at the same frequency with a locked timing relationship. It's like two pendulums swinging at the same speed, with one always slightly ahead of the other. The offset is fixed. Predictable. Stable.
If, on the other hand, the phase difference wanders randomly from moment to moment, sometimes the frontal signal leads, sometimes it lags, sometimes they're aligned and sometimes they're opposite, then the regions are not synchronized. They happen to be oscillating at the same frequency, but there's no stable timing relationship between them. Two clocks ticking at the same speed but drifting in and out of alignment.
The critical insight: phase synchrony captures something that power analysis completely misses. Two regions can have identical amounts of alpha power, identical frequency peaks, identical amplitudes, and still have zero phase synchrony. They look the same on a power spectrum. But the timing analysis reveals they're operating independently. They're two musicians playing the same tempo who never once look at each other.
Measuring Phase Synchrony: PLV, PLI, and the Volume Conduction Problem
So how do you actually quantify phase synchrony from EEG data? Over the past 25 years, researchers have developed several metrics, each solving a different problem. The two most important are the phase-locking value (PLV) and the phase lag index (PLI).
Phase-Locking Value: The Straightforward Approach
The phase-locking value, introduced by Lachaux and colleagues in 1999, is the most intuitive measure. Here's the logic:
- Record EEG from two electrodes
- Filter both signals to the frequency band you care about (say, 8-13 Hz for alpha)
- Use a mathematical technique called the Hilbert transform to extract the instantaneous phase of each signal at every time point
- Compute the phase difference between the two signals at every time point
- Ask: how consistent is that phase difference?
If the phase difference is the same (or close to the same) at every time point, PLV is close to 1. Perfect synchrony. If the phase difference is all over the place, PLV is close to 0. No synchrony.
Mathematically, PLV is the length of the average vector when you plot each phase difference as a unit vector on a circle. If all vectors point roughly the same direction (consistent phase difference), the average vector is long. If they scatter in all directions (random phase difference), the average vector shrinks toward zero.
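That mean-vector picture translates almost directly into code. A minimal sketch in Python (assuming two signals already filtered to the band of interest):

```python
import numpy as np
from scipy.signal import hilbert

def plv(x, y):
    """Phase-locking value between two narrowband signals (0 to 1)."""
    # Instantaneous phase of each signal via the analytic (Hilbert) signal
    phase_x = np.angle(hilbert(x))
    phase_y = np.angle(hilbert(y))
    # Each phase difference becomes a unit vector on the circle;
    # PLV is the length of their average
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))
```

Two 10 Hz sinusoids with a fixed 45-degree offset score near 1; let the offset drift and the vectors scatter around the circle, collapsing the average toward 0.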
| Metric | Range | What It Measures | Strengths | Weaknesses |
|---|---|---|---|---|
| PLV (Phase-Locking Value) | 0 to 1 | Consistency of phase difference over time | Intuitive, sensitive, widely used | Susceptible to volume conduction artifacts |
| PLI (Phase Lag Index) | 0 to 1 | Asymmetry of phase difference distribution | Robust to volume conduction | Less sensitive, may miss genuine synchrony near zero lag |
| wPLI (Weighted PLI) | 0 to 1 | Weighted asymmetry of phase difference | More sensitive than PLI while still robust | More complex to implement |
| Imaginary Coherence | -1 to 1 | Imaginary part of coherency | Ignores zero-lag interactions | Can be negative, harder to interpret |
| dwPLI (Debiased wPLI) | 0 to 1 | Bias-corrected weighted PLI | Minimizes sample size bias | Computationally expensive for large datasets |
PLV is elegant and powerful. But it has a flaw that took the field years to fully reckon with.
The Volume Conduction Problem: When Your Skull Lies to You
Here's the issue. Your skull doesn't just sit passively between your brain and the EEG electrodes. It acts as a conductor. Electrical signals produced by a single neural source spread through the skull, the cerebrospinal fluid, and the scalp, arriving at multiple electrodes simultaneously. This is called volume conduction.
And it's a problem for phase synchrony because it creates the illusion of synchronization where none exists.
Think about it. If one neural source in the center of your brain radiates outward, its signal arrives at electrode A and electrode B at the same time. When you compute the phase difference, it's zero. Perfectly locked. PLV is sky-high. But this doesn't reflect two regions communicating. It reflects one region being picked up by two sensors.
This isn't a minor technical nuisance. In some EEG studies, volume conduction artifacts have accounted for a significant proportion of reported "functional connectivity." You think you're measuring a connection between the frontal and parietal lobes, but you're actually measuring a deep source that both electrodes happen to be close to.
Phase Lag Index: The Fix
This is where the phase lag index comes in. Developed by Stam, Nolte, and Daffertshofer in 2007, PLI was designed specifically to solve the volume conduction problem. Its logic is clever:
Volume conduction produces signals that arrive at two electrodes with either a zero phase difference or a 180-degree phase difference (when the electrodes are on opposite sides of the source). So PLI ignores those cases entirely. It only counts phase differences that are consistently asymmetric: consistently positive or consistently negative, never hovering right around zero or 180 degrees.
If the phase difference between two signals is always slightly positive (signal A consistently leads signal B by some non-zero amount), that asymmetry can't be explained by volume conduction. It implies a genuine lagged connection, one region truly sending information to another with a real transmission delay.
PLI sacrifices some sensitivity for this robustness. It will miss real synchrony that happens to have near-zero lag (which can occur between very close regions). But for long-range connectivity studies, where volume conduction is the biggest confound, PLI has become the gold standard.
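The sign-asymmetry idea is also only a few lines — a minimal sketch, again assuming two pre-filtered signals:

```python
import numpy as np
from scipy.signal import hilbert

def pli(x, y):
    """Phase lag index: consistency of the *sign* of the phase difference."""
    diff = np.angle(hilbert(x)) - np.angle(hilbert(y))
    # Wrap into (-pi, pi] so the sign distinguishes "leads" from "lags"
    wrapped = np.angle(np.exp(1j * diff))
    # sign() maps zero-lag samples to 0; symmetric scatter cancels out
    return np.abs(np.mean(np.sign(wrapped)))
```

A signal compared against itself (pure volume conduction, zero lag) scores 0, while a consistent non-zero lag scores near 1.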
A 2011 study by Vinck and colleagues demonstrated that roughly 40% of what earlier EEG studies reported as "long-range functional connectivity" could be explained by volume conduction alone. The introduction of PLI and similar metrics didn't just improve measurement accuracy. It invalidated a substantial chunk of prior connectivity research. Science occasionally has to audit itself, and the shift from PLV to PLI was one of those moments where the field realized it had been partly measuring artifacts, not brains.
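The weighted variant from the Vinck line of work down-weights phase differences near zero rather than discarding them outright. A sketch using the imaginary part of the per-sample cross-spectrum:

```python
import numpy as np
from scipy.signal import hilbert

def wpli(x, y):
    """Weighted phase lag index: near-zero-lag samples get near-zero weight."""
    cross = hilbert(x) * np.conj(hilbert(y))  # complex cross-spectrum per sample
    im = np.imag(cross)                        # zero at 0- and 180-degree lags
    denom = np.mean(np.abs(im))
    return np.abs(np.mean(im)) / denom if denom > 0 else 0.0
```

Because each sample is weighted by the magnitude of that imaginary part, samples sitting near zero lag, the ones volume conduction produces, contribute almost nothing, while genuinely lagged samples dominate.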
Synchrony and Desynchrony: Both Are Signals
There's a natural temptation to assume that more synchrony is always better. Two brain regions locked in perfect phase alignment must be good, right? Desynchrony must be dysfunction?
Not even close. The brain uses both synchrony and desynchrony as active computational tools, and understanding when each is appropriate is one of the most nuanced questions in neuroscience.
When Synchrony Is the Signal
Phase synchrony increases when two brain regions need to share information for a specific task. Some of the strongest findings:
Attention. When you focus on a visual stimulus, phase synchrony between frontal control regions and visual cortex increases in the alpha and gamma bands. Your prefrontal cortex is essentially "tuning in" to the visual areas processing the thing you're looking at. Fries (2005) called this "communication through coherence": the idea that two regions can only exchange information effectively when their oscillations are aligned, so that input from one region arrives at the other during its receptive phase. If the timing is off, the signal gets dampened. If the timing is right, it gets amplified.
Working memory. Holding items in working memory increases theta-band phase synchrony between the hippocampus and prefrontal cortex. This frontal-hippocampal theta coupling appears to be the mechanism by which executive control systems access and maintain memory representations. The more items you hold in working memory, the stronger this synchrony becomes, until you hit capacity.
Conscious perception. In binocular rivalry experiments (where each eye sees a different image and perception alternates between them), the moment of perceptual switch is preceded by a burst of gamma-band phase synchrony across widespread cortical regions. The winning percept, the image that enters awareness, is the one whose neural representation achieves large-scale phase coupling first.
When Desynchrony Is the Signal
Equally fascinating: sometimes the brain deliberately breaks synchrony to do its job.
Motor execution. Before you move your hand, the sensorimotor cortex shows a dramatic desynchronization in the beta band. This is called event-related desynchronization (ERD), and it reflects the local cortical networks "unlocking" from their resting state to generate a movement command. After the movement, beta synchrony returns, a phenomenon called event-related synchronization (ERS) or "beta rebound." The brain breaks the lock to act, then restores it when it's done.
Cognitive flexibility. Excessive phase synchrony across the brain can actually be pathological. In certain forms of epilepsy, neurons across large cortical areas become hypersynchronized, oscillating in lockstep when they shouldn't be. The result is a seizure. Too much synchrony, paradoxically, is a loss of information processing capacity. The brain needs some regions to be independent so they can process different information in parallel.
Creative insight. Some research suggests that the moment before a creative breakthrough involves a transient decrease in long-range phase synchrony, as if the brain is loosening its usual associative patterns to allow novel combinations to emerge, followed immediately by a surge of synchrony as the new insight crystallizes.
The takeaway: your brain is constantly adjusting the synchrony between its regions like a mixing board, strengthening connections that serve the current task and weakening ones that would interfere. The pattern of synchrony and desynchrony across the whole brain at any given moment is, in a real sense, a snapshot of what that brain is doing.

Phase Synchrony and the Big Questions
Attention: How Your Brain Selects a Signal From the Noise
Pascal Fries's "communication through coherence" theory, published in 2005 and refined over the following decade, has become one of the most influential frameworks in cognitive neuroscience. The core idea is deceptively simple: neural communication depends on phase alignment.
Here's the intuition. A receiving neuron has windows of excitability that open and close rhythmically, in sync with its local oscillation. When input arrives during the excitable window (the right phase), it has a strong effect. When input arrives during the refractory window (the wrong phase), it's ignored.
So if you're a brain region trying to send information to another region, you need your output to arrive during the receiver's excitable phase. And the way to achieve that is to oscillate at the same frequency, with the right phase offset. Phase synchrony isn't just a marker of communication. It's the mechanism.
This explains something that had puzzled attention researchers for years: how the brain "selects" one input stream over another without physically disconnecting anything. The answer is phase synchrony. When you attend to a visual stimulus, the visual area processing that stimulus synchronizes with frontal attention networks. Other visual areas, processing the stuff you're ignoring, desynchronize. Nothing is disconnected. The unattended information still reaches the frontal cortex. But because it arrives at the wrong phase, its impact is blunted.
Selective attention, in this view, is largely a synchronization operation.
Memory: The Phase Code
Working memory has a capacity limit. Most people can hold about four to seven items in mind simultaneously. For a long time, nobody could explain why that number and not some other number.
Phase synchrony offered a breathtaking answer.
Multiple independent studies have shown that items in working memory are represented by bursts of gamma oscillations that are nested within the slower theta rhythm. Each item gets its own gamma burst, and these bursts are separated in time by the phase of the theta cycle. Item one fires at the peak of theta, item two fires on the downslope, item three fires at the trough, and so on.
The limit? You can only fit so many gamma bursts into one theta cycle before they start overlapping. A theta cycle at 6 Hz lasts about 167 milliseconds. A gamma burst at 40 Hz lasts about 25 milliseconds. That gives you room for roughly six or seven non-overlapping gamma packets per theta cycle. Sound familiar?
This "theta-gamma phase-amplitude coupling" isn't just a correlation. Disrupting it experimentally (using transcranial stimulation to shift theta phase) impairs working memory performance. Enhancing it improves performance. The phase relationship between theta and gamma literally is the memory code.
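This kind of coupling can be quantified with a mean-vector-length measure (in the spirit of Canolty and colleagues' work): take the gamma envelope, attach it to the theta phase, and ask whether the amplitude-weighted phases cluster. A sketch, assuming the two signals are already filtered to their respective bands:

```python
import numpy as np
from scipy.signal import hilbert

def pac_mvl(theta, gamma):
    """Phase-amplitude coupling: gamma amplitude clustered by theta phase."""
    phase = np.angle(hilbert(theta))  # where in the theta cycle each sample falls
    amp = np.abs(hilbert(gamma))      # instantaneous gamma envelope
    # Amplitude-weighted mean vector, normalized so 0 means no coupling
    return np.abs(np.mean(amp * np.exp(1j * phase))) / np.mean(amp)
```

A gamma signal whose amplitude swells at one particular theta phase scores well above zero; a gamma signal with a flat envelope scores near zero no matter how strong it is.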
Consciousness: The Synchrony That Makes You You
Perhaps the most provocative application of phase synchrony research is in the study of consciousness itself.
Here's the pattern that keeps emerging: conscious awareness is associated with widespread, long-range phase synchrony, particularly in the gamma band. And when consciousness dims or disappears, that synchrony breaks down.
During general anesthesia, one of the first things to collapse is frontal-to-parietal gamma synchrony. The individual regions don't necessarily stop oscillating. They lose their timing relationship. It's as if the brain's long-distance phone lines go dead while the local circuits keep running.
In patients with disorders of consciousness, the degree of preserved phase synchrony is one of the strongest predictors of whether someone will recover. Patients in a vegetative state who still show some inter-regional phase coupling have a significantly better prognosis than those who don't. The neural machinery is still there. Some of the timing is still intact. And where timing survives, so does the possibility of awareness.
Giulio Tononi's Integrated Information Theory (IIT) formalizes this intuition. IIT proposes that consciousness arises from the integration of information across a network, and that integration requires precisely the kind of coordinated timing that phase synchrony measures. In this framework, phase synchrony isn't just correlated with consciousness. It's part of the computational substrate that generates it.
This remains a hypothesis, not established fact. But the consistency of the evidence across anesthesia, sleep, brain injury, psychedelics, and meditation is striking. Phase synchrony keeps showing up wherever the boundary of consciousness is.
The relationship between phase synchrony and cognition has been studied across dozens of domains. Here are some of the most replicated findings:
- Sustained attention: Increased theta and alpha phase synchrony between frontal and parietal regions, reflecting the "attention network" maintaining its grip
- Visual binding: Gamma-band synchrony between visual cortical areas increases when features are perceived as part of the same object
- Language comprehension: Theta-band synchrony between left temporal and frontal regions increases during sentence processing
- Emotional regulation: Alpha-band synchrony between prefrontal cortex and amygdala increases during successful emotion regulation
- Long-term memory encoding: Theta synchrony between hippocampus and neocortex predicts which experiences will be remembered
- Meditation: Experienced meditators show dramatically elevated gamma synchrony across widespread cortical regions, the same pattern first observed in Tibetan Buddhist monks by Lutz and colleagues in 2004
Measuring Phase Synchrony With Consumer EEG
For most of the history of phase synchrony research, computing these metrics required expensive laboratory equipment, extensive electrode arrays, and offline analysis in MATLAB or Python. The data went in one end and the results came out hours or days later. Real-time phase synchrony measurement was a luxury reserved for a handful of well-funded labs.
That constraint is disappearing. And this is where things get practical.
The Neurosity Crown provides 8 EEG channels at positions CP3, C3, F5, PO3, PO4, F6, C4, and CP4. These positions span frontal, central, centroparietal, and parieto-occipital regions across both hemispheres. With 8 channels, you get 28 unique electrode pairs, and each pair can yield a phase synchrony value at every frequency band.
This coverage matters because the most scientifically and cognitively relevant phase synchrony connections involve exactly these regions:
| Connection | Electrode Pair | Frequency Band | What It Reflects |
|---|---|---|---|
| Frontal-parietal | F5-CP3, F6-CP4 | Theta, Alpha, Beta | Attention network, executive control, working memory |
| Inter-hemispheric frontal | F5-F6 | Alpha, Beta | Emotional regulation, cognitive balance |
| Inter-hemispheric central | C3-C4 | Beta, Mu | Motor coordination, sensorimotor integration |
| Inter-hemispheric parietal | CP3-CP4, PO3-PO4 | Alpha, Gamma | Spatial processing, visual integration |
| Frontal-occipital | F5-PO3, F6-PO4 | Theta, Gamma | Top-down attention, visual working memory |
| Central-parietal | C3-CP3, C4-CP4 | Beta, SMR | Sensorimotor processing, motor planning |
The Crown's raw EEG output at 256 Hz gives you the temporal resolution needed for reliable phase extraction well into the gamma band: the Nyquist limit at that sampling rate is 128 Hz, and staying comfortably below it (up to roughly 100 Hz) keeps phase estimates stable. The on-device N3 chipset handles the heavy signal processing, but the raw data is also available through JavaScript and Python SDKs for custom phase synchrony computation.
For developers, computing PLV or PLI from the Crown's raw EEG stream is a well-defined signal processing pipeline: bandpass filter, Hilbert transform, phase extraction, and synchrony calculation. Libraries like MNE-Python and BrainFlow include built-in functions for most of these steps. The MCP integration even allows feeding synchrony metrics directly into AI tools like Claude for real-time analysis and interpretation.
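Putting that pipeline together for a multichannel stream: the sketch below assumes a hypothetical `channels x samples` NumPy array at 256 Hz and uses PLI as the synchrony measure, producing the full pairwise connectivity matrix (28 unique pairs for 8 channels). Function and variable names here are illustrative, not part of any SDK:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def synchrony_matrix(eeg, fs=256.0, band=(4.0, 8.0)):
    """Pairwise PLI matrix for an EEG array shaped (channels, samples)."""
    # 1. Bandpass filter every channel to the band of interest (theta by default)
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, eeg, axis=1)
    # 2. Hilbert transform, then instantaneous phase per channel
    phases = np.angle(hilbert(filtered, axis=1))
    # 3. PLI for every unique electrode pair
    n = eeg.shape[0]
    out = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            wrapped = np.angle(np.exp(1j * (phases[i] - phases[j])))
            out[i, j] = out[j, i] = np.abs(np.mean(np.sign(wrapped)))
    return out
```

Feed it eight channels of raw data and cell `[0][1]` is the theta-band synchrony between the first two electrodes; swap in a different `band` for alpha, beta, or gamma.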
The practical implication: what used to require a lab now requires a Crown on your head and some code. You can watch your own frontal-parietal theta synchrony change as you switch from mind-wandering to focused work. You can track how meditation practice shifts your gamma synchrony profile over weeks. You can build applications that respond not just to how much activity is happening in each brain region, but to how those regions are coordinating with each other.
That's a fundamentally different kind of brain data. Power tells you what's active. Phase synchrony tells you what's connected.
The Future of Phase Synchrony: What Comes Next
Phase synchrony research is accelerating on multiple fronts simultaneously.
Real-time neurofeedback based on synchrony. Most neurofeedback systems today train amplitude (make this frequency band bigger or smaller at this electrode). But synchrony-based neurofeedback, where the feedback signal is the phase coupling between two regions rather than the power at one region, is an emerging frontier. Early studies suggest that people can learn to voluntarily increase frontal-parietal theta synchrony, and that doing so improves working memory performance. The challenge has been computational: calculating synchrony in real time requires more processing power than simple amplitude measurement. But that barrier is falling as on-device processing improves.
Connectivity biomarkers for brain health. Phase synchrony patterns are turning out to be remarkably sensitive biomarkers. Disrupted gamma synchrony appears years before clinical symptoms in Alzheimer's disease. Altered frontal synchrony patterns distinguish different subtypes of depression that respond to different treatments. Traumatic brain injury often disrupts phase synchrony between specific regions even when traditional EEG metrics look normal. The prospect of using wearable EEG to track your connectivity health over time, catching subtle changes before they become symptomatic, is genuinely within reach.
Brain-computer interfaces driven by synchrony. Current consumer BCIs primarily classify mental states based on power spectral features. But phase synchrony features carry additional information that power alone misses. Research groups are showing that adding synchrony-based features to BCI classifiers significantly improves accuracy for tasks like attention detection, emotional state classification, and motor imagery. The next generation of adaptive brain-computer interfaces will almost certainly incorporate phase synchrony as a primary signal.
The Rhythms Between the Rhythms
Here's what stays with me about phase synchrony.
We've spent over a century measuring what happens inside individual brain regions. How active they are, what frequencies they produce, how their power changes during different tasks. And all of that has been valuable. But it's been like studying an orchestra by measuring how loud each instrument is playing.
Phase synchrony is the score. It's the timing relationships between the instruments that turn noise into music. Two violins playing the same notes at the same volume can sound like harmony or cacophony, depending entirely on whether they're playing in time.
Your brain runs on the same principle. The difference between focused attention and distracted rumination isn't necessarily about which regions are active or how strongly they're oscillating. It's about whether their oscillations are timed to work together. The difference between deep sleep and waking consciousness, between remembering and forgetting, between creative insight and cognitive fog, is, at a fundamental level, a difference in timing.
And for the first time, you don't need a research lab to measure it. You can put a device on your head, write some code, and see the timing relationships between your own brain regions in real time. You can watch your frontal-parietal network synchronize as you concentrate. You can observe your inter-hemispheric coupling shift as your mental state changes.
The brain has been playing this timing game since before you were born. Phase synchrony is how we finally learn to listen.

