
Your Brain Syncs With Other Brains

By AJ Keller, CEO at Neurosity  •  February 2026
When two people share attention, emotion, or conversation, their brainwaves literally synchronize. This interbrain EEG synchrony is measurable, predictable, and reveals something profound about how human connection works at the neural level.
For most of neuroscience's history, the brain was studied in isolation. One person, one scanner, one skull at a time. But a new wave of research is pointing EEG sensors at two or more people simultaneously, and finding that our brains don't just process the world independently. They lock into shared rhythms during conversation, eye contact, and collaborative tasks. The implications are staggering.
Explore the Crown
The brain-computer interface built for developers

Your Brain Doesn't End at Your Skull

Here's something neuroscience textbooks got wrong for a very long time: they treated the brain as a solo instrument.

Every experiment, every scan, every EEG recording followed the same basic template. One person. One skull. One set of electrodes. The implicit assumption was that understanding one brain in isolation would eventually tell us everything we needed to know about how brains work in the real world.

But brains almost never operate in isolation. Right now, as you read this, your brain is doing something it evolved to do over millions of years of social living. It's modeling other minds. It's predicting what other people will say, feel, and do. And when you're actually in the room with someone, talking, collaborating, arguing, or just sitting together in comfortable silence, your brain is doing something even stranger.

It's synchronizing with theirs.

Not metaphorically. Not in a vague, spiritual sense. Your neural oscillations, the actual electrical rhythms firing across your cortex, begin to align with the neural oscillations of the person you're interacting with. The alignment is measurable. It's frequency-specific. And the strength of that alignment predicts how well you communicate, how much you trust each other, and how effectively you work together.

This is interbrain EEG synchrony. And it might be one of the most important discoveries in social neuroscience.

One Brain Is Not Enough: The Birth of Hyperscanning

For most of the 20th century, studying the social brain meant studying one brain at a time and then guessing how two of them might interact. A researcher would show Person A a picture of a face and measure their amygdala response. Then they'd show Person B the same picture. Separately. In separate sessions. Sometimes on separate days.

The problem with this approach is the same problem you'd have if you tried to understand a tennis match by filming each player individually and then watching the tapes side by side. You'd miss everything that makes it a match. The timing. The rhythm. The way one player's movement triggers the other's response in real time.

In the early 2000s, a small group of neuroscientists decided to do something that sounds obvious in retrospect but was genuinely novel: they recorded from two brains at the same time.

The technique is called hyperscanning, and it changed the game. Instead of one EEG cap, you use two (or more). Instead of one person sitting alone in a quiet room, you have two people interacting. Talking, playing a game, making eye contact, or collaborating on a task. And instead of analyzing each brain's data independently, you compute the statistical relationship between the two brain signals.

What they found was remarkable. During certain types of social interaction, the brainwave patterns of two separate people become correlated in ways that can't be explained by coincidence or by shared exposure to the same external stimulus. Their brains start oscillating together. Not identically, not like two clocks ticking at the same rate, but in a coupled, dynamic way where changes in one brain's rhythm predict changes in the other's.

The signal isn't subtle. In some conditions, interbrain synchrony is strong enough to be the single best predictor of whether the people involved felt connected, understood each other, or performed well together.

Hasson's Discovery: The Brain That Mirrors the Story

The most influential research on brain-to-brain coupling came from Uri Hasson's lab at Princeton, and it started with a question that sounds almost too simple: what happens in your brain when you listen to someone tell a story?

Hasson used fMRI to scan a speaker telling a real, unrehearsed story while recording the audio. Then he played that recording to listeners while they were in the scanner. When he compared the brain activity of the speaker and listener, he found something that no one had predicted.

The listener's brain activity mirrored the speaker's. Not just in auditory cortex (you'd expect that, since they're both processing speech). The coupling extended to higher-order areas: the prefrontal cortex, the temporal-parietal junction, the insula. Regions involved in understanding meaning, tracking narrative structure, and modeling the speaker's intentions.

Here's the part that really stopped people in their tracks. The degree of coupling predicted comprehension. Listeners who showed stronger neural coupling with the speaker scored higher on tests of story comprehension afterward. And the very best listeners didn't just mirror the speaker with a delay. Their brains ran slightly ahead, anticipating what the speaker would say next.

Hasson called this neural coupling, and he published the landmark paper in the Proceedings of the National Academy of Sciences in 2010. The implication was profound: successful communication isn't just about sending and receiving information. It's about two brains falling into a shared neural pattern. When that coupling fails, communication breaks down. When it's strong, you get the feeling of truly being understood.

This was fMRI work, but EEG researchers quickly picked up the thread. And because EEG captures the millisecond-by-millisecond dynamics of neural oscillations, it revealed something fMRI couldn't: the synchrony happens at specific frequencies, and different frequencies carry different types of social information.

What Are the Frequency Bands of Social Connection?

When researchers started measuring interbrain EEG synchrony during live social interaction (not just replaying recordings, but actual face-to-face exchanges), a pattern emerged across dozens of studies.

Different frequency bands synchronize during different aspects of social engagement:

| Frequency Band | Range | Social Role |
| --- | --- | --- |
| Theta | 4-8 Hz | Shared memory encoding, joint problem-solving, turn-taking coordination |
| Alpha | 8-13 Hz | Shared attention, mutual awareness, coordinated inhibition of irrelevant input |
| Beta | 13-30 Hz | Motor coordination, synchronized movement, gestural communication |
| Gamma | 30-100 Hz | Shared perceptual binding, emotional resonance, high-level meaning integration |

The most consistently reported finding across interbrain synchrony studies is coupling in the alpha and theta bands during collaborative tasks and conversation. This makes sense if you think about what those frequencies do in a single brain. Alpha reflects attentional gating: which information gets processed and which gets filtered out. When two brains synchronize in alpha, they're essentially filtering the world in the same way at the same time. They're paying attention to the same things.

Theta coupling, meanwhile, tends to spike during moments of coordinated turn-taking, shared problem-solving, and joint memory formation. Think of it as the frequency of collaborative cognition. When two people's theta rhythms lock together, they're not just in the same room. They're in the same cognitive space.
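The band boundaries above translate directly into a band-power computation. Here is a minimal sketch (assuming NumPy and SciPy are available, and a single channel sampled at the Crown's 256 Hz rate) that estimates power in each band from a window of EEG using Welch's method:

```python
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

# Canonical EEG band edges in Hz, matching the table above.
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 100)}

def band_powers(signal, fs=256):
    """Return average power per band for a 1-D EEG signal."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)  # 2-second segments
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = trapezoid(psd[mask], freqs[mask])  # integrate PSD over the band
    return powers

# Synthetic check: a 10 Hz oscillation plus light noise should put
# most of its power squarely in the alpha band.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / 256)
eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
bp = band_powers(eeg)
print(max(bp, key=bp.get))  # prints "alpha"
```

In a hyperscanning analysis, the same decomposition runs on both participants' channels before any coupling metric is computed, so synchrony can be reported per band.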

What Synchrony Actually Looks Like in the Data

Interbrain synchrony is measured using statistical techniques that quantify the relationship between two EEG signals. The most common methods are cross-correlation (how similar the signals are over time), phase-locking value (PLV, whether the signals maintain a consistent phase relationship), and wavelet coherence (how correlated the signals are at specific frequencies over time). A PLV of 0 means no consistent phase relationship. A PLV approaching 1 means the two brains are oscillating in near-perfect lockstep at that frequency.
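As a concrete illustration of the phase-locking value, here is a minimal sketch (not any particular lab's pipeline; assumes NumPy and SciPy). Each signal is band-passed, the Hilbert transform extracts its instantaneous phase, and the PLV is the magnitude of the mean phase-difference vector:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def plv(x, y, fs=256, band=(8, 13)):
    """Phase-locking value between two EEG channels in a frequency band.

    Returns a value in [0, 1]: 0 means no consistent phase relationship,
    1 means the two signals are perfectly phase-locked.
    """
    # Band-pass both signals (default: alpha band) before extracting phase.
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    phase_x = np.angle(hilbert(filtfilt(b, a, x)))
    phase_y = np.angle(hilbert(filtfilt(b, a, y)))
    # Magnitude of the average unit vector of the phase differences.
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

# Two alpha-range oscillations with a constant phase offset are highly
# locked; an unrelated noise signal is not.
t = np.arange(0, 10, 1 / 256)
a_sig = np.sin(2 * np.pi * 10 * t)
b_sig = np.sin(2 * np.pi * 10 * t + 0.8)  # same rhythm, constant lag
noise = np.random.default_rng(1).standard_normal(t.size)
print(round(plv(a_sig, b_sig), 2), round(plv(a_sig, noise), 2))
# The locked pair scores near 1; the noise pair scores far lower.
```

Note that a PLV near 1 does not require the signals to look alike, only that their phase difference stays stable, which is exactly the "coupled, not identical" relationship hyperscanning studies report.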

Couples, Classrooms, and Operating Rooms

Once hyperscanning became established, researchers started asking the obvious question: does interbrain synchrony show up everywhere humans interact? The answer, increasingly, is yes. And the contexts where it appears tell us something important about what drives it.

Romantic Partners

Pavel Goldstein's 2018 study at the University of Colorado is one of the most striking. He recorded EEG from romantic couples sitting together. When one partner was in pain (from a controlled heat stimulus on their arm), the other partner held their hand. The result: interbrain synchrony in the alpha band increased significantly during hand-holding, and the strength of that synchrony correlated with the partner's self-reported empathy. More empathic partners showed stronger neural coupling. And here's the kicker: the pain actually decreased when synchrony was highest. The brain-to-brain coupling wasn't just a side effect of togetherness. It was analgesic.

Classrooms

Suzanne Dikker, a neuroscientist at NYU, brought portable EEG headsets into actual high school classrooms. Students wore them during regular lessons over an entire semester. She found that brain-to-brain synchrony between students (and between students and teachers) predicted both engagement and learning outcomes. The classes where students' brains were most synchronized weren't just the ones with the best teacher. They were the ones where students reported the highest social connection with each other. The brain synchrony captured something that surveys and test scores alone missed: the quality of the shared cognitive experience.

Surgical Teams

A 2021 study by Zheng and colleagues used EEG hyperscanning to study surgical teams during simulated operations. When surgeon and assistant were well-coordinated, their brains showed elevated theta and alpha synchrony. When coordination broke down, the synchrony dropped before the behavioral error was even visible. The brain coupling was a leading indicator of team performance.

Musicians

Anyone who's played music with other people has felt this. The moment when a band "locks in" and everyone seems to anticipate each other's moves without thinking. EEG studies of musical ensembles, particularly guitarists, pianists playing duets, and choir singers, consistently show elevated interbrain synchrony, especially in the theta and alpha ranges. The synchrony is higher between musicians who rate their interaction as more enjoyable and more "connected."

The pattern across all these contexts points to a common mechanism. Interbrain synchrony isn't caused by one specific thing. It's driven by the convergence of several factors that all reflect genuine social engagement.

What Actually Drives Two Brains to Sync Up?

Here's where it gets really interesting. Not all social interaction produces interbrain synchrony. Two people sitting in the same room but ignoring each other? No synchrony. Two people watching the same video but not interacting? Minimal synchrony. Two strangers having an awkward first conversation? Some synchrony, but less than friends having the same conversation.

Researchers have identified the key ingredients:

Shared attention. This is the strongest driver. When two people attend to the same thing at the same time, whether it's a conversation, a problem, or even a piece of music, their brains begin to process incoming information in similar temporal patterns. Shared attention creates shared neural timing.

Eye contact. Direct gaze between two people produces measurable increases in interbrain synchrony, particularly in frontal alpha. This finding, replicated across multiple labs, suggests that eye contact isn't just a social signal. It's a neural synchronization mechanism. Your brain literally couples more strongly with another brain when you look into that person's eyes.

Emotional resonance. When people share emotional responses to the same event, their brains synchronize more strongly than when they have divergent emotional reactions. This is likely related to mirror neuron systems and the brain's empathy circuitry. Feeling the same feeling at the same time creates neural coupling.

Cooperative goals. Working toward the same goal, rather than competing, increases interbrain synchrony. This has been shown in economic games, puzzle-solving tasks, and musical performance. Competition can actually decrease synchrony, even when both people are highly engaged.

Familiarity and social bond strength. Couples synchronize more than strangers. Close friends synchronize more than acquaintances. This suggests that interbrain synchrony isn't just about the task. It's about the relationship. Brains that have a history of interacting together develop more efficient coupling mechanisms.

Neurosity Crown
The Crown captures brainwave data at 256Hz across 8 channels. All processing happens on-device. Build with JavaScript or Python SDKs.
Explore the Crown

BCI-to-BCI: When Brain Synchrony Becomes Brain Communication

Everything so far has been about brains that synchronize naturally, through conversation, shared attention, and social bonding. But a different group of researchers asked a more provocative question: what if you closed the loop? What if you used the brainwave data from one person's EEG to directly influence another person's brain?

In 2014, a team led by Rajesh Rao at the University of Washington published a study that made international headlines. Two people sat in separate rooms on different parts of campus, connected only by the internet. Person A (the "sender") wore an EEG cap and looked at a screen showing a simple game. When they wanted to fire a cannon in the game, they imagined moving their right hand. The EEG detected this motor imagery signal, classified it using a brain-computer interface, and sent the command across campus to Person B (the "receiver"), whose hand was positioned near a keyboard with a transcranial magnetic stimulation (TMS) coil aimed at their motor cortex. When the command arrived, the TMS pulse fired, causing Person B's finger to involuntarily press the key.

Person A thought about moving their hand. Person B's hand moved. Across the internet. Without any voluntary action from Person B.

This was, by any reasonable definition, brain-to-brain communication.

The BrainNet Experiment

In 2019, the same University of Washington group took it further with BrainNet, a system connecting three people in a brain-to-brain network. Two "senders" could see a Tetris-like game and sent EEG-detected decisions (rotate or don't rotate) to a "receiver" who couldn't see the full game board. The receiver integrated the two senders' brain signals (delivered via TMS-evoked phosphenes, flashes of light perceived without actual light) and made a final decision. The three-person network achieved correct decisions about 81% of the time. It was slow, crude, and required bulky equipment. But it was a genuine multi-brain network making collaborative decisions through neural signals alone.

The technology behind these experiments is still primitive compared to natural neural coupling. The bandwidth is laughably low. Where natural interbrain synchrony involves the continuous coupling of complex oscillatory patterns across multiple frequency bands, current BCI-to-BCI systems transmit roughly one bit of information every few seconds. It's the difference between a high-speed fiber optic connection and two people shouting across a canyon.

But the principle is proven. EEG can read a neural intention from one brain. That intention can be transmitted digitally. And it can be delivered to another brain in a way that produces a measurable response. The gap between "primitive proof of concept" and "useful technology" is an engineering problem, not a physics problem.

The "I Had No Idea" Fact About Interbrain Synchrony

Here's the thing about interbrain synchrony research that most people don't know, and that genuinely surprised me when I first encountered it.

Interbrain synchrony often predicts social outcomes better than standard behavioral measures.

Think about what we normally use to measure social connection. Surveys. Questionnaires. Behavioral coding of facial expressions. Self-reports of empathy. These are all useful tools, but they're all indirect. They capture what people say they feel or what an outside observer thinks they see.

Interbrain synchrony captures something else entirely. It measures the degree to which two nervous systems are actually coupled at the level of neural oscillations. And in study after study, this neural measure outperforms behavioral measures at predicting things like: how well a team will perform on a collaborative task, how much a student will learn in a classroom, how empathic a partner actually is (not just how empathic they claim to be), and how well a therapist and client are connected during a session.

This suggests that our brains know something about social connection that our conscious self-reports don't capture. You can feel disconnected from someone while your brains are highly synchronized, or feel connected while your neural coupling is actually low. The brain-to-brain measure doesn't replace subjective experience, but it adds a layer of data that subjective reports alone can't provide.

Two Crowns, One Experiment

The bulk of interbrain synchrony research has been conducted in university labs with research-grade EEG systems, gel electrodes, and controlled experimental conditions. But the field is moving rapidly toward more naturalistic settings. Dikker's classroom studies used portable EEG headsets. Goldstein's pain study used consumer-friendly setups. The direction is clear: interbrain synchrony research needs to escape the lab if it's going to tell us anything about how brains actually synchronize in real life.

This is where consumer-grade EEG becomes genuinely important for the science.

The Neurosity Crown captures 8 channels of EEG at 256Hz, covering frontal, central, and parietal regions across both hemispheres. It uses dry electrodes (no gel, no prep), processes data on-device through the N3 chipset, and exposes raw EEG data through open SDKs in JavaScript and Python. It also integrates with BrainFlow and Lab Streaming Layer (LSL), the standard tools used by researchers for synchronized multi-device recording.

Put a Crown on two people. Stream their EEG data simultaneously through LSL with synchronized timestamps. Compute cross-correlation or phase-locking values between corresponding channels. You now have a real-time measure of interbrain synchrony that works outside a lab, requires no gel or conductive paste, and can be set up in minutes rather than hours.
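What that analysis step could look like, as a sketch (hypothetical function and variable names; assumes each device's samples arrive with LSL-style timestamps and that NumPy is available): resample both streams onto a shared clock, then correlate corresponding channels.

```python
import numpy as np

def interbrain_correlation(ts_a, data_a, ts_b, data_b, fs=256):
    """Zero-lag Pearson correlation per channel between two timestamped
    EEG streams, after resampling both onto a shared 256 Hz clock.

    ts_*   : 1-D arrays of LSL-style timestamps in seconds
    data_* : arrays of shape (n_samples, n_channels)
    """
    # Common time base covering the overlap of the two recordings.
    start = max(ts_a[0], ts_b[0])
    stop = min(ts_a[-1], ts_b[-1])
    common = np.arange(start, stop, 1 / fs)

    corrs = np.empty(data_a.shape[1])
    for ch in range(data_a.shape[1]):
        # Linear interpolation aligns each stream to the shared clock.
        a = np.interp(common, ts_a, data_a[:, ch])
        b = np.interp(common, ts_b, data_b[:, ch])
        corrs[ch] = np.corrcoef(a, b)[0, 1]
    return corrs

# Synthetic check: two 2-channel streams whose clocks differ by 4 ms.
# Channel 0 carries a shared 6 Hz (theta) rhythm; channel 1 is
# independent noise on each device.
rng = np.random.default_rng(2)
t_a = np.arange(0, 5, 1 / 256)
t_b = t_a + 0.004  # small clock offset between the two devices
stream_a = np.column_stack([np.sin(2 * np.pi * 6 * t_a) + 0.2 * rng.standard_normal(t_a.size),
                            rng.standard_normal(t_a.size)])
stream_b = np.column_stack([np.sin(2 * np.pi * 6 * t_b) + 0.2 * rng.standard_normal(t_b.size),
                            rng.standard_normal(t_b.size)])
corrs = interbrain_correlation(t_a, stream_a, t_b, stream_b)
```

A real pipeline would swap the synthetic arrays for chunks pulled from two LSL inlets and would likely use PLV or wavelet coherence rather than raw correlation, but the alignment step is the part LSL timestamps make possible.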

This matters because the biggest limitation of hyperscanning research right now isn't the analysis. It's the logistics. Getting two people into a lab, applying gel electrodes to both of them, running a controlled protocol, and doing it all within the same session is expensive, time-consuming, and fundamentally artificial. The interactions being studied are shaped by the weirdness of the laboratory setting itself.

A pair of Crowns changes the equation. You can measure interbrain synchrony during an actual conversation at a coffee shop. During a real team meeting. During a therapy session. During a couple's argument at their kitchen table. The science becomes ecological. And ecological data is where the real discoveries happen.

| Factor | Lab EEG Hyperscanning | Dual Crown Setup |
| --- | --- | --- |
| Setup time | 30-60 minutes (gel application) | Under 5 minutes (dry electrodes) |
| Environment | Controlled lab only | Any natural setting |
| Data access | Proprietary software, offline analysis | Open SDK, real-time streaming |
| Synchronization | Lab equipment with TTL triggers | LSL network timestamps |
| Participant comfort | Limited by gel, wires, immobility | Wireless, wearable, 228 grams |
| Cost per pair | $20,000-$100,000+ for research systems | Two Crown devices |
| Developer access | Typically none | JavaScript and Python SDKs |

What Comes Next: From Measurement to Feedback

The current state of interbrain synchrony research is mostly observational. Record from two brains. Compute the coupling. Analyze it after the fact. But the next frontier, and the one that's most exciting, is real-time interbrain neurofeedback.

Imagine two people in a conversation getting a subtle signal, a change in ambient lighting, a gentle tone, a visual cue, that indicates when their brains are highly synchronized and when they're drifting apart. Couples therapy sessions where the therapist can see, in real time, when partners are neurally attuned and when they've disconnected. Team training where group synchrony is displayed as a live metric, the way heart rate is displayed on a treadmill.

Early studies are already exploring this. Duan et al. (2021) showed that providing real-time feedback about interbrain synchrony to pairs of participants increased their cooperative behavior and self-reported connection. The synchrony wasn't just a passive readout. When people could see it and try to increase it, they actually did. And the interpersonal benefits were real.

This creates a feedback loop that didn't exist before: your brain state affects the other person's brain state, which feeds back to affect yours, and now both of you can see the coupling and intentionally modulate it. It's neurofeedback, but social. It's not just training your own brain. It's training the relationship between two brains.
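In code, the loop this describes could be as simple as smoothing a stream of per-window synchrony estimates into a cue both people can see. This is an illustrative sketch only; the class name, window length, and thresholds are hypothetical, not drawn from any published protocol:

```python
from collections import deque

class SynchronyFeedback:
    """Smooth a stream of per-window synchrony values (e.g. PLVs in
    [0, 1]) into a coarse cue shown to both participants.
    Window length and thresholds here are illustrative choices."""

    def __init__(self, window=10, low=0.3, high=0.6):
        self.history = deque(maxlen=window)
        self.low, self.high = low, high

    def update(self, sync_value):
        """Ingest the latest synchrony estimate and return a cue."""
        self.history.append(sync_value)
        smoothed = sum(self.history) / len(self.history)
        if smoothed >= self.high:
            return "in sync"
        if smoothed >= self.low:
            return "drifting"
        return "disconnected"

# Rising synchrony moves the cue from "disconnected" toward "in sync",
# with the smoothing window damping momentary spikes.
fb = SynchronyFeedback()
cues = [fb.update(v) for v in [0.1, 0.2, 0.5, 0.7, 0.8, 0.9]]
```

The smoothing matters: a cue that flickered with every analysis window would make participants chase noise rather than the relationship.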

The Quiet Implication

Here's what sits with me about this research.

For all of human history, social connection has been invisible. You could feel it, sometimes. You could observe it in behavior. But you couldn't measure the thing itself, the actual neural coupling between two people engaged in a shared moment.

Now you can. And what the measurements reveal is that connection isn't just a metaphor. When you feel "in sync" with someone, your brains are literally oscillating in synchrony. When you feel disconnected, the coupling has dropped. When a teacher loses the classroom, you can see it in the neural data before a single student raises their hand or checks their phone.

This doesn't reduce human connection to mere brainwaves. If anything, it deepens the mystery. Because the question isn't just "do brains synchronize?" We know they do. The question is: what is it about consciousness, about the experience of being a person with another person, that makes two separate nervous systems start beating in time?

We don't have the full answer yet. But we have the tools to start looking. And looking together, it turns out, is exactly how our brains were designed to work.

Frequently Asked Questions
What is interbrain EEG synchrony?
Interbrain EEG synchrony is the measurable alignment of brainwave oscillations between two or more people during social interaction. When people share attention, engage in conversation, or collaborate on a task, their brain rhythms in specific frequency bands (particularly alpha, theta, and gamma) begin to fluctuate in lockstep. This synchrony is detected by simultaneously recording EEG from multiple people and computing statistical coupling between their signals.
What causes brainwaves to synchronize between people?
Several factors drive interbrain synchrony: shared attention to the same stimulus, emotional resonance and empathy, turn-taking in conversation, cooperative goals, physical proximity, and eye contact. The common thread is joint engagement. When two people are genuinely attending to the same thing or to each other, their brains produce correlated oscillatory patterns. People with stronger social bonds, such as romantic partners or close teammates, tend to show higher synchrony.
Can you measure brain-to-brain synchrony with consumer EEG?
Yes. Interbrain synchrony research has been conducted with consumer-grade EEG devices. An 8-channel EEG headset like the Neurosity Crown, sampling at 256Hz, captures the frequency bands (theta, alpha, beta, gamma) where interbrain coupling is most commonly measured. By recording from two Crown devices simultaneously and computing cross-correlation or phase-locking values between corresponding channels, researchers and developers can quantify interbrain synchrony outside the lab.
Is brain-to-brain communication real?
Brain-to-brain synchrony during interaction is well-established science, documented in dozens of peer-reviewed studies. Brains genuinely align their oscillatory patterns during communication. Direct brain-to-brain information transfer (sending a thought from one brain to another) has also been demonstrated in proof-of-concept BCI experiments, where one person's EEG-detected intention was transmitted via the internet and delivered as a TMS pulse to another person's motor cortex. These experiments are early-stage but scientifically real.
What is neural coupling in conversation?
Neural coupling, described by Princeton neuroscientist Uri Hasson, is the phenomenon where a listener's brain activity begins to mirror the speaker's brain activity during successful communication. Using fMRI, Hasson showed that the listener's brain doesn't just respond to the speaker with a delay. In cases of high comprehension, the listener's brain activity actually anticipates the speaker's, running slightly ahead in time. This predictive coupling correlates with how well the listener understands the story being told.
What are the applications of interbrain synchrony research?
Interbrain synchrony has practical applications in education (measuring student-teacher engagement in real time), team performance (quantifying collaboration quality), couples therapy (tracking emotional attunement), music and performance arts (studying ensemble coordination), and BCI development (building brain-to-brain communication systems). It also has implications for understanding autism, social anxiety, and other conditions that affect social connection.
Copyright © 2026 Neurosity, Inc. All rights reserved.