EEG in 1924 vs. 2026
One Hundred and Two Years Ago, a Man Taped Electrodes to His Son's Head
The year is 1924. Calvin Coolidge is president. The first Winter Olympics are underway in Chamonix. Most homes in America don't have a telephone yet.
And in a university clinic in Jena, Germany, a psychiatrist named Hans Berger is doing something that nobody on Earth has ever done before. He's pressing silver foil electrodes against his teenage son Klaus's scalp, wiring them to a Siemens double-coil string galvanometer the size of a sewing machine, and recording, onto a slowly scrolling strip of photographic paper, the electrical activity of a living human brain.
The setup fills a table. The recording medium is literally paper. There is one channel. The galvanometer's quartz fiber is so delicate that someone walking too heavily in the hallway could ruin the reading. Berger has the curtains drawn because he doesn't want anyone to see what he's doing. He's been secretly pursuing this experiment for years, terrified that his colleagues will think he's a crank.
Now imagine this. It's 2026. You're sitting at your desk. You pick up a device that weighs 228 grams, less than a can of soup, and place it on your head like a pair of headphones. Eight dry electrodes settle against your scalp. Within seconds, a custom chipset is sampling your brain's electrical activity 256 times per second across all 8 channels, processing the data locally, and streaming it via Bluetooth to your laptop, where a JavaScript application you wrote yourself is reading your brainwaves in real-time and feeding them into an AI model.
Same phenomenon. Same electrical signals propagating through the same cerebrospinal fluid, skull, and scalp tissue. Same physics.
But everything else? Everything else is different.
The gap between those two moments is 102 years, and what happened in between is one of the most dramatic transformations in the history of scientific instrumentation. This is the story of how EEG went from a secret experiment in a drawn-curtain laboratory to something you can do in your living room while drinking coffee. And each step in that transformation tells you something important about where brain-computer interfaces are going next.
Start With What Berger Actually Had to Work With
To really appreciate how far EEG has come, you need to understand the constraints Berger was operating under. Not as abstract facts, but viscerally. Because the gap between 1924 and 2026 isn't just "technology got better." It's a series of specific bottlenecks that got removed, one by one, each removal unlocking possibilities that the previous generation couldn't have conceived of.
Berger's recording instrument was a string galvanometer. Here's how it worked: a thin quartz fiber was suspended between the poles of a magnet. Electrical current from the electrodes passed through the fiber, causing it to deflect. A beam of light reflected off the fiber and fell onto a moving strip of photographic paper, tracing a wavy line that represented the voltage changes at the scalp.
That's it. That was the technology. A magnet, a quartz thread, a beam of light, and a roll of paper.
The sensitivity was poor. Berger's galvanometer could detect signals in the range of roughly 50 to 100 microvolts, which is right at the edge of what scalp EEG actually produces. The noise floor was terrible. Every vibration in the building, every fluctuation in the power supply, every tiny air current could contaminate the recording. There was no filtering except what the physical properties of the instrument provided. There was no amplification beyond the galvanometer itself.
And there was one channel. One pair of electrodes. One location on the head. Berger could see that the brain was producing electrical oscillations, and he could identify that these oscillations changed depending on whether the subject was relaxed or alert. But he was looking at the brain's activity through a keyhole.
1924 (Berger's Setup):
- Channels: 1
- Electrodes: Silver foil, manually applied with rubber bandage
- Sampling: ~50-75Hz equivalent (analog, limited by galvanometer response)
- Recording medium: Photographic paper
- Weight: 20+ kg (galvanometer alone)
- Size: Filled a laboratory table
- Processing: None (visual inspection by Berger himself)
- Data storage: Paper rolls, stored in cabinets
- Cost: Custom-built, university-funded
- Who could use it: One psychiatrist in Jena, Germany
2026 (Neurosity Crown):
- Channels: 8
- Electrodes: Dry flexible rubber (no gel, no technician)
- Sampling: 256Hz per channel (digital)
- Recording medium: On-device N3 chipset + Bluetooth streaming
- Weight: 228 grams
- Size: Fits on your head like headphones
- Processing: Real-time on-device AI, frequency decomposition, signal quality analysis
- Data storage: Digital, unlimited, cloud-optional
- Cost: Consumer price point
- Who could use it: Anyone
Those numbers are interesting on their own. But numbers don't tell stories. Let's go through the comparisons that actually matter, one by one, because each one represents a different kind of breakthrough.
The Channel Problem: From Keyhole to Panorama
Berger's single channel was like watching a football game through a paper towel tube. You could tell something was happening. You could see movement. But you had no idea what was going on across the rest of the field.
This matters because your brain doesn't do one thing at a time in one place. When you focus your attention, your frontal cortex, parietal cortex, and occipital cortex are all doing different things simultaneously, and it's the coordination between them that constitutes focused attention. A single electrode over one region captures the local story but misses the conversation between regions entirely.
The 10-20 electrode placement system, standardized by Herbert Jasper in 1958, was the first systematic attempt to solve this. Clinical EEG eventually settled on 19 to 21 channels as the standard for diagnostic recordings. Research labs pushed further, to 64, 128, even 256 channels.
But here's the thing that most people don't realize about channel count: more isn't always better. What matters is covering the right regions. The Neurosity Crown's 8 channels at positions CP3, C3, F5, PO3, PO4, F6, C4, and CP4 span all four lobes of the cortex. That means it captures frontal activity (where your executive control and decision-making live), central activity (motor planning and somatosensory processing), and parietal-occipital activity (spatial awareness, visual processing, and those alpha rhythms Berger first identified).
Eight channels won't give you the spatial resolution of a 256-channel research cap. But they give you something Berger never had: the ability to see how different brain regions are talking to each other. Coherence between frontal and parietal channels. Asymmetry between left and right hemispheres. Cross-regional frequency patterns that correlate with focus, relaxation, and cognitive load.
That's the leap from a keyhole to a panorama.
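To make that concrete, here's a minimal sketch of one cross-regional metric that only becomes possible with bilateral coverage: frontal alpha asymmetry, computed as a log-ratio of alpha-band power between the left and right frontal channels. The band-power numbers below are invented for illustration; only the channel names come from the Crown's layout.

```python
import math

# Hypothetical per-channel alpha-band power values (made up for
# illustration) at the Crown's eight 10-20 positions.
alpha_power = {
    "CP3": 4.1, "C3": 3.8, "F5": 2.9, "PO3": 6.2,
    "PO4": 5.4, "F6": 2.2, "C4": 3.5, "CP4": 3.9,
}

def hemispheric_asymmetry(left_ch, right_ch, power):
    """Log-ratio asymmetry index: positive means more power on the right."""
    return math.log(power[right_ch]) - math.log(power[left_ch])

# Frontal alpha asymmetry, a metric long studied in affect and
# motivation research -- impossible to compute with one electrode.
frontal = hemispheric_asymmetry("F5", "F6", alpha_power)
print(f"frontal alpha asymmetry (ln F6 - ln F5): {frontal:.3f}")
```

With a single channel there is nothing to compare; with bilateral pairs, one line of arithmetic yields a between-regions measure.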
The Electrode Problem: From Silver Foil and Rubber Bandages to Five-Second Setup
Berger's electrodes were strips of silver foil wrapped around the head and held in place with rubber bandages. Later experimenters moved to metal discs attached with collodion (a type of adhesive) and conductive paste. By the mid-20th century, the standard had become small metal cups filled with conductive gel, carefully applied to specific scalp locations by a trained technician.
This was the setup for roughly 80 years. And it was a significant barrier. Getting a clinical EEG meant sitting still while someone squeezed gel into small cups and pressed them onto 19+ locations on your head. The process took 20 to 45 minutes. Your hair was a mess afterward. The gel dried out over time, degrading signal quality. And if an electrode shifted or lost contact, the technician had to fix it.
Think about what this meant for neurofeedback, the technique Joe Kamiya pioneered in the 1960s. Even if you wanted to train your brain using real-time EEG feedback, the logistics of getting electrodes on your head made it impossible to do casually. Every session was a production.
Dry electrode technology changed everything. Instead of metal cups filled with conductive gel, the Crown uses flexible rubber electrodes that make contact with the scalp through the hair. No gel. No prep. No technician. You put it on your head. It works. The whole process takes about five seconds.
Dry electrodes sound simple in concept, but they took decades to become practical. The challenge is impedance. Gel-based electrodes achieve low skin-electrode impedance (typically under 5 kilohms) because the gel fills the microscopic gaps between the electrode and the skin, creating a smooth conductive path. Dry electrodes sit on top of the skin without that intermediary, resulting in higher impedance and more noise. The Crown's flexible rubber electrodes are designed to conform to scalp contours and maintain consistent contact pressure, bringing impedance low enough for clean signal acquisition. Combined with on-device signal processing that filters artifacts in real-time, the result is research-quality data without the gel.
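One way to see why the electronics matter here is to model the skin-electrode interface and the amplifier input as a simple resistive voltage divider. The impedance figures below are illustrative assumptions, not Crown specifications:

```python
def recorded_fraction(z_electrode_ohms, z_input_ohms):
    """Fraction of the true scalp voltage that reaches the amplifier,
    modeling the electrode and amplifier input as a voltage divider."""
    return z_input_ohms / (z_input_ohms + z_electrode_ohms)

# Gel electrode (~5 kilohms) vs. a dry electrode (assume ~100 kilohms)
# into a modern high-input-impedance front end (assume ~1 gigohm).
for label, z_e in [("gel, 5 kOhm", 5e3), ("dry, 100 kOhm", 100e3)]:
    print(f"{label}: {recorded_fraction(z_e, 1e9) * 100:.4f}% of signal")
```

Even at twenty times the gel impedance, a high-input-impedance front end loses almost none of the signal amplitude; noise and contact stability, not raw attenuation, were the harder problems that took decades to solve.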
The Speed Problem: From Analog Squiggles to 256 Digital Snapshots Per Second
Berger's galvanometer had a frequency response that topped out at roughly 50-75Hz. That is, if the brain was producing electrical oscillations faster than about 75 cycles per second, the instrument physically couldn't keep up. The quartz fiber couldn't vibrate fast enough.
This wasn't a huge problem for Berger, because the signals he cared about (alpha at 8-13Hz, beta at 13-30Hz) were well within that range. But it meant that an entire class of brain activity was invisible to him.
Gamma brainwaves, the fast oscillations between roughly 30 and 100Hz, are now known to be critical for higher cognitive functions. Gamma activity is associated with conscious perception, memory formation, attentional binding, and what some researchers call the "spotlight of consciousness." When you have an "aha" moment, gamma waves surge. When experienced meditators enter deep states of awareness, their brains produce sustained gamma activity that you simply don't see in novice meditators.
Berger couldn't see any of that. His instrument literally couldn't detect it.
The Crown samples at 256Hz across all channels. By the Nyquist theorem, this means it can accurately capture brain activity up to 128Hz, well into the gamma range and beyond. Every frequency band that neuroscience has identified as cognitively relevant, from slow delta brainwaves (0.5-4Hz) through theta (4-8Hz), alpha (8-13Hz), beta (13-30Hz), and gamma (30-100Hz+), is within range.
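The Nyquist arithmetic is simple enough to check directly. A quick sketch, using the conventional band boundaries (exact cutoffs vary slightly between labs):

```python
FS_HZ = 256              # Crown sampling rate per channel
NYQUIST_HZ = FS_HZ / 2   # highest frequency representable without aliasing

# Conventional EEG band boundaries in Hz.
BANDS = {
    "delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
    "beta": (13, 30), "gamma": (30, 100),
}

for name, (lo, hi) in BANDS.items():
    ok = hi <= NYQUIST_HZ
    print(f"{name:5s} {lo:>5}-{hi:<5} Hz  capturable at {FS_HZ} Hz: {ok}")
```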
This isn't just a quantitative improvement. It's qualitatively different. Berger could see your brain idling and your brain working. The Crown can see your brain idling, working, creating, remembering, meditating, focusing, struggling, and flowing, because each of those states has a different signature across the full frequency spectrum.

The Recording Problem: From Paper You Can't Search to Data You Can Compute
Here's a detail about Berger's work that stopped me cold when I first learned it.
He analyzed his recordings by hand. Literally. He would unroll the photographic paper, measure the distance between wave peaks with a ruler, count the oscillations, and calculate the frequency manually. Every single recording. For years.
There was no other option. The data existed as a physical object, a strip of paper with wavy lines on it. You couldn't search it. You couldn't filter it. You couldn't run statistics on it. You couldn't compare it algorithmically to another recording. You could only look at it with your eyes and measure it with your hands.
The transition from analog paper recording to digital recording, which began in the 1970s and was largely complete by the 1990s, was arguably the single most important technological leap in EEG's history. Not because digital recording looked better (early digital systems actually had lower resolution than good analog equipment). But because it turned EEG from pictures into data.
Once brainwaves are numbers in a computer, you can do things with them that are impossible with paper. Fourier transforms that decompose a complex signal into its constituent frequencies. Statistical comparisons across thousands of recordings. Source localization algorithms that estimate where in the brain a signal originated. Machine learning classifiers that can distinguish between cognitive states with 90%+ accuracy.
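Here's a minimal illustration of the kind of analysis paper recordings made impossible: a Fourier decomposition of a single channel into band powers. The signal below is fabricated (a dominant 10Hz alpha component plus weaker beta and noise), not real EEG:

```python
import numpy as np

FS = 256                      # samples per second, matching the Crown
t = np.arange(FS * 4) / FS    # four seconds of synthetic one-channel data

# Synthetic "EEG": a strong 10 Hz alpha rhythm, a weaker 20 Hz beta
# rhythm, and Gaussian noise.
rng = np.random.default_rng(0)
signal = 30 * np.sin(2 * np.pi * 10 * t) + 8 * np.sin(2 * np.pi * 20 * t)
signal = signal + rng.normal(0, 2, t.size)

# Fourier decomposition: which frequencies carry the power?
freqs = np.fft.rfftfreq(t.size, d=1 / FS)
power = np.abs(np.fft.rfft(signal)) ** 2

def band_power(lo, hi):
    mask = (freqs >= lo) & (freqs < hi)
    return power[mask].sum()

alpha, beta = band_power(8, 13), band_power(13, 30)
print(f"alpha power dominates beta: {alpha > beta}")
```

What Berger did with a ruler over hours, the transform does in a few milliseconds, and it works identically across thousands of recordings at once.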
The Crown's N3 chipset runs these analyses on the device itself, in real-time. Frequency decomposition, signal quality metrics, artifact rejection, focus and calm scoring. All happening locally, 256 times per second, before the data ever leaves the device. Berger spent hours with a ruler. The N3 chipset does in milliseconds what would have taken him weeks.
And here's the part that would have truly blown Berger's mind: through the Neurosity MCP, that processed brain data can flow directly into AI systems like Claude and ChatGPT. Your brainwave patterns, captured and analyzed in real-time, becoming input for an artificial intelligence. In 1924, the data went from brain to paper to cabinet drawer. In 2026, it goes from brain to chipset to AI.
The Size and Weight Problem: From Laboratory to Everywhere
This comparison is almost absurd, but it's worth dwelling on because it represents the single most visible change in a century of EEG.
Berger's galvanometer was a precision laboratory instrument that sat on its own table. Add the power supply, the photographic recording drum, the electrode leads, and the light source, and you had a setup that dominated a room. Moving it was a multi-person job. Using it outside the laboratory was unthinkable.
Clinical EEG equipment got smaller over the decades but remained fundamentally non-portable through the 1990s. A typical EEG machine was a wheeled cart with an amplifier unit, a monitor, a printer, and a tangle of electrode cables. Ambulatory EEG systems (portable recorders that patients could wear for 24-72 hours) arrived in the 1980s but were still bulky, clipped-to-your-belt affairs with wired electrode caps.
The Crown weighs 228 grams. That's about 0.5 pounds. Less than a can of soup. You can throw it in a backpack and do EEG on a park bench, on an airplane, in your kitchen at 2am because you had an idea for a neurofeedback experiment and couldn't sleep until you tried it.
This sounds like a convenience feature. It's actually a paradigm-level shift. When EEG is trapped in a lab, you can only study the brain in a lab. When EEG is portable, you can study the brain in the wild: during real work, real conversations, real creative sessions, real moments of stress and flow and boredom. The data you get from a brain that's actually living its life is fundamentally different from the data you get from a brain that's sitting in a hospital room trying to hold still.
| Feature | 1924 (Berger) | 2026 (Crown) | What Changed |
|---|---|---|---|
| Channels | 1 | 8 | From keyhole to panorama. Cross-regional brain dynamics become visible. |
| Sampling rate | ~50-75Hz (analog) | 256Hz (digital) | Full frequency spectrum captured, including gamma waves tied to cognition. |
| Electrodes | Silver foil + rubber bandage | Dry flexible rubber | From 30-minute gel prep to 5-second setup. No technician required. |
| Size / weight | Room-sized, 20+ kg | Headband, 228g | From lab-locked to wear-anywhere. Real-world brain data becomes possible. |
| Recording medium | Photographic paper | On-device N3 chipset | Data goes from physical artifact to computable numbers in milliseconds. |
| Processing | Manual (ruler and eyes) | Real-time on-device AI | Analysis went from hours-per-recording to milliseconds, done automatically. |
| Data analysis | Visual inspection only | FFT, ML classifiers, focus/calm scores | From 'I see wavy lines' to 'Your frontal theta/beta ratio is 2.3.' |
| Connectivity | None (standalone) | Bluetooth, MCP for AI | Brain data can now talk to software, apps, and AI models in real-time. |
| Accessibility | 1 psychiatrist in Jena | Anyone with the device | Democratized from exclusive research tool to consumer product. |
| Applications | Basic research on brain rhythms | Neurofeedback, BCI, focus tracking, AI integration, developer SDK | From proving the brain is electric to building on that electricity. |
The Processing Story: Your Brain Data Finally Got a Brain of Its Own
This is the comparison that I think best captures the philosophical distance between 1924 and 2026.
In Berger's time, the "intelligence" applied to EEG data was entirely human. Berger looked at paper, recognized patterns, and drew conclusions. This was true for every EEG practitioner for decades. A neurologist reading an EEG in the 1960s was doing essentially the same thing Berger did in the 1920s: eyeballing squiggly lines and using pattern recognition honed by years of training.
Quantitative EEG (qEEG) in the 1980s began to automate some of this work. Computers could perform Fourier transforms, turning raw time-series data into frequency-domain information (how much alpha, beta, theta, and delta activity is present at each electrode). Color-coded brain maps let clinicians see spatial patterns at a glance. But the data still had to be sent to a separate computer for processing. Record first, analyze later.
The Crown's N3 chipset collapses that pipeline. It records and processes simultaneously, on the device itself. Real-time frequency decomposition. Real-time artifact detection and rejection. Real-time calculation of focus and calm metrics derived from cross-channel, cross-frequency analysis. The intelligence isn't in a remote computer or in a human's pattern-recognition skills. It's in the hardware on your head.
And then there's the AI layer. Through open SDKs in JavaScript and Python, and through the MCP integration, the Crown's processed data becomes input for external intelligence systems. Your brainwave patterns, after being captured and analyzed by the N3 chipset, can inform how an AI assistant responds to you. Can shape what music plays in your ears. Can trigger automations in your development environment when your focus drops.
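As a sketch of the pattern this enables: subscribe to a stream of cognitive-state metrics and react when the state changes. The class below is a stand-in, not the real Neurosity SDK; its names are hypothetical, but the callback shape mirrors how such SDKs deliver data:

```python
# Hypothetical stand-in for a device streaming a focus score in [0, 1].
# The real SDK delivers metric samples to a callback in a similar shape.
class FakeFocusStream:
    def __init__(self, scores):
        self.scores = scores

    def subscribe(self, callback):
        # A real stream would push samples asynchronously; we iterate.
        for s in self.scores:
            callback({"probability": s})

alerts = []

def on_focus(sample):
    # React to the brain state: record a nudge when focus drops.
    if sample["probability"] < 0.3:
        alerts.append("focus dropped - maybe pause notifications")

FakeFocusStream([0.8, 0.6, 0.25, 0.7]).subscribe(on_focus)
print(f"{len(alerts)} low-focus event(s) detected")
```

The design point is the inversion: software no longer polls a file of recorded data after the fact; the brain state arrives as events, and your code decides what the environment does next.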
Berger's EEG was a one-way mirror. You could look at the brain, but the brain couldn't look back. The Crown is a two-way conversation. The device reads your brain, and your environment responds.
The Cost and Accessibility Revolution
Let's talk about who could actually use this technology at each point in its history, because the access story is almost as dramatic as the technology story.
In 1924, EEG existed in exactly one place on Earth: Hans Berger's laboratory in Jena. The equipment was custom-built from components that only a university-funded researcher could obtain. The knowledge of how to use it existed in exactly one person's head.
By the 1950s, EEG was standard in major hospitals. But "standard" still meant expensive. A clinical EEG system cost the equivalent of tens of thousands of dollars. Operating it required a trained EEG technician. Interpreting the results required a neurologist with years of specialized training. Getting an EEG as a patient meant having a medical reason, a doctor's referral, and access to a facility that had the equipment.
By the 2000s, research-grade EEG systems had come down in price but were still in the range of $10,000 to $100,000+. They required gel electrodes, quiet laboratory environments, and technical expertise.
The consumer EEG market that emerged in the 2010s was the first time anyone without a medical degree or a research grant could access their own brain's electrical activity. But early consumer devices with 1-2 channels were limited in what they could actually show you.
The Crown occupies a genuinely new category. It's a consumer device with research-grade capabilities. Eight channels spanning all cortical lobes, 256Hz sampling, on-device processing, and, critically, open SDKs that let you build anything you want with the data. You don't need a neurology degree to use it. You don't need a research grant to buy it. And you don't need permission from anyone to build applications with your own brain data.
That last point matters more than it might seem. For most of EEG's history, your brainwave data belonged to the institution that recorded it. The hospital owned your EEG. The research lab owned your data. The idea that a person would have direct, programmable access to their own brain's electrical output, that you could write a Python script to analyze your own neural patterns, would have been incomprehensible to every EEG researcher from 1924 to roughly 2012.
The "I Had No Idea" Moment: What the Numbers Actually Mean
Here's a fact that puts the 1924-to-2026 comparison into a perspective that genuinely startled me when I worked it out.
In one second, at 256Hz across 8 channels, the Crown generates 2,048 data points of brain electrical activity.
Hans Berger, working with his single-channel galvanometer and photographic paper, produced roughly 50-75 data points per second (and those were analog, smeared, and noisy).
That means in one minute, the Crown produces approximately 122,880 clean, digital data points. Berger produced maybe 4,500 noisy analog squiggles.
In an hour-long recording session, the Crown generates about 7.4 million data points. A comparable session for Berger would have yielded around 270,000 data points on paper, none of which could be computationally analyzed.
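The arithmetic behind those figures is worth checking, since the per-second numbers compound quickly:

```python
# Verifying the throughput comparison in this section.
crown_per_sec = 256 * 8   # 256 Hz sampling across 8 channels
berger_per_sec = 75       # generous upper estimate for the galvanometer

print(crown_per_sec)           # data points per second (Crown)
print(crown_per_sec * 60)      # per minute
print(crown_per_sec * 3600)    # per hour (~7.4 million)
print(berger_per_sec * 3600)   # per hour for Berger, analog, on paper
```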
But here's the kicker. The Crown doesn't just collect those 7.4 million data points. It processes them in real-time, on-device, performing frequency decomposition, artifact rejection, and cognitive state classification, and it streams the results wirelessly to any application a developer wants to build. For a comparable session, Berger would have spent days or weeks manually measuring paper tracings with a ruler.
The throughput gap isn't a percentage increase. It's not even an order of magnitude. It's a transformation so complete that the two activities, Berger's paper-and-galvanometer recordings and a Crown session streaming into an AI model, are barely recognizable as the same technology. The physics is the same. Ions crossing neural membranes create voltage differentials that propagate to the scalp. But everything between the physics and the insight has been reinvented.
1924: Brain to Paper to Cabinet
Neurons fire. Voltage appears at scalp. Silver foil electrode picks up signal. Signal travels through wire to galvanometer. Quartz fiber deflects. Light beam traces line on moving photographic paper. Paper is developed. Paper is measured with a ruler. Measurements are recorded in a notebook. Paper roll goes into a cabinet.
Time from brain activity to insight: Days to weeks.
2026: Brain to Chipset to AI
Neurons fire. Voltage appears at scalp. Dry electrode picks up signal. N3 chipset digitizes at 256Hz. On-device processing performs real-time FFT, artifact rejection, and state classification. Processed data streams via Bluetooth. JavaScript or Python SDK receives data. Application renders neurofeedback, adjusts music, or sends brain state to AI model via MCP.
Time from brain activity to insight: Milliseconds.
What Berger Was Really After (And Whether We've Found It)
Here's the part of this story that I think most comparisons miss, because they focus on the hardware and skip the human.
Hans Berger didn't invent EEG because he wanted to build a medical device. He invented it because, in 1893, his sister somehow knew he was in danger from hundreds of miles away, and he spent the rest of his life trying to find the physical mechanism behind that experience. He was looking for what he called "psychic energy," a measurable physical correlate of mental states.
He never found telepathy. But he found something arguably more important: proof that the brain's internal states produce external, measurable signals. That what's happening inside your skull isn't locked in there. It leaks out. And if it leaks out, it can be read.
That insight is the through-line that connects everything in this comparison. Every improvement from 1924 to 2026, more channels, higher sampling rates, better electrodes, digital processing, wireless connectivity, AI integration, is an answer to the same question Berger was asking: can we read what the brain is doing?
The answer in 1924 was: barely. One channel, paper recording, days of manual analysis to extract basic frequency information.
The answer in 2026 is: yes, and we can act on it in real-time.
The Neurosity Crown doesn't give you telepathy either. Nobody's sending thoughts across a room. But it does give you something that would have fulfilled the spirit of Berger's obsession, if not the letter. It gives you a window into your own brain's electrical activity, processed and interpreted in real-time, integrated with software that can respond to your cognitive state as it changes from moment to moment.
Berger wanted to externalize the internal. He wanted to make the invisible visible. He spent 30 years chasing that goal with silver foil, rubber bandages, and a light beam bouncing off a quartz thread.
One hundred and two years later, you can do it with a device that fits in your backpack and an npm install.
Where the Next 100 Years Goes
Let me leave you with a thought that keeps nagging at me.
Look at the comparison table above. Look at the trajectory. From 1 channel to 8. From analog paper to an AI-connected chipset. From one man in a closed room to anyone with a credit card and a code editor.
Now project that curve forward. Not a hundred years. Just ten.
If the last decade took us from clunky 1-2 channel consumer devices to the Crown's 8-channel, AI-integrated, SDK-equipped, 228-gram brain computer, where does the next decade go? What happens when the channel count doubles again? When the processing gets another order of magnitude faster? When the AI models that receive your brain data are ten or a hundred times more capable than what we have today?
Hans Berger drew the curtains because he was afraid his colleagues would laugh at him. He thought he was doing something embarrassing, looking for telepathy with a quartz thread and a strip of paper.
He was actually inventing the first rung of a ladder whose top we still can't see.
And the question that should keep you up at night isn't how far EEG has come since 1924. It's how far it's going to go from here. Because if the first hundred years took us from a paper tracing to an AI-connected brain computer, the next ten are going to be very, very interesting. And for the first time in the history of this technology, you don't have to be a psychiatrist in a German university to be part of it.
You just have to put it on your head.

