
The Myth of Multitasking

By AJ Keller, CEO at Neurosity  •  January 2026
Your brain can't multitask. What feels like doing two things at once is actually rapid switching between tasks, and each switch costs you time, accuracy, and IQ points.
Decades of neuroscience research reveal that the human brain has a fundamental bottleneck in its attention system. Only about 2.5% of people are genuine 'supertaskers' who can handle two complex tasks simultaneously. For the other 97.5% of us, multitasking is a productivity illusion that makes us slower, dumber, and more stressed.

You're Not Doing Two Things at Once. You Never Were.

Right now, as you read this sentence, there's a good chance you have at least six other tabs open. Maybe Slack is pinging in the background. Maybe your phone just buzzed with a notification you're trying very hard to ignore. Maybe you're "listening" to a podcast at the same time.

And you probably believe, on some level, that you're handling all of it. That your brain is running multiple programs simultaneously, like a computer with enough RAM to keep everything humming.

Here's the problem: your brain is not a computer. And the thing you call multitasking? It isn't happening. Not even a little bit.

What your brain is actually doing is something much more interesting, and much more costly, than parallel processing. It's switching. Frantically. Hundreds of times a day. And every single switch comes with a tax that you never see on the invoice but always pay in full.

The myth of multitasking is one of the most persistent and damaging illusions of modern knowledge work. And neuroscience hasn't just debunked it. Neuroscience has shown us exactly why it fails, where it fails in the brain, and what it costs you in IQ points, time, and cognitive performance. The numbers are worse than you think.

The Bottleneck Inside Your Skull

To understand why multitasking is impossible for the human brain, you need to understand something about the architecture of attention. And the simplest way to put it is this: your brain has a bouncer, and the bouncer only lets one complex thought through the door at a time.

Neuroscientists call this the central bottleneck theory, and it's one of the most replicated findings in cognitive psychology. The idea, first proposed by Harold Pashler in the early 1990s and confirmed by hundreds of experiments since, is that certain stages of cognitive processing can only handle one task at a time. Specifically, the stage where you select a response to a stimulus, where your brain decides "given this input, what should I do?" That stage is strictly serial. One thing at a time. No exceptions.

Think of it like a single-lane bridge. Traffic can approach from both directions, but only one car can cross at a time. The other has to wait. And the wait isn't free.

Pashler demonstrated this with a beautifully simple experiment. He gave people two tasks to perform nearly simultaneously: respond to a sound with one hand, and respond to a visual signal with the other. When the two signals arrived at the same time, people couldn't respond to both at once. The second response was always delayed. Always. No matter how much they practiced.

This delay is called the psychological refractory period, and it reveals something fundamental about your brain's wiring. The prefrontal cortex, that thin strip of neural tissue behind your forehead that handles executive function, planning, and decision-making, has a structural limitation. It can configure itself for one task set at a time. When you ask it to handle two, it doesn't split in half. It queues.
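The queueing logic behind the psychological refractory period can be sketched as a toy model. The stage durations below are illustrative placeholders, not values fitted to Pashler's data; the only structural claim is that the central response-selection stage is strictly serial:

```python
def prp_rt2(soa, perceptual=0.1, central=0.3, motor=0.1):
    """Predicted reaction time (seconds) to the second stimulus under
    a strictly serial central (response-selection) stage.

    soa: stimulus onset asynchrony, the gap between the two signals.
    """
    # Task 1 occupies the central stage from t = perceptual
    # until t = perceptual + central.
    t1_central_ends = perceptual + central
    # Task 2's stimulus arrives at t = soa; perception can run in
    # parallel, but response selection must wait for the bottleneck.
    ready = soa + perceptual
    central_starts = max(ready, t1_central_ends)
    return central_starts + central + motor - soa

# With a long gap (soa = 1.0 s) the bottleneck is free: RT2 = 0.5 s.
# With simultaneous signals (soa = 0) RT2 stretches to 0.8 s. That
# extra 0.3 s wait is the refractory period, and it never trains away.
```

The model reproduces the signature result: the shorter the gap between the two signals, the longer the second response takes, entirely because of queueing at the central stage.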

What Actually Happens When You "Multitask"

So if you're not doing two things at once, what are you doing when you toggle between your email and your code editor, or between a Zoom call and a spreadsheet?

You're task-switching. And cognitive neuroscientists have mapped what happens during a task-switch with remarkable precision.

When you switch from Task A to Task B, your brain performs a two-part operation:

Goal shifting. Your prefrontal cortex deactivates the mental rule set for Task A ("I am writing prose, I need to think about sentence structure and narrative flow") and activates the rule set for Task B ("I am reading an email, I need to parse the request and formulate a response"). This involves the anterior prefrontal cortex and the posterior parietal cortex working together to reconfigure your attentional priorities.

Rule activation. Your brain must suppress the procedural rules for the previous task and load the rules for the new one. If you were writing code, the syntax rules, variable names, and logical structure of your program were held in working memory. Now your brain has to flush those out and load the context of whatever you're switching to.

Each of these operations takes time. Sometimes a few tenths of a second. Sometimes several seconds, depending on the complexity of the tasks involved. And here's what makes this insidious: you don't notice the cost. The switch feels instantaneous because you don't have conscious access to the loading process. It happens below the surface, like a hard drive spinning up while you stare at a progress bar you can't see.
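A back-of-the-envelope simulation makes the invisible tax visible. The switch cost and slice length below are made-up constants for illustration; the point is the structure of the loss, not the exact numbers:

```python
def total_time(tasks, switch_cost=0.5, slice_len=2.0):
    """Minutes to finish all tasks when working round-robin in short
    slices, paying a reconfiguration cost at every task switch."""
    remaining = list(tasks)
    elapsed, current = 0.0, None
    while any(r > 0 for r in remaining):
        for i, r in enumerate(remaining):
            if r <= 0:
                continue
            if current is not None and current != i:
                elapsed += switch_cost  # goal shifting + rule activation
            work = min(slice_len, r)
            remaining[i] -= work
            elapsed += work
            current = i
    return elapsed

# Two 30-minute tasks, done one after the other: 60.5 minutes.
sequential = total_time([30.0, 30.0], slice_len=30.0)
# The same work in alternating 2-minute slices: 74.5 minutes,
# with nearly a quarter of an hour lost to switching alone.
interleaved = total_time([30.0, 30.0], slice_len=2.0)
```

Nothing about the work itself changed between the two runs; only the ordering did.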

But the cost is measurable. And it's enormous.

The Real Cost of Task-Switching

Research by Joshua Rubinstein, David Meyer, and Jeffrey Evans found that task-switching can consume up to 40% of someone's productive time. That's not a typo. If you spend your entire workday switching between tasks every few minutes, you could be losing nearly half your productive capacity to the invisible tax of reconfiguration.

The effect gets worse as task complexity increases. Switching between simple, well-practiced tasks (like walking and talking) incurs minimal cost. Switching between two complex cognitive tasks (like writing and analyzing data) can cost you 50% more time than doing them sequentially.

Your Brain on Interruption: Dumber Than You'd Believe

Here's the "I had no idea" moment that might change how you structure your entire workday.

In 2005, a research team at the Institute of Psychiatry at King's College London, commissioned by Hewlett-Packard, studied the cognitive effects of constant digital interruption and attempted multitasking. They tested participants' IQ during quiet, focused work and again while the participants juggled incoming email and phone calls, then compared the measured drop against the known IQ effects of other cognitive impairments.

The results were startling. Workers who multitasked with electronic media showed an average IQ drop of 10 points.

To put that in perspective:

| Condition | IQ impact | Context |
| --- | --- | --- |
| Multitasking with email/phone | -10 points | Equivalent to missing a full night of sleep |
| Smoking marijuana | -5 points | Half the effect of multitasking |
| Sleep deprivation (one night) | -10 points | Same ballpark as multitasking |
| Normal, focused work | Baseline | Full cognitive capacity available |

Read that again. The IQ impact of constant task-switching was twice the cognitive impairment of smoking marijuana. And while the marijuana study was always cited as alarming, nobody was running around telling knowledge workers that their inbox habit was twice as bad for their brains.

The IQ effect is temporary. Your intelligence isn't permanently reduced by checking email. But the performance impact accumulates across hours, days, and weeks. If you spend most of your workday in a state of chronic switching, you're spending most of your workday cognitively impaired by roughly two-thirds of a standard deviation.

Glenn Wilson, the psychiatrist who led the study, called it "infomania." But the mechanism isn't really about information overload. It's about the constant engagement and disengagement of prefrontal task sets, each switch burning cognitive fuel and leaving behind a residue of the previous task that interferes with the current one.

Attention Residue: The Ghost of the Task You Just Left

That residue has a name. Sophie Leroy, a business school professor at the University of Washington, coined the term attention residue in 2009, and it describes one of the most important and underappreciated phenomena in cognitive science.

When you switch from Task A to Task B, part of your attention remains stuck on Task A. You're thinking about the email you didn't finish. You're still processing the Slack conversation you just closed. The neural representation of Task A doesn't cleanly deactivate. It lingers, like the afterimage of a bright light, consuming working memory resources that Task B desperately needs.

Leroy's experiments showed that people who switched tasks performed significantly worse on the new task than people who had completed the previous task before moving on. And the effect persisted even when participants were told to completely forget about the previous task. Your prefrontal cortex doesn't take instructions about what to forget very well. It keeps processing.

This is why a day of "multitasking" leaves you exhausted even if you feel like you didn't accomplish much. Your brain has been working incredibly hard. It just hasn't been working efficiently. It's been burning fuel on constant reconfiguration and residue management rather than on actual productive thought.

The Attention Residue Test

Next time you switch tasks, pause for a moment and notice what's on your mind. Are you fully present with the new task, or is part of your brain still chewing on what you just left? That lingering preoccupation is attention residue, and it's one of the primary reasons multitasking degrades your work quality. The fix is counterintuitive: spending an extra minute completing or deliberately parking the previous task (writing a quick note about where you left off) actually saves you time by reducing the residue that follows you.

Dual-Task Interference: When Your Brain Literally Can't

The bottleneck theory explains why you can't do two complex things at once. But there's an even more specific phenomenon that shows exactly how the brain fails when you try. It's called dual-task interference, and neuroimaging studies have made it visible.

In 2001, neuroscientist Etienne Koechlin and colleagues at the French National Institute of Health and Medical Research published a landmark study in Science. They used fMRI to watch what happens in the brain when people try to pursue two goals simultaneously.

What they found was that the brain does something clever but limited. When you hold a single goal in mind, both hemispheres of your prefrontal cortex work together on it. When you add a second goal, the brain splits the work: the left prefrontal cortex takes one goal, and the right takes the other.

This sounds like it might work. And for two goals, it sort of does, though with reduced performance on both. But here's the critical finding: when a third goal was added, performance collapsed. The brain had run out of hemispheres. There was nowhere to put the third goal. People started making errors. They forgot tasks. They lost track of their priorities.

Your prefrontal cortex, it turns out, can juggle two goals (poorly) but cannot handle three. The architecture simply doesn't support it. This isn't a matter of practice or intelligence. It's a hardware limitation.

  • One goal: both prefrontal hemispheres collaborate. Full cognitive power.
  • Two goals: hemispheres split the work. Performance drops on both tasks.
  • Three or more goals: the system overflows. Errors skyrocket, tasks get dropped.
  • Every additional task beyond two doesn't get degraded processing. It gets no processing at all until you switch to it.
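Koechlin's capacity limit can be captured in a toy model. The quality values (1.0 and 0.6) are illustrative placeholders, not measured figures; the structural claim is simply that everything past the second goal gets nothing:

```python
def goal_processing(goals, capacity=2):
    """Toy model of prefrontal goal capacity: one goal gets full
    processing, two get degraded processing (one per hemisphere),
    and everything past the second gets none until you switch to it.
    Quality values are illustrative placeholders, not measurements."""
    active, queued = goals[:capacity], goals[capacity:]
    quality = 1.0 if len(active) == 1 else 0.6
    result = {g: quality for g in active}
    result.update({g: 0.0 for g in queued})
    return result

# One goal: full power. Three goals: the third is simply dropped.
solo = goal_processing(["code"])
overloaded = goal_processing(["code", "slack", "email"])
```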

This finding has a direct implication for how you work. If you think you're simultaneously tracking your code, your Slack messages, your email, and a background meeting, you're not. Your brain can genuinely track at most two of those (badly). The rest are just sitting in a queue, accumulating attention residue and switch costs every time your brain flips to check on them.

The Exception That Proves the Rule: What You CAN Do Simultaneously

Now, there's an important caveat here. You can obviously walk and talk at the same time. You can listen to music while cooking. You can chew gum and do, well, anything.

Does that mean the bottleneck theory is wrong?

No. It means there are different types of processing, and the bottleneck only applies to one of them.

Cognitive scientists distinguish between controlled processing (tasks that require conscious attention and prefrontal involvement) and automatic processing (tasks that have been practiced so extensively they run without conscious oversight, handled by subcortical structures like the basal ganglia and cerebellum).

Walking is automatic. You learned it as a toddler and have practiced it for decades. It doesn't need your prefrontal cortex. Talking requires controlled processing, but since walking doesn't compete for the same neural resources, you can do both.

The trouble starts when both tasks require controlled processing. Talking on the phone while driving. Writing an email while listening to a presentation. Reading while having a conversation. These pairs both need the prefrontal bottleneck, and they collide.

This is why hands-free phone conversations while driving are just as dangerous as handheld ones. The problem was never that your hand was occupied. The problem is that the conversation and the driving are both competing for the same prefrontal resources. Your brain can handle the automatic parts of driving (staying in the lane on a straight, empty road) while talking. But the moment something unexpected happens, the thing that needs controlled processing to react to, you're in trouble. Your response time plummets because your prefrontal cortex is configured for conversation, not emergency braking.

A 2006 study at the University of Utah found that drivers talking on cell phones had reaction times comparable to those of 70-year-olds, and were as impaired behind the wheel as legally drunk drivers.


The 2.5%: Are Supertaskers Real?

In 2010, two researchers at the University of Utah, David Strayer and Jason Watson, published a finding that shook up the field. They had been running their standard dual-task experiment, having people perform a demanding driving simulation while simultaneously doing a complex working memory task (OSPAN), and the data was doing exactly what decades of research predicted. Performance tanked on both tasks.

Except for a handful of participants.

Out of 200 people tested, five showed zero performance decline on either task when doing both simultaneously. No drop in driving accuracy. No drop in working memory. Nothing. Their brains appeared to be doing what neuroscience said brains couldn't do.

Strayer and Watson called them supertaskers, and the 2.5% prevalence rate has held up across subsequent studies.

So what's different about their brains? The honest answer is: we don't fully know yet. Early neuroimaging work suggests that supertaskers have more efficient prefrontal cortex function. Their brains appear to use fewer neural resources to accomplish the same tasks, leaving room for a second task set without the usual interference.

Supertasker Facts

Here's what the research has established so far about the 2.5%:

  • They show no measurable performance decline on either task during dual-task conditions
  • They tend to have higher baseline working memory capacity
  • Brain imaging shows more efficient (less, not more) prefrontal activation during complex tasks
  • The trait appears to be stable, not something that comes and goes
  • Self-assessment is useless for identifying supertaskers. Most people who think they're good at multitasking are actually the worst at it. The Dunning-Kruger effect applies to multitasking with a vengeance.

That last point deserves emphasis. A study by Strayer's lab found a negative correlation between how often someone multitasked and how well they performed on dual-task tests. The people who multitasked most frequently were the worst at it. They were self-selecting into a behavior that maximally exploited their weakness.

The people who were actually good at multitasking? They rarely felt the need to do it.

What Your Brainwaves Reveal About Task-Switching

Everything we've discussed so far becomes visible in real-time when you look at the brain's electrical activity with EEG.

When a person is deeply focused on a single task, their brain produces a characteristic pattern. Elevated beta activity (13-30 Hz) over frontal regions, reflecting active, engaged concentration. Strong frontal midline theta (4-8 Hz), associated with sustained attention and working memory maintenance. And moderate alpha suppression (8-13 Hz) over task-relevant sensory areas, meaning the brain is actively processing incoming information rather than idling.

Now watch what happens when that person is interrupted and forced to switch tasks.

The frontal beta power drops. Theta rhythms in the anterior cingulate cortex spike, reflecting cognitive conflict as the brain realizes it needs to reconfigure. The P300 component, a brainwave signal that peaks roughly 300 milliseconds after a stimulus and reflects the allocation of attentional resources, decreases in amplitude. Your brain is literally allocating less attention to each thing because it's dividing the pool.

And here's the part that makes this more than an academic curiosity: these changes are measurable with consumer-grade EEG. You don't need a million-dollar lab to see your own brain struggling with multitasking. You need electrodes over the frontal and parietal cortex, a decent sample rate, and the ability to read frequency-band data in real time.
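A minimal sketch of that measurement, using plain NumPy on a synthetic single-channel trace. The 256 Hz rate matches the Crown; the signal itself is fabricated for illustration:

```python
import numpy as np

FS = 256  # samples per second, matching the Crown's sample rate
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(signal, fs=FS):
    """Mean spectral power per frequency band for one EEG channel."""
    sig = np.asarray(signal, dtype=float)
    sig = sig - sig.mean()                        # drop the DC offset
    freqs = np.fft.rfftfreq(sig.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(sig)) ** 2 / sig.size
    return {name: psd[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in BANDS.items()}

# Synthetic 2-second "focused" trace: a strong 20 Hz (beta) rhythm
# buried in noise, the kind of pattern sustained attention produces.
rng = np.random.default_rng(0)
t = np.arange(2 * FS) / FS
focused = 3.0 * np.sin(2 * np.pi * 20 * t) + rng.normal(0.0, 0.5, t.size)
powers = band_powers(focused)   # beta power dominates theta and alpha
```

A real pipeline would add windowed (Welch-style) averaging and artifact rejection, but the band logic is the same.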

| Brainwave marker | During focused work | During task-switching |
| --- | --- | --- |
| Frontal beta (13-30 Hz) | Elevated, sustained | Drops during transitions |
| Frontal midline theta (4-8 Hz) | Steady, moderate | Spikes during conflict/switch |
| P300 amplitude | Strong, clear | Reduced, divided |
| Alpha suppression | Targeted to task-relevant areas | Diffuse, unfocused |
| Gamma coherence (30+ Hz) | High between task-relevant regions | Fragmented |

This is where the science gets personal. Because those brainwave patterns aren't just data points in a research paper. They're happening inside your head right now, every time you glance at a notification, every time you toggle between apps, every time you convince yourself you're being productive by keeping twelve balls in the air.

Your Brain Doesn't Want to Multitask. So Why Can't You Stop?

If multitasking is so bad for us, why does it feel so natural? Why do we keep doing it even when we know, intellectually, that it doesn't work?

The answer involves a different part of the brain: the dopaminergic reward system.

Every time you check a notification, respond to a message, or switch to a new task, you get a small hit of dopamine. Not because the new task is inherently rewarding, but because novelty itself triggers dopamine release. Your brain evolved in an environment where new information could mean the difference between life and death. A new sound might be a predator. A new smell might be food. The brain rewards you for investigating novelty because for millions of years, investigation had survival value.

In the modern world, that novelty-seeking circuit gets hijacked by digital interruptions. Every ping, buzz, and notification is a tiny novelty signal that your dopamine system responds to. And the response feels good. It feels productive. You feel busy, engaged, on top of things.

But it's a trick. Your dopamine system is rewarding you for the exact behavior that your prefrontal cortex can't actually support. You're getting a hit of pleasure for each switch while paying a hidden tax in cognitive performance that you can't feel directly.

This is why willpower alone isn't enough to stop multitasking. You're fighting against a neurochemical reward loop that's been optimized over millions of years of evolution. The notification isn't just a distraction. It's a stimulus that your brain is chemically motivated to investigate.

What Actually Works: Single-Tasking and the Neuroscience of Deep Focus

So what's the alternative? The research points clearly in one direction: single-tasking. Doing one thing at a time, with your full attention, for sustained periods.

This isn't a productivity hack. It's an alignment with how your brain actually works. When you single-task, you eliminate switch costs, prevent attention residue, keep both prefrontal hemispheres working on the same goal, and maintain the sustained beta and theta patterns associated with high-quality cognitive work.

The research on deep focus states, sometimes called "flow," shows the brain operating at peak efficiency during uninterrupted single-task work. Frontal beta power stays elevated. Cross-regional gamma coherence increases, meaning different brain areas are communicating more effectively. The default mode network (your brain's mind-wandering system) quiets down. Everything aligns.

Here are evidence-based strategies for protecting single-task focus:

  • Time-block your day into focused work periods of 60-90 minutes. This matches the brain's natural ultradian rhythm of high and low alertness.
  • Remove notification triggers during focused work. Every notification is a novelty stimulus your dopamine system will push you to investigate.
  • Use a 'parking lot' note for intrusive thoughts about other tasks. Writing them down reduces attention residue because your brain can stop trying to hold them in working memory.
  • Batch similar tasks together (all emails, then all code review, then all writing) to minimize the rule-activation cost of task-switching.
  • Practice completing one task to a clear stopping point before starting another. Unfinished tasks generate stronger attention residue than completed ones (the Zeigarnik effect).
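The batching advice in particular is easy to quantify: reordering the same work sharply cuts the number of rule-activation events. A tiny illustration with made-up task labels:

```python
from itertools import groupby

def switch_count(sequence):
    """Number of task switches incurred by a work sequence."""
    return sum(1 for _group in groupby(sequence)) - 1

interleaved = ["email", "code", "email", "write", "code", "email"]
batched = sorted(interleaved)  # all code, then all email, then writing

assert switch_count(interleaved) == 5  # a switch at nearly every step
assert switch_count(batched) == 2      # same work, only two switches
```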

Seeing Your Own Attention: Where Neuroscience Meets Real-Time Feedback

Here's what's frustrating about the myth of multitasking: knowing it's a myth doesn't make it easy to stop. You can read all the research, nod along, and then immediately go back to toggling between six tabs. The intellectual understanding doesn't reach the behavior because the dopamine reward loop is operating below conscious awareness.

This is where neurofeedback becomes interesting. The idea is straightforward: if you could actually see your brain losing focus, see the frontal beta dropping and the conflict theta spiking, in the moment it happens, you'd have a feedback signal that competes with the dopamine hit. You'd know, not intellectually but experientially, that the switch just cost you something.

The Neurosity Crown sits at the intersection of this research and practical application. With 8 EEG channels positioned at CP3, C3, F5, PO3, PO4, F6, C4, and CP4, it covers the frontal regions where attention and task-switching play out and the parietal regions involved in attentional orienting. Sampling at 256Hz, it captures the frequency-band dynamics, the beta fluctuations, the theta spikes, the alpha patterns, that reveal what your attention is actually doing, not what you think it's doing.

The Crown's focus and calm scores translate these raw brainwave patterns into something you can act on in real time. When you're deep in single-task work and your focus score is high, you have objective confirmation that your brain is in a productive state. When it drops, you have an early warning that you've been pulled out, often before you consciously realize it.

For developers, the possibilities go further. The Crown's JavaScript and Python SDKs expose raw EEG data, power-by-band breakdowns, and the focus and calm metrics as programmable signals. You could build an app that detects the neural signature of a task-switch (the theta spike, the beta drop) and logs it. Over a week, you'd have a precise map of when, how often, and how expensively you switch tasks. That's not a productivity theory. That's measurement.
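Here is what the core of such an app might look like. This detector is a hypothetical sketch: the thresholds, the window size, and the rolling-baseline comparison are all assumptions, and the theta/beta samples would come from the SDK's power-by-band stream:

```python
from collections import deque

class SwitchDetector:
    """Flags a likely task-switch when frontal theta jumps while beta
    drops, relative to a rolling baseline of recent samples.
    Thresholds are illustrative guesses, not validated values."""

    def __init__(self, window=30, theta_jump=1.5, beta_drop=0.7):
        self.theta = deque(maxlen=window)
        self.beta = deque(maxlen=window)
        self.theta_jump = theta_jump   # spike ratio vs. baseline
        self.beta_drop = beta_drop     # drop ratio vs. baseline

    def update(self, theta_power, beta_power):
        """Feed one power-by-band sample; True means suspected switch."""
        switched = False
        if len(self.theta) == self.theta.maxlen:
            theta_base = sum(self.theta) / len(self.theta)
            beta_base = sum(self.beta) / len(self.beta)
            switched = (theta_power > self.theta_jump * theta_base
                        and beta_power < self.beta_drop * beta_base)
        self.theta.append(theta_power)
        self.beta.append(beta_power)
        return switched
```

Logging a timestamp on each detection over a week would give exactly the switch map described above.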

And with the Crown's MCP integration for AI tools, you could connect your real-time brain state data to an AI assistant that learns your attention patterns. An assistant that knows when you're in flow and holds your notifications, or that notices your focus degrading and suggests a break before you burn out. This isn't hypothetical. It's buildable today.

The Uncomfortable Truth About Your "Productive" Day

Here's the question worth sitting with: if multitasking is an illusion, and most of us spend most of our workdays doing it, what does that mean about how much actual deep thinking we're doing?

The research suggests the answer is: not much. Gloria Mark, a professor of informatics at UC Irvine, found that the average knowledge worker switches tasks every three minutes and five seconds. After an interruption, it takes an average of 23 minutes and 15 seconds to fully return to the original task. Run those numbers across an eight-hour day and the math is brutal. If you're switching every three minutes but need 23 minutes to fully re-engage, you may never reach full depth on anything.
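The brutality of that math fits in three lines, using the figures from Mark's research quoted above:

```python
switch_interval_s = 3 * 60 + 5   # average time on task: 185 seconds
recovery_s = 23 * 60 + 15        # average time to fully re-engage: 1395 s

# Full re-engagement takes roughly 7.5x longer than the average
# uninterrupted stretch, so at these averages depth is never reached.
ratio = recovery_s / switch_interval_s
```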

We've built an entire work culture around the assumption that the human brain can process multiple streams of complex information simultaneously. Open-plan offices, Slack channels, always-on email, concurrent meetings. It's all designed for a brain that doesn't exist.

Your brain is an extraordinary machine. It contains roughly 86 billion neurons forming trillions of connections, running computations that the most powerful supercomputers can't replicate. But it has a bottleneck. A single lane where complex decisions have to queue up and wait their turn. That bottleneck isn't a flaw. It's a feature. It's what allows you to bring the full power of that extraordinary machine to bear on one thing at a time.

The myth of multitasking flatters us into thinking we can outperform our own neurology. The neuroscience says otherwise. The brain that focuses on one thing, deeply and completely, will always outperform the brain that tries to do everything at once.

The question isn't whether your brain can multitask. Fifty years of research have answered that. The question is: now that you know it can't, what will you do differently?

Your brain has been telling you this the whole time. Maybe it's time you listened.

Frequently Asked Questions
Is multitasking a myth?
For complex cognitive tasks, yes. Neuroscience shows that the brain cannot truly process two demanding tasks simultaneously. What feels like multitasking is actually rapid task-switching, where your prefrontal cortex toggles between task sets. Each switch incurs a cost in time (up to 40% productivity loss) and accuracy. The only exception is pairing an automatic task (like walking) with a cognitive task (like talking), because automatic tasks use different neural pathways.
What is the cognitive cost of task-switching?
Every time you switch tasks, your brain must deactivate the neural rules for the previous task and load the rules for the new one. This process takes anywhere from a few tenths of a second to several seconds, and switching between two complex tasks can take up to 50% more time than doing them sequentially. Over a full workday of chronic switching, research estimates that up to 40% of your productive time is lost to switch costs alone.
Can multitasking lower your IQ?
A study at the Institute of Psychiatry at King's College London found that multitasking with electronic media reduced participants' effective IQ by an average of 10 points. For comparison, smoking marijuana typically reduces IQ by about 5 points, and losing a full night of sleep reduces it by about 10 points. The IQ effect of multitasking is temporary, but the productivity damage accumulates.
What are supertaskers?
Supertaskers are the roughly 2.5% of the population who can genuinely perform two complex tasks at once without measurable performance loss. Discovered by researchers David Strayer and Jason Watson at the University of Utah in 2010, supertaskers show no decline in either driving performance or working memory tasks when done simultaneously. Their brains appear to have more efficient prefrontal cortex function, but scientists are still studying why.
How can neurofeedback help with focus and task-switching?
Neurofeedback trains your brain to recognize and maintain states of focused attention by showing you your own brainwave activity in real-time. EEG devices can measure beta wave activity (associated with active concentration) and alpha waves (associated with calm focus), allowing you to practice sustaining attention without switching. Research shows this training can improve sustained attention and reduce distractibility.
What does EEG reveal about multitasking in the brain?
EEG studies show that when people attempt to multitask, there is a measurable decrease in the P300 event-related potential, a brainwave signal linked to attention and working memory. Frontal beta power drops, and there is increased theta activity in the anterior cingulate cortex, reflecting cognitive conflict. These patterns confirm that the brain is struggling to allocate resources, not smoothly handling parallel tasks.
Copyright © 2026 Neurosity, Inc. All rights reserved.