
Semantic Memory: How Your Brain Stores Everything It Knows

By AJ Keller, CEO at Neurosity  •  January 2026
Semantic memory is the brain system that stores facts, concepts, word meanings, and general knowledge about the world. It operates independently of personal experience and produces distinct EEG patterns, particularly the N400 response, during retrieval.
You know that the Earth orbits the Sun, that dogs bark, and that the word 'justice' means something, even though you probably can't recall the specific moment you learned any of these things. This is semantic memory at work. It is the foundation of language, reasoning, and every thought you have that involves knowledge. And its brain signatures are some of the most well-studied phenomena in cognitive neuroscience.

You Know What a Zebra Is, But When Did You Learn That?

Think about it. You know what a zebra is. You know it's a horse-like animal with black and white stripes that lives in Africa. You know it's a mammal. You might know that no two zebras have the same stripe pattern.

Now try to remember the moment you learned any of this.

You can't. The knowledge exists in your mind, vivid and accessible, but the experience of learning it is gone. There's no classroom, no teacher's voice, no specific book page you can point to. The fact just... exists. It's part of your mental furniture, and you have no idea when it was delivered.

This is semantic memory, and it's one of the most quietly extraordinary things your brain does. It's the vast, organized warehouse of everything you know about the world, stripped of personal context, detached from time and place, sitting there ready to be activated at a moment's notice. Every conversation you have, every sentence you read, every judgment you make about the world draws on semantic memory. And almost none of it comes with a receipt.

The Knowledge That Lost Its Autobiography

Semantic memory was first defined by Endel Tulving in 1972 as a contrast to episodic memory. Episodic memory records your personal experiences. Semantic memory records what you know.

The distinction seems simple, but its implications are deep. Consider what happens when you learn a new word. The first time you encounter "thalamus" in a neuroscience textbook, the experience is episodic. You might remember the library where you were sitting, the feeling of the chair, the coffee going cold beside you. The fact ("the thalamus is a relay station in the brain") is wrapped in a personal experience.

But something happens over time. You use the word again. You encounter it in different contexts. Gradually, the episodic wrapper dissolves. The library vanishes. The cold coffee vanishes. What remains is the pure semantic content: thalamus, relay station, brain. The fact has been extracted from the experience like a butterfly from its chrysalis.

This process, called semanticization, is one of the most fundamental operations your memory system performs. It's how raw experience gets distilled into knowledge. And it's happening continuously. Every night, as your hippocampus replays the day's experiences during sleep, the cortex is extracting regularities, stripping context, and building the vast web of semantic knowledge that makes you a functional human being.

The Woman Who Lost the Meaning of Everything

The most dramatic evidence that semantic memory is a distinct brain system comes from patients who lose it.

In the early 1990s, neuropsychologists in Cambridge and Manchester began documenting a condition they called semantic dementia. Patients with this condition, caused by progressive atrophy of the anterior temporal lobes, gradually lose their knowledge of the world. Not their personal memories. Not their skills. Their knowledge.

The progression is haunting in its specificity. It typically begins with less common words and concepts. A patient might look at a picture of a harmonica and say, "I don't know what that is." Then more common concepts start to erode. A rhinoceros becomes "an animal." Then a dog becomes "a thing." Eventually, the patient can't understand what a cup is for, can't comprehend the meaning of simple words, can't recognize familiar objects.

But here's the astonishing part: their episodic memory can remain relatively intact. They can tell you what they did yesterday. They can describe their morning. They remember personal events. They just can't access the general knowledge that gives those events meaning.

Patient "E.T." (a pseudonym), studied extensively by John Hodges and Karalyn Patterson, could remember that she had gone to the store that morning but couldn't explain what a store was. She knew she had been somewhere. She could describe the layout. She just couldn't connect the experience to the concept.

This is the mirror image of what happened to patients like K.C. and H.M., who lost episodic memory while retaining semantic knowledge. Together, these cases proved that episodic and semantic memory are doubly dissociable. They can break independently. They run on different neural hardware.

The Category-Specific Deficit

Some patients with brain damage lose semantic knowledge for specific categories while retaining others. A famous case involved a patient who could identify and describe tools perfectly but could not recognize or name living things. Another patient showed the reverse pattern. These category-specific deficits suggest that semantic knowledge is organized in the brain partly by the type of sensory and motor information associated with each concept. Living things depend more on visual features. Tools depend more on functional and motor features. Damage to different cortical regions disrupts different categories.

Where Does Your Brain Keep All This Knowledge?

If episodic memory has the hippocampus as its central hub, where is the hub for semantic memory?

The answer took decades to work out, and it's elegant. Semantic memory uses a hub-and-spoke architecture.

The Spokes: Knowledge Lives Where It Was Learned

Different types of information about a concept are stored in the cortical regions that originally processed that information. Visual features of objects are stored in visual cortex regions. How objects sound is stored near auditory cortex. How objects move and how you interact with them is stored near motor and premotor cortex. These are the "spokes."

This is why thinking about a hammer activates motor cortex (because you interact with it through grasping and swinging). Thinking about a lion activates visual cortex strongly (because its appearance is its most distinctive feature). Thinking about a word's pronunciation activates language-related areas. The brain stores knowledge in the same systems that acquired it.

The Hub: The Anterior Temporal Lobe

But distributed storage creates a problem. How do you tie together the look of a dog, the sound of a bark, the feel of fur, the concept of "pet," and the word "dog" into a single, unified concept? Each of these features lives in a different cortical region. Something needs to bind them.

That something is the anterior temporal lobe (ATL), particularly the temporal pole. Research by Matthew Lambon Ralph and colleagues has established the ATL as a "transmodal hub" that integrates information across all modality-specific spokes into a coherent semantic representation.

This is why semantic dementia, which specifically attacks the ATL, destroys concepts so devastatingly. When the hub degrades, the spokes disconnect. The visual representation of a dog still exists in visual cortex. The sound of barking still exists in auditory cortex. But the connections between them are gone. The concept fragments.
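The hub-and-spoke logic above can be sketched in a few lines of code. This is a toy illustration, not a neural model: the modality stores, feature strings, and the `Hub` class are all invented for the example. The point it demonstrates is the dissociation described above: delete the hub's link and retrieval fails even though every feature still sits intact in its spoke.

```python
# Toy hub-and-spoke store: modality "spokes" hold features, and a hub
# binds them into unified concepts (the ATL's proposed role).
# All names and feature strings here are illustrative only.
SPOKES = {
    "visual": {"dog": "four-legged, furry", "lion": "maned big cat"},
    "auditory": {"dog": "barking", "lion": "roaring"},
    "verbal": {"dog": "the word 'dog'", "lion": "the word 'lion'"},
}

class Hub:
    """Binds spoke entries into coherent concepts."""
    def __init__(self, spokes):
        self.links = {}
        for modality, entries in spokes.items():
            for concept in entries:
                self.links.setdefault(concept, set()).add(modality)

    def retrieve(self, concept):
        if concept not in self.links:   # hub link degraded or missing:
            return None                 # the concept fragments
        return {m: SPOKES[m][concept] for m in sorted(self.links[concept])}

atl = Hub(SPOKES)
dog = atl.retrieve("dog")   # all three modalities, bound together

# Simulate semantic dementia: the hub link degrades, the spokes remain.
del atl.links["dog"]
# atl.retrieve("dog") now returns None, yet SPOKES still contains
# every visual, auditory, and verbal feature of "dog".
```

The design choice mirrors the clinical picture: damage to a spoke loses one kind of feature, while damage to the hub loses the concept itself.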

Brain regions and their roles in semantic memory:

Anterior temporal lobe
Role: Hub that integrates all features into unified concepts
What damage causes: Semantic dementia, a progressive loss of word and concept meaning

Visual association cortex
Role: Stores visual properties of concepts
What damage causes: Difficulty recognizing objects by sight

Motor/premotor cortex
Role: Stores action-related knowledge
What damage causes: Difficulty understanding tool use and object function

Inferior frontal gyrus
Role: Controlled semantic retrieval and selection among competing meanings
What damage causes: Difficulty retrieving specific knowledge when many options compete

Angular gyrus
Role: Integration of multimodal information and thematic associations
What damage causes: Difficulty understanding relationships between concepts

Hippocampus
Role: Initial learning of new semantic facts, before consolidation
What damage causes: Difficulty acquiring new semantic knowledge, with old knowledge preserved

The N400: Your Brain's Meaning Detector

If there's one electrical signal that defines semantic memory research, it's the N400.

Discovered by Marta Kutas and Steven Hillyard in 1980, the N400 is an event-related potential (ERP) component that peaks approximately 400 milliseconds after you encounter a meaningful stimulus, typically a word. It's a negative voltage deflection, most prominent over central and parietal electrode sites.

Here's what makes the N400 remarkable. Its amplitude is directly proportional to how surprising a word is in its context.

Read this sentence: "She spread the toast with socks."

Your brain just produced a large N400 to the word "socks." It was semantically unexpected. Your semantic memory system was anticipating something like "butter" or "jam," and when "socks" appeared, the mismatch generated a big N400.

Now read this: "She spread the toast with butter."

Small N400. Expected. Your semantic memory had already pre-activated the concept of butter before you even read it.

The N400 isn't just about sentences. It occurs for any semantic mismatch. Show someone a picture of a dog followed by the word "cat," and you get a bigger N400 than if you show the dog followed by the word "puppy." Follow a meaningful musical excerpt with a word unrelated to its meaning, and you get an N400. Present a face that doesn't match a name, and you get an N400.

This means the N400 is a direct, real-time readout of your semantic memory system in action. It tells you, 400 milliseconds after every meaningful stimulus, whether your brain's knowledge network found what it expected or was surprised. And EEG captures it beautifully.
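How would you actually measure an N400 from EEG? The standard recipe is to average many time-locked epochs so random noise cancels and the event-related potential remains, then take the mean voltage in a window around 400 ms. Here is a minimal sketch of that recipe using simulated data; the sampling rate matches the Crown's 256 Hz, but the epoch generator, noise levels, and amplitudes are invented for illustration.

```python
import math
import random

FS = 256  # sampling rate in Hz (the Crown's rate, per the article)

def simulated_epoch(expected, rng):
    """One 800 ms post-word epoch. Unexpected words get a larger
    negative deflection centered near 400 ms (a toy N400)."""
    n400_gain = 0.2 if expected else 1.0
    epoch = []
    for i in range(int(0.8 * FS)):
        t = i / FS
        # Gaussian-shaped negativity peaking at 400 ms, plus noise
        n400 = -5.0 * n400_gain * math.exp(-((t - 0.4) ** 2) / (2 * 0.05 ** 2))
        epoch.append(n400 + rng.gauss(0, 2.0))
    return epoch

def erp(epochs):
    """Average epochs sample by sample; noise cancels, the ERP remains."""
    n = len(epochs)
    return [sum(e[i] for e in epochs) / n for i in range(len(epochs[0]))]

def mean_amplitude(erp_wave, start_s, end_s):
    """Mean voltage in a window, e.g. 300-500 ms for the N400."""
    i0, i1 = int(start_s * FS), int(end_s * FS)
    window = erp_wave[i0:i1]
    return sum(window) / len(window)

rng = random.Random(0)
unexpected = erp([simulated_epoch(False, rng) for _ in range(50)])
expected = erp([simulated_epoch(True, rng) for _ in range(50)])
# The unexpected-word ERP is more negative in the 300-500 ms window:
# that amplitude difference is the N400 effect.
```

A real experiment would add filtering, artifact rejection, and baseline correction, but the core logic (time-lock, average, measure a window) is exactly this.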


How Semantic Memory Organizes Knowledge

Your brain doesn't store facts in a random pile. Semantic memory is organized, and the organizational structure has real neural consequences.

Hierarchical Categories

Concepts are arranged in hierarchies. "Animal" is a superordinate category. "Dog" is a basic-level category. "Golden retriever" is a subordinate category. When you activate a concept at one level, it spreads activation to related levels. This is why you can answer "Is a dog an animal?" faster than "Is a dog a mammal?" even though both are true. The hierarchical link between "dog" and "animal" is stronger and more frequently traversed.

Semantic Networks

In the 1960s, Allan Collins and Ross Quillian proposed that semantic memory is organized as a network, with concepts as nodes and relationships as links. Activating one node spreads activation to connected nodes, which is why thinking about "doctor" makes you faster at recognizing the word "nurse." This spreading activation is measurable with EEG. Semantically related words produce smaller N400s (less surprise), and the degree of N400 reduction tracks the strength of the semantic relationship.
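Spreading activation is easy to make concrete. Below is a toy network in the spirit of Collins and Quillian: the node names and link weights are invented for illustration, not taken from any dataset. Activation spreads outward from a source node, attenuated by link strength and a per-step decay, so closely associated concepts end up strongly pre-activated while unrelated ones stay near zero.

```python
# Toy spreading-activation network. Nodes and weights are invented.
ASSOCIATIONS = {
    "doctor": {"nurse": 0.8, "hospital": 0.7, "stethoscope": 0.5},
    "nurse": {"doctor": 0.8, "hospital": 0.6},
    "hospital": {"doctor": 0.7, "nurse": 0.6, "building": 0.4},
    "bread": {"butter": 0.9, "toast": 0.8},
    "butter": {"bread": 0.9, "toast": 0.7},
    "toast": {"bread": 0.8, "butter": 0.7},
}

def spread(source, steps=2, decay=0.5):
    """Spread activation outward from a source node. Activation falls
    off with link weight and with each step (the decay factor)."""
    activation = {source: 1.0}
    frontier = {source: 1.0}
    for _ in range(steps):
        nxt = {}
        for node, act in frontier.items():
            for neigh, weight in ASSOCIATIONS.get(node, {}).items():
                gain = act * weight * decay
                if gain > activation.get(neigh, 0.0):
                    nxt[neigh] = max(nxt.get(neigh, 0.0), gain)
        for node, act in nxt.items():
            activation[node] = max(activation.get(node, 0.0), act)
        frontier = nxt
    return activation

act = spread("doctor")
# "nurse" is strongly pre-activated by "doctor"; "toast" is not.
# A primed word like "nurse" would show a smaller N400.
```

This is a cartoon of the theory, but it captures why N400 amplitude tracks association strength: the more activation a word has already received, the less work retrieval has left to do.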

Feature-Based Representations

Concepts can also be described as bundles of features. A "bird" has features like "has wings," "can fly," "has feathers," "lays eggs." Some features are more important than others. You can remove "can fly" (penguins, ostriches) and still have a bird. But remove "has feathers" and the concept starts to break down.

The brain respects these feature weights. EEG studies show that violations of core features produce larger N400s than violations of peripheral features. Your brain knows which features matter most for each concept, and it reacts accordingly.
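Feature-based representations also lend themselves to a small sketch. The features and weights below are invented for illustration, but the mechanics show the idea: score an exemplar by the weighted share of a concept's features it possesses, so violating a core feature costs much more than violating a peripheral one.

```python
# Feature bundle with importance weights (invented for illustration).
# Core features carry more weight than peripheral ones, so violating
# them predicts a larger mismatch (and, per the article, a larger N400).
BIRD = {"has_feathers": 1.0, "lays_eggs": 0.9, "has_wings": 0.9, "can_fly": 0.4}

def feature_match(concept, observed):
    """Weighted share of a concept's features present in an exemplar."""
    total = sum(concept.values())
    matched = sum(w for f, w in concept.items() if observed.get(f, False))
    return matched / total

penguin = {"has_feathers": True, "lays_eggs": True,
           "has_wings": True, "can_fly": False}
winged_drone = {"has_feathers": False, "lays_eggs": False,
                "has_wings": True, "can_fly": True}

# A penguin violates only the peripheral feature "can_fly", so it still
# matches "bird" far better than a machine that merely has wings and flies.
```

The weighting is the whole point: with equal weights, a flying machine and a penguin would look more similar to "bird" than they should.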

The Theta Band: Semantic Memory's Working Rhythm

Beyond the N400, semantic memory operations produce characteristic oscillatory patterns that EEG tracks across broader time windows.

Theta oscillations (4-8 Hz) play a central role. During semantic retrieval, theta power increases over frontal and temporal electrode sites. This theta increase is thought to reflect the controlled search through semantic networks, the process of navigating from a cue to a target concept through the web of interconnections.

When semantic retrieval is easy (highly related cue and target), theta increases are modest. When retrieval is difficult (weakly related or remote associations), theta increases are large and sustained. This makes theta a real-time index of semantic retrieval effort.

Alpha-band desynchronization (8-13 Hz) over temporal and parietal regions also accompanies semantic retrieval. As semantic processing demands increase, alpha power decreases, reflecting increased cortical excitability in regions involved in representing and integrating meaning.

The combination of frontal theta increase and posterior alpha decrease creates a distinctive EEG profile for semantic memory engagement. It's visible in real time, and it distinguishes semantic processing from other cognitive operations like spatial reasoning or motor planning.
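Band power is the quantity behind all of these claims, and computing it is straightforward: transform a window of samples into the frequency domain and sum the power of the bins falling inside the band. Here is a minimal, stdlib-only sketch using a plain DFT on synthetic data; the 256 Hz rate matches the Crown, while the signal itself is fabricated (a strong 6 Hz theta component plus a weak 10 Hz alpha component). A real pipeline would use an FFT with Welch averaging rather than this brute-force loop.

```python
import cmath
import math
import random

FS = 256  # sampling rate in Hz (the Crown's rate, per the article)

def band_power(signal, lo_hz, hi_hz):
    """Total power in [lo_hz, hi_hz) via a plain DFT.
    Fine for short windows; use an FFT + Welch's method in practice."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * FS / n
        if lo_hz <= freq < hi_hz:
            coef = sum(x * cmath.exp(-2j * math.pi * k * i / n)
                       for i, x in enumerate(signal))
            power += abs(coef) ** 2 / n
    return power

rng = random.Random(1)
# One second of synthetic "EEG": strong 6 Hz theta, weak 10 Hz alpha
eeg = [3.0 * math.sin(2 * math.pi * 6 * i / FS)
       + 0.5 * math.sin(2 * math.pi * 10 * i / FS)
       + rng.gauss(0, 0.3)
       for i in range(FS)]

theta = band_power(eeg, 4, 8)    # dominates, by construction
alpha = band_power(eeg, 8, 13)
```

Tracking these two numbers over sliding windows, theta rising frontally while alpha falls posteriorly, is the profile the article describes for semantic engagement.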

How New Semantic Knowledge Gets Formed

Here's a question that kept memory researchers arguing for decades: if semantic memory is stored in the cortex, and the hippocampus is for episodic memory, how do you learn new facts?

The answer, it turns out, involves both systems working together across time.

When you first learn a new fact (say, "the platypus is venomous"), it enters memory as an episodic trace. You might remember reading it in this article, sitting wherever you're sitting right now. The hippocampus encodes this experience, binding the fact to its context.

Over hours, days, and weeks, something remarkable happens. Through repeated encounters and sleep-dependent consolidation, the fact gets extracted from its episodic context and integrated into the cortical semantic network. The hippocampal binding weakens. The cortical representation strengthens. Eventually, "the platypus is venomous" becomes a context-free fact, connected to your existing knowledge about platypuses, venom, and Australian wildlife, but detached from the moment of learning.

This is why patients with hippocampal damage can't learn new facts but retain old ones. The hippocampus is the entry point, but not the final storage site. It's like a loading dock: essential for receiving new shipments, but the inventory lives elsewhere.

This transition also explains a curious finding from EEG research. When you retrieve a newly learned fact (still hippocampus-dependent), the EEG pattern resembles episodic retrieval, with strong hippocampal theta and reinstatement of encoding-related activity. When you retrieve a well-established fact (cortex-dependent), the pattern shifts toward a more diffuse, theta-and-alpha-dominated profile typical of semantic processing. You can watch the memory change its neural signature as it transforms from episode to knowledge.
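The handoff from hippocampus to cortex can be caricatured in a few lines. This is a cartoon of complementary learning systems with made-up numbers, not a fitted model: a fast hippocampal trace decays night by night, while each replay strengthens a slow cortical trace toward saturation.

```python
# Minimal two-store sketch of semanticization (illustrative parameters).
def consolidate(nights, replay_gain=0.15, hippo_decay=0.8):
    """Each night, replay transfers some hippocampal trace into the
    cortical store, and the hippocampal trace fades."""
    hippo, cortex = 1.0, 0.0
    history = []
    for _ in range(nights):
        cortex += replay_gain * hippo * (1.0 - cortex)  # slow cortical learning
        hippo *= hippo_decay                            # episodic trace fades
        history.append((hippo, cortex))
    return history

trace = consolidate(30)
# Early on, retrieval is hippocampus-dependent (hippo > cortex);
# weeks later, the cortical trace dominates: the fact has outlived
# the episode that delivered it.
```

The crossover point in this toy model is the moment the article describes: the memory's neural signature shifts from episodic reinstatement to diffuse semantic retrieval.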

Semantic Memory and Language: The Partnership That Defines Human Cognition

Semantic memory and language are so deeply intertwined that it's hard to discuss one without the other. Language is the primary vehicle through which semantic knowledge is acquired, organized, and communicated. And semantic memory is what gives language its meaning.

Every word you know is a pointer into semantic memory. When you hear the word "freedom," your brain doesn't just recognize a sound pattern. It activates a vast web of associated concepts, experiences, values, and emotions. The N400 response to language reflects this activation, measuring how well each word fits into the semantic context built by the preceding words.

This is why reading comprehension isn't just about recognizing words. It's about continuously integrating each word into an evolving semantic representation. Good readers do this effortlessly. Readers with comprehension difficulties often show atypical N400 patterns, suggesting that their semantic integration process is disrupted.

The relationship goes both ways. Children who have larger vocabularies (more semantic knowledge) read better. Adults who read more build richer semantic networks. The system feeds itself: more knowledge enables more learning, which creates more knowledge.

The Resilience of Semantic Memory

Here's something reassuring. Of all the memory systems, semantic memory is the most durable.

While episodic memory starts declining in midlife, semantic memory often continues to grow well into old age. Your vocabulary is probably larger at 70 than it was at 20. Your general knowledge is broader. Your understanding of concepts is more nuanced.

This isn't just a consolation prize. It's a fundamental feature of how semantic memory works. Because semantic knowledge is distributed across the cortex in redundant, overlapping networks, it's resistant to localized damage. Losing a few neurons in visual cortex doesn't erase your knowledge of what a dog looks like, because that knowledge is represented across many neurons and multiple regions.

Episodic memory, by contrast, is bottlenecked through the hippocampus. Damage there has outsized effects. Semantic memory has no single point of failure.

This resilience has practical implications. When you meet an older person who can't remember what they had for breakfast but can discuss the intricacies of 18th-century French history, that's not a paradox. It's the predictable pattern of a brain in which episodic memory (hippocampus-dependent, vulnerable) is declining while semantic memory (cortex-distributed, resilient) holds strong.

Watching Your Knowledge Network in Real Time

The electrical signatures of semantic memory (the N400, theta oscillations, alpha desynchronization) are not subtle, hard-to-find signals. They're among the strongest and most reliable phenomena in all of cognitive neuroscience. The N400 has been replicated thousands of times across hundreds of labs over more than four decades.

The Neurosity Crown's electrode positions are well-suited for capturing these signals. Central channels (C3, C4) and centroparietal channels (CP3, CP4) sit in the scalp regions where the N400 is maximal. Frontal channels (F5, F6) capture the theta increases associated with controlled semantic retrieval. Parietal-occipital channels (PO3, PO4) detect the alpha desynchronization that accompanies deep semantic engagement.

With 256Hz sampling and on-device processing via the N3 chipset, these patterns can be tracked continuously, privately, and in real time. No gel, no lab, no wires.
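At 256 Hz, converting the canonical N400 analysis window into sample indices is simple arithmetic, and averaging the centroparietal channels where the effect is maximal is one more line. The helper functions below are a hypothetical sketch of that bookkeeping (the function names and the list-of-lists epoch format are assumptions for the example, not part of any SDK).

```python
FS = 256        # Crown sampling rate (per the article)
N_CHANNELS = 8  # Crown channel count (per the article)

def epoch_indices(fs, start_s, end_s):
    """Convert a time window (seconds after stimulus onset) into
    sample indices, e.g. the canonical 300-500 ms N400 window."""
    return int(round(start_s * fs)), int(round(end_s * fs))

def window_mean(epoch, fs=FS, start_s=0.3, end_s=0.5):
    """Mean voltage of one channel's epoch within the N400 window."""
    i0, i1 = epoch_indices(fs, start_s, end_s)
    window = epoch[i0:i1]
    return sum(window) / len(window)

def n400_estimate(epochs_by_channel, channel_ids):
    """Average the window mean over selected channels, e.g. the
    centroparietal sites where the N400 is typically largest."""
    return (sum(window_mean(epochs_by_channel[c]) for c in channel_ids)
            / len(channel_ids))

# At 256 Hz, the 300-500 ms window spans samples 77 through 127:
# 0.3 s * 256 = 76.8 and 0.5 s * 256 = 128.
```

Feed epochs from the device's raw-brainwave stream into these helpers and you have the skeleton of a single-subject N400 tracker.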

The Library That Rewrites Itself

There's a temptation to think of semantic memory as static, like an encyclopedia that gets new entries but never revises old ones. That's wrong.

Semantic memory is constantly being updated, reorganized, and enriched. When you learn that Pluto was reclassified from a planet to a dwarf planet, your semantic network doesn't just add a new fact. It restructures relationships. "Pluto" shifts categories. "Planet" changes its extension. "Solar system" gets revised.

This continuous reorganization is what makes semantic memory so powerful. It's not a database. It's a living model of the world, shaped by every new piece of information, every conversation, every article you read (including this one). Right now, your semantic memory is updating itself, integrating the concepts of hub-and-spoke architecture, the N400, and theta-band retrieval into its existing web of knowledge about memory and the brain.

The next time you encounter any of these ideas, your brain will process them a little faster. Your N400 will be a little smaller. The concept will feel a little more familiar. Not because you remembered reading this article, but because the knowledge itself has become part of what you know.

That's semantic memory doing what it does best: turning experience into understanding, and understanding into the invisible architecture of thought.

Frequently Asked Questions
What is semantic memory?
Semantic memory is the long-term memory system that stores general knowledge about the world, including facts, concepts, word meanings, and categories. Unlike episodic memory, which stores personal experiences with contextual detail, semantic memory is detached from the time and place of learning. You know that Paris is in France without remembering the specific moment you learned it. Semantic memory is distributed across the temporal, parietal, and frontal cortex.
What is the difference between semantic and episodic memory?
Semantic memory stores general facts and concepts without personal context (knowing that dogs are mammals). Episodic memory stores specific personal experiences with time and place information (remembering the day you got your first dog). Semantic memory involves noetic consciousness (simply knowing), while episodic memory involves autonoetic consciousness (mentally reliving). Semantic memory is more resilient to aging and hippocampal damage than episodic memory.
What is the N400 EEG response?
The N400 is an event-related potential (ERP) component that peaks approximately 400 milliseconds after encountering a word or concept. It is larger (more negative) for semantically unexpected or unrelated items and smaller for expected or related items. For example, 'socks' after 'She spread the toast with' produces a large N400, while 'butter' produces a small one. The N400 is one of the strongest and most replicable ERP components in cognitive neuroscience and is considered a direct index of semantic processing effort.
Where is semantic memory stored in the brain?
Semantic memory is distributed across multiple brain regions. The anterior temporal lobe, particularly the temporal pole, acts as a convergence zone or hub that integrates information from modality-specific cortical areas. Visual knowledge activates visual cortex regions, action knowledge activates motor cortex regions, and auditory knowledge activates auditory cortex regions. The hippocampus plays a role in initially forming semantic memories but becomes less important as knowledge is consolidated into cortical networks.
Can you lose semantic memory?
Yes. Semantic dementia, a form of frontotemporal dementia affecting the anterior temporal lobes, progressively erodes semantic knowledge. Patients gradually lose the meaning of words and concepts, often starting with less common items and progressing to everyday objects. They might look at a zebra and call it a horse, or lose the ability to understand what a fork is for. Semantic memory can also be impaired by stroke, herpes simplex encephalitis, and Alzheimer's disease in later stages.
Does semantic memory decline with age?
Semantic memory is relatively preserved with normal aging compared to episodic memory. Vocabulary size and general knowledge often continue to increase into the 60s and 70s. However, retrieval speed slows, and tip-of-the-tongue experiences become more common. The knowledge itself remains intact, but accessing it takes longer. This pattern contrasts with episodic memory, which begins declining in the 40s and 50s.
Copyright © 2026 Neurosity, Inc. All rights reserved.