The Best Crown Projects Developers Are Building

By AJ Keller, CEO at Neurosity  •  February 2026
The Neurosity Crown community is producing brain-controlled smart homes, AI assistants that adapt to your mental state, neurofeedback protocols, generative art, and tools that would've been science fiction five years ago.
The best evidence that a computing platform matters isn't what the company builds. It's what strangers build on it without being asked. The Crown developer community is turning raw brainwave data into productivity tools, accessibility interfaces, meditation systems, and creative experiments that push the boundary of what's possible with a non-invasive BCI.

The Best Proof of a Platform Is What People Build on It Without Being Asked

You can tell a lot about a computing platform by reading its spec sheet. Channels, sample rate, SDK documentation, hardware design. All important.

But you can tell everything about a platform by looking at what happens after the spec sheet gets into the hands of people who weren't on the engineering team. People who didn't attend the design meetings. People who took the hardware home, read the docs, opened a code editor, and thought: "What if I tried something nobody intended?"

This is the moment that separates a product from a platform. The iPhone became the iPhone not when Apple shipped it, but when a solo developer in a basement built an app that made Steve Jobs say "we didn't think of that." Arduino became Arduino when artists and biologists and architects started wiring it into things the Italian engineers never imagined. Linux became Linux when Linus Torvalds stopped being able to keep track of what people were doing with his kernel.

The Neurosity Crown is reaching that moment right now.

Developers are taking an 8-channel EEG headset with open JavaScript and Python SDKs and building things that range from "obviously useful" to "wait, you can do that with brain data?" Productivity tools that know when you're actually focused. AI assistants that read your cognitive state before deciding how to help. Smart homes that respond to your mental state instead of voice commands. Art installations that paint with brainwaves. Accessibility tools for people who can't use traditional input devices.

None of these were on a product roadmap. They emerged from a community of builders who saw raw brain data flowing through a WebSocket and thought: "I know exactly what to do with this."

Here's what they're building.

The Crown Developer Ecosystem: Why This Platform Breeds Builders

Before we look at specific project categories, it's worth understanding why the Crown has attracted the developer community it has. It's not accidental.

Most EEG devices on the market treat developers as an afterthought. The hardware ships with a closed app. Maybe there's a research API buried in documentation somewhere. Maybe you can export CSV files after a session. Good luck building anything real-time with that.

The Crown took the opposite approach. From the start, Neurosity designed the device as a development platform that happens to also ship with consumer applications. The JavaScript SDK gives you real-time access to everything: raw EEG data at 256Hz across all 8 channels, frequency band power (delta, theta, alpha, beta, gamma), focus and calm scores computed by on-device machine learning, kinesis (trainable mental commands), accelerometer data, and signal quality metrics. All of it streaming through a clean, well-documented API.

The Python SDK opens the same data to the data science and research community. BrainFlow and Lab Streaming Layer (LSL) integrations mean researchers who already have analysis pipelines can plug the Crown in without rewriting anything.

And then there's MCP.

The Model Context Protocol integration is arguably the most forward-looking feature in the entire ecosystem. It lets AI tools like Claude and ChatGPT query your brain state in real time. Your focus level, your calm score, your raw frequency bands, all available to an AI assistant that can then adapt its behavior accordingly. This isn't a theoretical capability. Developers are building with it right now.

Add a Discord community where developers help each other debug WebSocket connections at midnight, a GitHub organization full of starter templates and example projects, and a developer program that actively supports builders, and you get an ecosystem where someone can go from "I just unboxed this thing" to "I have a working brain-powered app" in a single afternoon.

That's the soil. Here's what's growing in it.

The Crown Developer Stack

Hardware: 8-channel EEG, 256Hz sample rate, N3 on-device chipset, Bluetooth connectivity

SDKs: JavaScript (Node.js, browser, React Native), Python

Data Streams: Raw EEG, frequency bands, focus score, calm score, kinesis, accelerometer, signal quality

Integrations: BrainFlow, Lab Streaming Layer, MCP (Model Context Protocol) for AI tools

Community: Discord server, GitHub organization, developer program, starter templates

Productivity Dashboards: Your Brain's Performance Review

The most immediately practical category of Crown projects is the productivity dashboard. The concept is simple but powerful: instead of guessing whether you're in a productive flow state or just staring at your screen with your jaw slack, you have real data.

Developers are building dashboards that subscribe to the Crown's focus and calm score streams and visualize them over time. Think of it like a heart rate monitor for your attention. You can see when your focus peaks, when it crashes, how long your productive windows actually last (spoiler: they're shorter than you think), and what patterns emerge across days and weeks.

The technical approach is straightforward. The Crown SDK exposes a focus() observable that emits a score between 0 and 1 at regular intervals. Same for calm(). A developer subscribes to these streams, stores the data points with timestamps, and renders them in a web dashboard using something like D3.js, Chart.js, or even a simple React app with SVG charts.

But the interesting projects go further. Some developers are correlating brain data with external signals. What's your focus score when you have Slack open versus when it's closed? How does your calm score change after your third cup of coffee? What happens to your brainwave patterns during a meeting versus during solo deep work?

Building Your Own Focus Dashboard

The fastest way to build a productivity dashboard is with the Crown's JavaScript SDK and a simple React app. Subscribe to the focus() and calm() streams, store the data in a local database like IndexedDB, and use any charting library to visualize trends. The entire MVP can be built in under 200 lines of code. The SDK handles all the signal processing on-device, so you get clean, normalized scores without touching any raw EEG math.
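
Here's a minimal sketch of that MVP's data-collection core, assuming an SDK-style login and that the focus() and calm() payloads carry a 0-to-1 probability field as described above. The device ID and credentials are placeholders:

```js
// Minimal focus/calm logger: collects chart-ready samples in memory.
import { Neurosity } from "@neurosity/sdk";

const neurosity = new Neurosity({ deviceId: process.env.DEVICE_ID });
const samples = []; // { metric, score, timestamp } objects for the chart layer

async function main() {
  // Placeholder credentials; a real app should load these from env or a vault.
  await neurosity.login({
    email: process.env.EMAIL,
    password: process.env.PASSWORD
  });

  for (const metric of ["focus", "calm"]) {
    neurosity[metric]().subscribe(({ probability, timestamp }) => {
      samples.push({ metric, score: probability, timestamp });
      console.log(`${metric}: ${probability.toFixed(2)}`);
    });
  }
  // Next step: flush `samples` to IndexedDB and feed the series to Chart.js.
}

main().catch(console.error);
```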

The "I had no idea" insight that keeps emerging from these dashboards: people's subjective sense of their own focus is wildly inaccurate. Developers report discovering that their perceived "most productive hours" are often not their actual best focus periods. The data shows patterns the conscious mind simply can't detect. That gap between perceived and actual cognitive performance is, by itself, a reason to build one of these dashboards.

Neurofeedback Training Protocols: Teaching Your Brain to Perform

If productivity dashboards are about observation, neurofeedback projects are about intervention.

Neurofeedback is conceptually elegant. You measure a brain signal, you present that measurement back to the user as audio or visual feedback, and the brain gradually learns to modify its own activity. It's biofeedback for your neurons. The research literature on neurofeedback goes back decades, with studies showing effects on attention, anxiety, sleep quality, and even creative performance.

Crown developers are building neurofeedback protocols that target specific frequency bands. The most common: alpha/theta training for relaxation and creative states, beta uptraining for sustained attention, and SMR (sensorimotor rhythm) training for calm alertness.

The technical stack typically involves subscribing to the Crown's brainwaves() stream at the raw or frequency-band level, computing power values for the target band in real time, and translating those values into feedback. That feedback could be a tone that gets louder when your alpha power increases, a visualization that gets brighter when your beta is in range, or even a game character that moves faster when your brain hits the target state.
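
As a concrete illustration of the audio-feedback variant, here's a browser sketch that raises a drone's volume as average alpha power climbs. The band-power payload shape and the scaling ceiling are assumptions to verify against the SDK docs:

```js
// Alpha uptraining sketch: a tone gets louder as average alpha power rises.
// Assumes the powerByBand payload carries one value per channel per band.
import { Neurosity } from "@neurosity/sdk";

const neurosity = new Neurosity({ deviceId: "YOUR_DEVICE_ID" });
const ctx = new AudioContext(); // browsers require a user gesture to unmute
const osc = ctx.createOscillator();
const gain = ctx.createGain();
osc.frequency.value = 220; // calm, low drone
gain.gain.value = 0;
osc.connect(gain).connect(ctx.destination);
osc.start();

const mean = (xs) => xs.reduce((a, b) => a + b, 0) / xs.length;

await neurosity.login({ email: "YOUR_EMAIL", password: "YOUR_PASSWORD" });
neurosity.brainwaves("powerByBand").subscribe(({ data }) => {
  const alpha = mean(data.alpha);        // average across all 8 channels
  const level = Math.min(alpha / 20, 1); // 20 is a guessed ceiling; tune it
  gain.gain.linearRampToValueAtTime(level, ctx.currentTime + 0.25);
});
```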

What makes the Crown particularly suited for this is the 8-channel coverage. With electrodes at CP3, C3, F5, PO3, PO4, F6, C4, and CP4, you've got coverage spanning frontal, central, and parieto-occipital sites. That means you can build protocols that target frontal attention (F5, F6), parietal relaxation (PO3, PO4), or motor cortex regulation (C3, C4, CP3, CP4) depending on what you're training.

Some developers are building protocols inspired by published research papers and adapting them for home use. Others are experimenting with entirely novel approaches. One common pattern: adaptive difficulty, where the neurofeedback threshold adjusts based on the user's recent performance so the training stays challenging but achievable.

Brain-Controlled Smart Homes: Your House, Your Mental State

This is the category that makes people's eyes go wide.

The idea: instead of telling your smart home what to do with voice commands or phone apps, it responds to your cognitive state automatically. Your Crown detects that you're entering a deep focus state, and the lights dim, the thermostat adjusts, and your notification-capable devices go silent. You start meditating, and the ambient lighting shifts to warm tones. Your stress levels spike during a work call, and the system queues calming music for when the call ends.

The technical approach combines the Crown SDK with smart home platforms. Developers are bridging brainwave data to Home Assistant, IFTTT, MQTT brokers, and native smart home APIs. The Crown's JavaScript SDK runs on Node.js, which means a Raspberry Pi can serve as the bridge between your brain and your home. Subscribe to focus and calm scores, set thresholds, and trigger automations.

The more sophisticated implementations don't just use simple thresholds. They track state transitions. There's a big difference between "focus score dropped below 0.3" and "focus score has been declining for the last 15 minutes." The first might be a momentary distraction. The second suggests you need a break, and your smart home could respond accordingly.
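
Here's a sketch of that trend-aware pattern, bridging to an MQTT broker from Node.js. The broker address, topic name, thresholds, and window size are illustrative choices, not Neurosity conventions:

```js
// Trend-aware bridge: distinguishes a momentary dip from a sustained decline
// before telling Home Assistant (via MQTT) to suggest a break.
import { Neurosity } from "@neurosity/sdk";
import mqtt from "mqtt";

const neurosity = new Neurosity({ deviceId: process.env.DEVICE_ID });
const client = mqtt.connect("mqtt://homeassistant.local");
const WINDOW_MS = 15 * 60 * 1000;
const history = []; // [timestamp, score] pairs inside the rolling window

await neurosity.login({ email: process.env.EMAIL, password: process.env.PASSWORD });

neurosity.focus().subscribe(({ probability, timestamp }) => {
  history.push([timestamp, probability]);
  while (history.length && history[0][0] < timestamp - WINDOW_MS) history.shift();

  const avg = (xs) => xs.reduce((a, [, s]) => a + s, 0) / xs.length;
  const early = history.slice(0, 5);
  const recent = history.slice(-5);

  // Fire only on a sustained decline across the window, not one low reading.
  if (history.length > 20 && avg(recent) < avg(early) - 0.2) {
    client.publish("brain/focus/declining", JSON.stringify({ timestamp }));
  }
});
```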


AI Assistants That Read Your Brain State: The MCP Projects

This is the category that feels most like the future arriving early.

The Neurosity MCP server exposes your brain state data to AI tools that support the Model Context Protocol. In practice, this means you can have a conversation with Claude or ChatGPT, and the AI knows whether you're focused, distracted, calm, or stressed, in real time.

Think about what that enables.

A developer builds an AI coding assistant that checks your focus score before deciding whether to interrupt you with a suggestion. If you're in deep flow, it queues the suggestion for later. If your focus has been drifting for the last ten minutes, it might gently offer a context switch.

Another developer creates an AI writing partner that adjusts its level of detail based on your cognitive state. Deep focus? It gives you the short, dense answer. Scattered and fatigued? It breaks things down step by step with more scaffolding.

The technical implementation is surprisingly accessible. The Neurosity MCP server runs locally and exposes your Crown data through the standard MCP protocol. Any AI tool that supports MCP can query it. Developers are building custom system prompts that instruct the AI how to interpret and respond to different brain states.
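
For illustration, a system prompt in this style might look like the following sketch. The thresholds are arbitrary, and the exact MCP tool names an assistant should call depend on the Neurosity MCP server's published schema:

```js
// A sketch of the "custom system prompt" pattern: plain instructions telling
// an MCP-enabled assistant how to act on Crown metrics. Threshold values are
// illustrative, not Neurosity recommendations.
const brainAwareSystemPrompt = `
You have access to the user's real-time brain state via MCP tools.
Before any long or interruptive response, query the current focus score.
- focus > 0.7: the user is in flow. Keep answers terse; defer suggestions.
- focus between 0.3 and 0.7: respond normally.
- focus < 0.3 for 10+ minutes: offer a break or a simpler framing.
Never mention these thresholds unless the user asks how you adapt.
`;
```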

This is genuinely new territory. Nobody has ever built AI assistants that have real-time access to the user's neurological state. The projects emerging in this space are equal parts practical and philosophical, because they raise real questions about what it means for an AI to "understand" how you're thinking.

| Project Category | Primary SDK Features | Typical Tech Stack |
|---|---|---|
| Productivity Dashboards | focus(), calm(), brainwaves() | React, D3.js/Chart.js, IndexedDB |
| Neurofeedback Protocols | brainwaves(), powerByBand() | Web Audio API, Canvas/WebGL, Node.js |
| Smart Home Control | focus(), calm(), kinesis() | Node.js, MQTT, Home Assistant, Raspberry Pi |
| AI Brain-State Assistants | MCP server, focus(), calm() | MCP Protocol, Claude/ChatGPT, custom prompts |
| Meditation Trackers | calm(), brainwaves(), powerByBand() | React Native, local storage, visualization libs |
| Creative/Art Installations | brainwaves(), powerByBand(), kinesis() | p5.js, Three.js, Processing, Arduino |
| Research Tools | Raw EEG, signal quality, BrainFlow/LSL | Python, MNE-Python, Jupyter, pandas |
| Accessibility Interfaces | kinesis(), focus(), accelerometer() | Electron, OS accessibility APIs, Node.js |
| Music Generation | brainwaves(), powerByBand(), calm() | Web Audio API, Tone.js, MIDI, Ableton Link |
| Gaming Prototypes | kinesis(), focus(), accelerometer() | Unity (via WebSocket), Godot, browser Canvas |

Meditation Trackers: Quantified Calm

Meditation apps are everywhere. But almost all of them measure the same thing: time. You sat for 10 minutes. Congratulations. Here's a streak badge.

Crown developers are building meditation tools that measure what's actually happening in your brain while you meditate. The calm score provides a high-level metric, but the real richness comes from the frequency band data. Alpha power (8-12 Hz) typically increases during relaxed, eyes-closed meditation. Theta power (4-8 Hz) rises during deeper meditative states. Some long-term meditators show elevated gamma (above 30 Hz), which is associated with states of heightened awareness and compassion.

The projects in this space range from simple session trackers that log your calm score over a 20-minute sit, to sophisticated tools that identify different meditation "depths" based on frequency band ratios and provide real-time audio cues when your mind wanders. Some developers are building React Native mobile apps so the tracking happens on a phone while the Crown does the sensing.

One particularly interesting pattern: developers building meditation tools that use the Crown's data to determine the optimal meditation duration for a specific person on a specific day. Instead of a fixed 10-minute timer, the app detects when your alpha power starts declining (suggesting your meditation is losing its effectiveness) and suggests ending the session. It's personalized meditation timing based on your actual neural activity.
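
One way that adaptive-duration idea could be sketched: smooth the average alpha power, track its peak, and flag a sustained fade. The smoothing constant, decline ratio, and three-minute window below are tuning assumptions, not published values:

```js
// Adaptive session-length sketch: suggest ending the sit once smoothed alpha
// power has faded well below its session peak for several minutes.
import { Neurosity } from "@neurosity/sdk";

const neurosity = new Neurosity({ deviceId: "YOUR_DEVICE_ID" });
let smoothed = null;
let peak = 0;
let declineStart = null;

await neurosity.login({ email: "YOUR_EMAIL", password: "YOUR_PASSWORD" });
neurosity.brainwaves("powerByBand").subscribe(({ data }) => {
  const alpha = data.alpha.reduce((a, b) => a + b, 0) / data.alpha.length;
  smoothed = smoothed === null ? alpha : 0.95 * smoothed + 0.05 * alpha; // EMA
  peak = Math.max(peak, smoothed);

  if (smoothed < peak * 0.8) {
    declineStart ??= Date.now();
    if (Date.now() - declineStart > 3 * 60 * 1000) {
      console.log("Alpha has faded for 3 minutes: consider ending the session.");
    }
  } else {
    declineStart = null; // alpha recovered; reset the decline timer
  }
});
```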

Creative and Art Installations: Painting With Brain Waves

Artists and creative coders have seized on the Crown as an instrument for generative work. The concept: use brain data as a real-time input to creative algorithms, producing visual art, light installations, or interactive experiences that are shaped by the artist's (or audience's) cognitive state.

The most common approach uses the powerByBand() stream, which provides real-time power values for each frequency band across all 8 channels. Map alpha power to color hue, beta to brightness, theta to movement speed, gamma to particle density, and you get a visual output that is, quite literally, a picture of someone's brain state.

Developers working with p5.js and Three.js have built browser-based installations where visitors put on a Crown and watch abstract visuals evolve in real time based on their brainwaves. The visual becomes a kind of cognitive mirror. Relax and the imagery shifts to flowing, warm patterns. Concentrate and it becomes sharp and geometric. The audience isn't just viewing art. They're generating it with their nervous system.
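
A minimal p5.js rendition of that mapping might look like this. The scaling factors that convert band power into hue, brightness, speed, and density are guesses to calibrate per wearer:

```js
// p5.js sketch of the mapping above: alpha -> hue, beta -> brightness,
// theta -> drift speed, gamma -> particle count.
import { Neurosity } from "@neurosity/sdk";

const neurosity = new Neurosity({ deviceId: "YOUR_DEVICE_ID" });
const mean = (xs) => xs.reduce((a, b) => a + b, 0) / xs.length;
let bands = { alpha: 1, beta: 1, theta: 1, gamma: 1 };

neurosity.login({ email: "YOUR_EMAIL", password: "YOUR_PASSWORD" }).then(() => {
  neurosity.brainwaves("powerByBand").subscribe(({ data }) => {
    bands = {
      alpha: mean(data.alpha), beta: mean(data.beta),
      theta: mean(data.theta), gamma: mean(data.gamma)
    };
  });
});

function setup() {
  createCanvas(800, 800);
  colorMode(HSB, 360, 100, 100, 100);
}

function draw() {
  background(0, 0, 0, 10); // low-alpha fill leaves motion trails
  const n = Math.min(Math.floor(bands.gamma * 10), 200); // gamma -> density
  for (let i = 0; i < n; i++) {
    const t = frameCount * bands.theta * 0.01 + i;        // theta -> speed
    fill((bands.alpha * 30) % 360, 80, Math.min(bands.beta * 10, 100));
    circle(width / 2 + 300 * Math.cos(t), height / 2 + 300 * Math.sin(t * 0.7), 12);
  }
}
```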

Other projects use the Crown's kinesis feature, which lets users train custom mental commands. An artist trains two kinesis states, say "push" and "pull," and uses them to navigate through a 3D virtual environment. The navigation isn't controlled by a mouse or keyboard. It's controlled by trained thought patterns.

Some of the most ambitious installations combine the Crown with Arduino or Raspberry Pi hardware to control physical objects. Lights that pulse in sync with alpha brainwaves. Motors that spin faster as focus increases. Sculptures that move in response to the collective brain state of multiple audience members wearing Crowns simultaneously.

Research Tools and Data Pipelines: Science Without the Lab

Academic researchers and independent scientists are building data collection and analysis tools that turn the Crown into a legitimate research instrument. The BrainFlow and Lab Streaming Layer integrations are critical here, because they allow the Crown to slot into existing research workflows without custom plumbing.

Developers are building tools for event-related potential (ERP) studies, where you need to synchronize brain data with precisely timed stimuli. The Crown's marker API lets you tag moments in the EEG stream with event codes, then analyze the brain's response to those events offline.
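
Here's a sketch of that tagging pattern. The addMarker method name follows the SDK's marker API, but verify the exact signature against the current reference; showStimulus is a hypothetical display function standing in for your experiment code:

```js
// ERP-style tagging sketch: present a stimulus on a fixed schedule and tag
// the exact moment in the EEG stream so epochs can be cut around it offline.
import { Neurosity } from "@neurosity/sdk";

const neurosity = new Neurosity({ deviceId: "YOUR_DEVICE_ID" });
await neurosity.login({ email: "YOUR_EMAIL", password: "YOUR_PASSWORD" });

setInterval(() => {
  showStimulus();                        // hypothetical: flash the stimulus
  neurosity.addMarker("stimulus-onset"); // event code embedded in the stream
}, 2000);
```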

Python-based pipelines using MNE-Python (the standard open-source EEG analysis library) are emerging that automate the full workflow: connect to the Crown, record a session with event markers, preprocess the data (filtering, artifact rejection), compute ERPs or frequency analyses, and generate publication-ready figures. What used to require a $50,000 EEG system and a dedicated lab technician can now run on a laptop.

The signal quality isn't equivalent to a clinical-grade system with 64 or 128 channels. But for many research questions, especially those involving attention, arousal, meditation, and frequency-band power across cortical regions, 8 channels at 256Hz is more than sufficient. And the accessibility means researchers can run studies with larger sample sizes in naturalistic settings instead of artificial lab environments.

Accessibility Interfaces: BCI as It Was Always Meant to Be

This is the category that connects most directly to why brain-computer interfaces were originally invented.

The entire history of BCI research began with a goal: giving people who can't use their bodies to communicate a way to interact with computers using only their brain. Crown developers are carrying that mission forward.

The kinesis feature is central here. Users can train custom mental commands that the Crown recognizes. A person with limited mobility can train a "select" command and use it to navigate an interface. Combined with a scanning system that highlights options sequentially, this becomes a complete input method that requires zero physical movement.
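
A minimal scanning-selection loop might look like the sketch below, assuming the user has trained a kinesis command labeled "select" in the Neurosity app:

```js
// Scanning-selection sketch: options are highlighted in turn, and the trained
// kinesis command confirms whichever one is currently highlighted.
import { Neurosity } from "@neurosity/sdk";

const neurosity = new Neurosity({ deviceId: "YOUR_DEVICE_ID" });
const options = ["Yes", "No", "Help", "Water"];
let index = 0;

await neurosity.login({ email: "YOUR_EMAIL", password: "YOUR_PASSWORD" });

// Advance the highlight every 1.5s: slow enough to form the mental command.
setInterval(() => {
  index = (index + 1) % options.length;
  console.log(`Highlighted: ${options[index]}`);
}, 1500);

// Each kinesis emission means the trained "select" pattern was detected.
neurosity.kinesis("select").subscribe(() => {
  console.log(`Selected: ${options[index]}`);
});
```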

Other accessibility projects use focus and calm scores as control signals. A sustained focus state above a threshold triggers an action. A calm state below a threshold sends an alert to a caretaker. These are simple concepts with profound implications for people whose traditional input options are limited.

The JavaScript SDK's cross-platform nature matters enormously here. Developers are building Electron apps that integrate with operating system accessibility APIs, which means Crown-based input can control any application on the computer, not just custom-built BCI software.

Music Generation: Composing With Your Cortex

There's something poetic about using the brain's own electrical rhythms to create music. The brain already operates in frequencies. Delta, theta, alpha, beta, gamma. These aren't metaphorical. They're oscillations with specific hertz values that map naturally to musical parameters.

Crown developers are building instruments that translate brainwave data into sound in real time. The Web Audio API and libraries like Tone.js make this accessible from the browser. Map your alpha frequency to a synth pad. Use theta power to control a reverb's wet/dry mix. Let gamma bursts trigger percussive hits.
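
Here's a rough Tone.js sketch of two of those mappings, theta power to a reverb's wet mix and gamma bursts to drum hits; the normalization constants are guesses to tune per wearer:

```js
// Tone.js sketch: a sustained pad shaped by brain data. Theta controls the
// reverb wet/dry mix; a sudden jump in gamma triggers a percussive hit.
import { Neurosity } from "@neurosity/sdk";
import * as Tone from "tone";

const neurosity = new Neurosity({ deviceId: "YOUR_DEVICE_ID" });
const reverb = new Tone.Reverb({ decay: 4 }).toDestination();
const pad = new Tone.Synth({ oscillator: { type: "sine" } }).connect(reverb);
const drum = new Tone.MembraneSynth().toDestination();
const mean = (xs) => xs.reduce((a, b) => a + b, 0) / xs.length;
let lastGamma = 0;

await Tone.start(); // browsers require starting audio from a user gesture
await neurosity.login({ email: "YOUR_EMAIL", password: "YOUR_PASSWORD" });
pad.triggerAttack("A2"); // hold the pad; the brain data shapes its space

neurosity.brainwaves("powerByBand").subscribe(({ data }) => {
  reverb.wet.value = Math.min(mean(data.theta) / 10, 1); // theta -> wet/dry
  const gamma = mean(data.gamma);
  if (gamma > lastGamma * 1.5) drum.triggerAttackRelease("C2", "16n"); // burst
  lastGamma = gamma;
});
```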

Some projects go further, using the Crown's data not as a direct sound generator but as a conductor. The brain state influences a generative music algorithm's parameters, so the composition evolves based on how you're feeling rather than through direct note-by-note control. The result is music that's partially composed by your neural activity and partially by the algorithm, a genuine collaboration between human brain and machine.

MIDI output is another common approach. Developers route Crown data through a MIDI converter and into Ableton Live, Logic Pro, or any other digital audio workstation. This means every existing software synthesizer and effect becomes brain-controllable. The Crown essentially becomes a new MIDI controller, except the input isn't your fingers on keys. It's the electrical activity of your cortex.

Gaming Prototypes: Play With Your Mind

The gaming category is still early, but the prototypes are tantalizing.

The most common approach uses the kinesis feature to create a mental command vocabulary for game control. Train "push" and "pull" as two mental commands, and you can play a simple game where concentration moves your character forward and relaxation brings them back. Add focus score as a modifier (higher focus means faster movement), and you have a game that rewards mental discipline rather than button-mashing reflexes.
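
A stripped-down version of that push/pull prototype might look like this, with console output standing in for a Canvas render. The command labels must match whatever the player actually trained:

```js
// Push/pull game sketch: trained kinesis commands steer, focus scales speed.
import { Neurosity } from "@neurosity/sdk";

const neurosity = new Neurosity({ deviceId: "YOUR_DEVICE_ID" });
let position = 0;
let direction = 0; // +1 forward, -1 back
let speed = 1;     // scaled by the live focus score

await neurosity.login({ email: "YOUR_EMAIL", password: "YOUR_PASSWORD" });
neurosity.kinesis("push").subscribe(() => (direction = 1));
neurosity.kinesis("pull").subscribe(() => (direction = -1));
neurosity.focus().subscribe(({ probability }) => (speed = 1 + 2 * probability));

// Game tick: in a real build this would drive a Canvas or engine render.
setInterval(() => {
  position += direction * speed;
  console.log(`x = ${position.toFixed(1)}`);
}, 100);
```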

Browser-based games using the Canvas API are the most common platform, because the JavaScript SDK makes them trivially easy to connect. But some developers are building WebSocket bridges to Unity and Godot, enabling brain data to flow into full game engines. That opens up 3D environments, physics simulations, and multiplayer experiences.

The most interesting gaming experiments combine traditional input with brain data. A standard game controlled with keyboard and mouse, but with brain-state overlays. Your character's abilities strengthen when your focus is high. The game world becomes more chaotic when your stress levels rise. It's not replacing traditional control. It's adding a new dimension to it.

The Developer Community Advantage

One thing that sets Crown gaming projects apart from other BCI gaming experiments: because the SDK is JavaScript-native, every web developer is already a potential BCI game developer. There's no learning a new language or wrestling with unfamiliar toolchains. If you can build a web app, you can build a brain-controlled web app. The community on Discord is full of developers who've already solved the common integration challenges and are happy to share what they've learned.

Getting Started: Building Your First Crown Project

If you've read this far and you're thinking "I want to build something," here's the path.

Step 1: Get the hardware. You need a Neurosity Crown. There's no way around this. You can study the SDK docs without one, but you can't test anything meaningful without actual brain data.

Step 2: Set up the SDK. Install @neurosity/sdk via npm for JavaScript projects, or the Python package for Python projects. The developer documentation walks through authentication and device pairing.

Step 3: Start with the simplest possible thing. Subscribe to focus(). Log the values to the console. Watch your own focus score change in real time as you concentrate and relax. This takes about 10 lines of code, and it's the moment where brain-computer interaction stops being abstract and becomes visceral.
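
Those ten-or-so lines look roughly like this, with placeholders for your device ID and account credentials:

```js
// Your first brain-powered script: log your live focus score to the console.
import { Neurosity } from "@neurosity/sdk";

const neurosity = new Neurosity({ deviceId: "YOUR_DEVICE_ID" });

await neurosity.login({ email: "YOUR_EMAIL", password: "YOUR_PASSWORD" });

neurosity.focus().subscribe(({ probability }) => {
  console.log(`focus: ${probability.toFixed(2)}`);
});
```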

Step 4: Pick a direction. Look at the project categories above. Which one makes you lean forward? Build the smallest possible version of that. A single web page that changes color based on your calm score. A Node.js script that sends a notification when your focus drops. A p5.js sketch that draws based on your brainwaves. Start small. Ship something.

Step 5: Join the community. The Neurosity Discord is where developers share projects, debug issues, and inspire each other. The fastest way to level up your BCI development skills is to see what other people have built and ask them how they did it.

Step 6: Share what you build. Open-source your project on GitHub. Write about what you learned. Show a demo on Discord. Every project you share makes the ecosystem richer and helps the next developer who has the same idea.

The Projects That Haven't Been Built Yet

Here's what's remarkable about the current state of Crown community projects: we're still in the first chapter.

Think about what the iPhone app ecosystem looked like one year after the App Store launched. Most apps were flashlights and fart sound generators. Nobody had built Uber yet. Nobody had built Instagram. Nobody had imagined that a phone with a GPS chip and a camera would become the foundation for entirely new categories of human activity.

The Crown developer community is at a similar inflection point. The projects that exist today are impressive, clever, sometimes beautiful. But the brain generates an astonishing volume of data. Eight channels at 256Hz means 2,048 data points per second, every second, continuously, from the most complex object in the known universe. We are nowhere near exhausting what's possible with that data stream.

Somewhere, a developer is going to figure out that brain data combined with some other signal (calendar data, time of day, ambient noise, heart rate) produces a combined insight that neither signal could generate alone. Somewhere, someone is going to build a Crown application that becomes a daily-use tool for millions of people, the way nobody predicted that a phone accelerometer would become the foundation for fitness tracking.

The SDK is open. The data is streaming. The community is active and generous.

The question isn't whether someone will build the app that makes brain-computer interaction feel as natural as touching a screen. The question is whether it'll be you.

Frequently Asked Questions
What can I build with the Neurosity Crown?
The Crown's JavaScript and Python SDKs give you access to raw EEG at 256Hz, frequency band power, focus and calm scores, kinesis (mental command) training, and accelerometer data. Developers have built productivity dashboards, neurofeedback training apps, brain-controlled smart home systems, AI assistants that adapt to cognitive state via MCP, meditation trackers, generative art installations, music generators, gaming prototypes, accessibility interfaces, and research data collection tools.
What programming languages work with the Neurosity Crown?
The Crown has official SDKs for JavaScript (supporting Node.js, browser, and React Native) and Python. The JavaScript SDK is the most mature, with full access to all data streams, device management, and real-time subscriptions. The Crown also integrates with BrainFlow and Lab Streaming Layer (LSL) for researchers who prefer those ecosystems.
How does the Neurosity MCP integration work with AI?
The Neurosity MCP (Model Context Protocol) server lets AI tools like Claude and ChatGPT access your real-time brain state data. Once configured, AI assistants can query your focus level, calm score, raw frequency data, and other cognitive metrics. This enables AI workflows that adapt their behavior based on whether you're deeply focused, distracted, calm, or stressed.
Do I need programming experience to build Crown projects?
Basic JavaScript or Python knowledge is enough to get started. The Neurosity SDK handles the complex signal processing on-device through the N3 chipset, so you don't need a neuroscience background. The developer documentation, starter templates on GitHub, and the active Discord community make it possible to go from unboxing to running your first brain-powered app in an afternoon.
Where can I find other Neurosity Crown developers?
The Neurosity Discord server is the primary hub for developer discussion, project sharing, and troubleshooting. The Neurosity GitHub organization hosts SDK repositories, example projects, and starter templates. Many developers also share their projects on Twitter/X and personal blogs.
Is the Neurosity SDK free to use for community projects?
Yes. The Neurosity SDK is MIT-licensed, which means you can use it in personal projects, open-source repositories, commercial applications, or academic research without licensing fees. You need a Neurosity Crown device to access real brain data, but the SDK itself is completely free and open.