Mind over machine: Controlling robots with thoughts alone

Imagine being able to control a walking robot—not with your hands, not with your voice, but with your brainwaves alone.
No screens, no joysticks—just your mental focus.
Picture the robot bringing you water, opening a door, or fetching your slippers. While that sounds futuristic, it’s not science fiction.
Robots are already transforming workplaces such as warehouses and military zones. But there’s a growing interest in how they might help individuals at home.
Most existing systems rely on touch or voice commands. In a pioneering new study, researchers built and tested a brain-computer interface and a mobile assistant robot that can be controlled wirelessly by human brainwaves alone.
This technology marks a major leap toward real-world robots that assist people with limited mobility or severe physical impairments - and beyond!
What is a Brain-Computer Interface?
A brain-computer interface is a technology that decodes brain signals and translates them into control commands for external devices like computers, robotic arms, or—in this case—a walking robot.
Most brain-computer interfaces use EEG (electroencephalography): sensors placed on the scalp measure the brain's electrical activity, our brainwaves, in real time.
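To make that concrete, here is a minimal sketch, not taken from the paper, of how raw EEG samples are commonly turned into "brainwave" features: the signal's power in classic frequency bands such as alpha and beta. The sampling rate and the random data below are placeholders for illustration.

```python
# Illustrative only: turning one second of raw EEG into band-power features.
import numpy as np
from scipy.signal import welch

FS = 250  # assumed sampling rate in Hz; real headsets vary

def band_power(eeg_window: np.ndarray, low: float, high: float) -> float:
    """Average spectral power of one EEG channel between low and high Hz."""
    freqs, psd = welch(eeg_window, fs=FS, nperseg=FS)
    mask = (freqs >= low) & (freqs <= high)
    return float(np.mean(psd[mask]))

# Fake one second of single-channel EEG for demonstration.
window = np.random.randn(FS)
alpha = band_power(window, 8, 12)   # alpha band, associated with relaxed states
beta  = band_power(window, 13, 30)  # beta band, associated with active thinking
print(f"alpha={alpha:.3f}, beta={beta:.3f}")
```

In a live system, features like these would be computed continuously over short sliding windows of the incoming signal.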
While past studies have shown that users can control virtual cursors or stationary robotic arms with a brain-computer interface, this research is among the first to demonstrate thought-based control of a walking robot in a real-world indoor environment.
Importantly, the system does not require the user to stare at flashing lights or symbols, a common limitation of many other setups.
How the brain-controlled robot works
Current systems often need special implants or gel-based EEG caps, making them impractical for everyday use.
In this study, EEG signals were collected through lightweight wireless glasses with embedded electrodes near the eyes and on the scalp. The glasses also had built-in audio through which the robot could interact with the user. This design removed the need for messy electrode caps or any surgical procedure, and it was far more comfortable and natural to wear.
The system ran on a smartphone, using advanced algorithms to analyze and classify the brain signals in real time.
The glasses were paired with Spot, a four-legged robot dog developed by Boston Dynamics. Spot is best known from industrial and defense settings, where it carries heavy loads and maps tunnels or construction sites. But it is also agile and can safely navigate complex indoor environments, making it well suited to indoor support tasks.
The user either performed a specific mental task, imagining solving a mathematical problem, or performed no mental task at all. Mental math was selected because it produces a distinct, classifiable EEG pattern.
The mental calculation task activated distinctive patterns of brain activity, which the software learned to recognize and decode into a YES or NO command. The command was then transmitted wirelessly to the robot's operating system.
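The paper's exact decoding algorithm is not reproduced here, but a short sketch shows the general shape of such a YES/NO decoder: a classifier trained on labeled calibration data (mental math versus rest) that maps each new feature vector to a command. The placeholder data, the feature dimension, and the choice of a linear discriminant classifier below are all assumptions for illustration.

```python
# Hypothetical sketch of a YES/NO decoder; the paper's exact method may differ.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Each row is a feature vector (e.g., band powers per channel) from a
# labeled calibration session: 1 = mental math, 0 = rest.
X_train = np.random.randn(100, 8)        # placeholder calibration features
y_train = np.random.randint(0, 2, 100)   # placeholder labels

clf = LinearDiscriminantAnalysis()
clf.fit(X_train, y_train)

def decode(features: np.ndarray) -> str:
    """Map one feature vector to the YES/NO command sent on to the robot."""
    return "YES" if clf.predict(features.reshape(1, -1))[0] == 1 else "NO"

print(decode(np.random.randn(8)))
```

Simple linear classifiers are a common choice for EEG decoding because calibration data are scarce and small models tend to generalize better than complex ones.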
Two healthy individuals with prior brain-computer interface experience participated in the study. Wearing the smart glasses, they answered yes-no questions mentally: when they focused on solving math problems in their heads, the system interpreted those signals as a YES command.
Each answer triggered commands to Spot to perform a specific action, such as walking to a location, retrieving an item, or returning to base.
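As a rough illustration of that last step, the sketch below maps a decoded answer onto a robot action. The action names and the print-based stand-in for the transport are invented; the real system sent commands wirelessly to Spot's control software.

```python
# Illustrative only: dispatching a decoded YES/NO answer to a robot action.
# Action names are hypothetical; print() stands in for a real robot command.
ACTIONS = {
    "fetch_item":     "walk to the shelf and retrieve the item",
    "walk_to_user":   "walk to the user's location",
    "return_to_base": "walk back to the charging station",
}

def dispatch(question_action: str, answer: str) -> None:
    """Run the action tied to the current question if the user thought YES."""
    if answer == "YES":
        print(f"Spot: {ACTIONS[question_action]}")
    else:
        print("Spot: standing by")

dispatch("fetch_item", "YES")
```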
The robot dog Spot executed the tasks with over 80 percent accuracy in live trials.
Why do we need this technology?
Brain-controlled exoskeletons have helped people with paralysis regain the ability to walk short distances. Even lightweight robotic hands, made from 3D-printed materials, are now controlled using simple, non-invasive EEG systems.
The new technology offers hands-free, wireless, wearable brain-signal detection that uses thoughts to command a robot to fetch objects, follow the user, or monitor the surroundings.
For people with paralysis, ALS, spinal cord injuries, or amputations, everyday tasks can be impossible without assistance. The ability to guide a robot with the mind alone could profoundly enhance independence for people living with severe physical limitations.
The research team called their prototype system “Ddog”, a name combining digital and dog, to reflect the helper nature of the robot. It opens up new use cases for people who need assistance at home, in clinics, or in care settings, where bed-bound or severely disabled patients could gain autonomy by operating robots that assist with daily tasks or mobility.
About the scientific paper:
First author: Nataliya Kosmyna, USA
Published: Sensors, 2024
Link to paper: https://www.mdpi.com/1424-8220/24/1/80