Brain-controlled assistive robots hold the promise of restoring autonomy to paralyzed patients.
Existing approaches rely on low-level, continuous control of robotic devices, which imposes a high cognitive load on their users. In the NeuroBots project, in contrast, we endow prosthetic
devices with a degree of autonomy and adaptivity so that they can be controlled on a higher cognitive level.
To achieve this, we develop new methods and technologies in core areas of brain-machine interfaces, as
well as artificial intelligence. This includes innovative approaches to brain-signal decoding with deep
neural networks, efficient motion planning and improved perception for mobile robots and manipulators,
novel methods for deep reinforcement learning, hierarchical planning with user feedback, and evaluation
of formal methods for safety guarantees. The different components are continually integrated in an architecture
based on the Robot Operating System (ROS), realizing a demonstrator of the BrainLinks-BrainTools LiNC concept.
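To illustrate the contrast with continuous low-level control, the following toy Python sketch shows how decoded class probabilities over a small set of high-level goals could be turned into discrete commands for an autonomous planner. The goal names and the confidence threshold are illustrative assumptions, not part of the actual system; the project's ROS architecture and message types are likewise not shown here.

```python
# Toy sketch of high-level, goal-based control: instead of streaming continuous
# motion commands, the decoder selects one discrete goal, and trajectory
# generation is left to the autonomous planner. Goal names and the threshold
# are hypothetical, chosen only for illustration.

GOALS = ["fetch_cup", "open_door", "drive_to_kitchen", "rest"]
CONFIDENCE_THRESHOLD = 0.8  # assumed value; a real system would calibrate this

def select_goal(probabilities):
    """Return the decoded high-level goal, or None if confidence is too low."""
    if len(probabilities) != len(GOALS):
        raise ValueError("expected one probability per goal")
    best = max(range(len(GOALS)), key=lambda i: probabilities[i])
    if probabilities[best] < CONFIDENCE_THRESHOLD:
        return None  # ambiguous decoding: better to do nothing than act wrongly
    return GOALS[best]

print(select_goal([0.9, 0.05, 0.03, 0.02]))  # fetch_cup
print(select_goal([0.4, 0.3, 0.2, 0.1]))     # None
```

Rejecting low-confidence decodings is one simple way such a system can trade responsiveness for safety: an uncertain brain-signal classification triggers no robot action at all.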
Research Status
The most important result of the project is a fully integrated system that realizes the BrainLinks-BrainTools
LiNC concept. It combines state-of-the-art online decoding of neuronal control signals with deep neural networks,
high-level hierarchical planning with a graphical user interface based on the planner's world knowledge, and
novel perception and low-level motion-planning algorithms for mobile robots. Current research focuses on improved
high-level brain-signal decoding and closed-loop human-robot interaction.