
Research: sensor for ‘mind-controlled’ robots

by Pieter Werner

Controlling a robot with your mind may sound like something straight out of a science fiction movie, but recent research published in ACS Applied Nano Materials has taken a significant step toward making it a reality. The study describes a unique, 3D-patterned structure that doesn’t require sticky conductive gels and can measure the brain’s electrical activity, even amidst hair and the bumps and curves of the head.

Currently, physicians monitor electrical signals from the brain using electroencephalography (EEG), with electrodes that are either implanted in the brain or placed on the scalp. EEG is instrumental in diagnosing neurological disorders and can also be incorporated into “brain-machine interfaces,” where brain waves are used to control external devices such as prosthetic limbs, robots, or video games. However, the most commonly used non-invasive sensors are “wet” sensors, which rely on sticky gels that can irritate the scalp and sometimes trigger allergic reactions.

In contrast, researchers have been developing “dry” sensors that do not require gels, but none of them have worked as well as the gold-standard wet variety. Although nanomaterials like graphene could be a suitable option, their flat and typically flaky nature makes them incompatible with the uneven curves of the human head, particularly over long periods.

Francesca Iacopi and colleagues wanted to create a 3D sensor based on polycrystalline graphene that could accurately monitor brain activity without any stickiness. The team created several 3D graphene-coated structures with different shapes and patterns, each around 10 µm thick. Of the shapes tested, a hexagonal pattern worked best on the curvy, hairy surface of the occipital region, the spot at the base of the head where the brain’s visual cortex is located.

The team incorporated eight of these sensors into an elastic headband that held them against the back of the head. When combined with an augmented reality headset displaying visual cues, the electrodes could detect which cue was being viewed, and a computer translated those signals into commands that controlled the motion of a four-legged robot, completely hands-free.
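The article does not detail the signal-processing pipeline, but visual-cue interfaces of this kind commonly work by having each cue flicker at a distinct frequency and then identifying which frequency dominates the recorded brain signal. The sketch below is a minimal, hypothetical Python illustration of that idea; the sampling rate, flicker frequencies, channel count, and robot commands are illustrative assumptions, not details from the study.

```python
import numpy as np

# Hypothetical parameters: eight occipital channels, 256 Hz sampling,
# and four flicker frequencies, one per visual cue in the AR display.
FS = 256                                          # sampling rate in Hz (assumed)
CUE_FREQS = [7.0, 9.0, 11.0, 13.0]                # flicker frequencies in Hz (illustrative)
COMMANDS = ["forward", "left", "right", "stop"]   # robot commands mapped to the cues

def classify_cue(eeg_window: np.ndarray) -> str:
    """Pick the cue whose flicker frequency dominates the averaged spectrum.

    eeg_window: array of shape (8, n_samples), one row per dry electrode.
    """
    avg = eeg_window.mean(axis=0)                  # average the eight channels
    spectrum = np.abs(np.fft.rfft(avg))            # magnitude spectrum
    freqs = np.fft.rfftfreq(avg.size, d=1.0 / FS)  # frequency axis in Hz
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in CUE_FREQS]
    return COMMANDS[int(np.argmax(powers))]

# Usage with two seconds of synthetic data containing a 9 Hz response:
t = np.arange(2 * FS) / FS
window = 0.1 * np.random.randn(8, t.size) + np.sin(2 * np.pi * 9.0 * t)
print(classify_cue(window))                        # expected to print "left"
```

In a real system the classified cue would be sent to the robot's controller rather than printed, and more robust methods than a single spectral peak would typically be used, but the overall flow from electrode signals to discrete commands follows this pattern.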

While the new electrodes didn’t yet work quite as well as the wet sensors, the researchers believe their work represents a critical first step toward developing robust, easily implemented dry sensors that help expand the applications of brain-machine interfaces.

The research could have a significant impact on neuroscience, particularly on brain-computer interfaces that let people control external devices with their thoughts. It also carries clear implications for prosthetic limbs that respond to the user’s intentions, allowing them to move with greater ease and precision. The researchers hope their work will eventually lead to dry sensors that are even more effective than today’s wet sensors.

Photo: ACS Applied Nano Materials
