Once possible only in science fiction, brain-to-machine interfaces are now a reality. Last year, Synchron implanted its first brain-computer interface in a US patient. Likewise, Elon Musk, the head of Twitter, founded Neuralink, a company developing a brain chip that aims to aid people with paralysis. Now, researchers at the University of Technology Sydney (UTS) have created a non-invasive brain-to-machine interface that allows the user to control robots with their thoughts.
“The hands-free, voice-free technology works outside laboratory settings, anytime, anywhere. It makes interfaces such as consoles, keyboards, touchscreens and hand-gesture recognition redundant,” UTS professor and co-author of the recent paper Francesca Iacopi told Hackster. “By using cutting edge graphene material, combined with silicon, we were able to overcome issues of corrosion, durability and skin contact resistance to develop the wearable dry sensors.”
Users view the available commands on a Microsoft HoloLens 2 mixed-reality display, then simply concentrate on the command they wish to use. Members of the Australian army are already testing the device to control a Ghost Robotics quadrupedal land drone, with an accuracy rate of 94 per cent.
Putting a hex on users
This sensor is not the first brain-machine interface, but where rival systems have been invasive, requiring surgery to implant, the UTS prototype is placed against the user’s scalp on a hexagonal grid. Despite being non-invasive, the device can withstand harsh weather conditions.
“Our technology can issue at least nine commands in two seconds. This means we have nine different kinds of commands and the operator can select one from those nine within that time period,” adds UTS professor and co-author Chin-Teng Lin. “We have also explored how to minimise noise from the body and environment to get a clearer signal from an operator’s brain.”
Following the military tests, Sergeant Damien Robinson said, “The potential of the project is actually very broad. At its core, it’s translating brainwaves into zeroes and ones and that could be implemented into a number of different systems. It just happens in this particular instance we are translating it into control for a robot.”
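Robinson’s description of “translating brainwaves into zeroes and ones” can be sketched, purely illustratively, as a classifier output mapped to a binary instruction. The command names, nine-entry set, and encoding below are hypothetical, not details of the UTS system:

```python
# Hypothetical sketch: once the interface classifies which of the nine
# on-screen commands the operator is concentrating on, that selection
# can be encoded as bits for any downstream system, robot or otherwise.
# The command set and 4-bit encoding are illustrative assumptions.

COMMANDS = [
    "stop", "forward", "back", "left", "right",
    "turn_left", "turn_right", "sit", "stand",
]  # nine options, matching the "nine commands" figure quoted above


def encode_command(index: int) -> str:
    """Encode a selected command index (0-8) as a 4-bit binary string."""
    if not 0 <= index < len(COMMANDS):
        raise ValueError("command index out of range")
    return format(index, "04b")


# e.g. the operator focuses on "forward" (index 1)
print(COMMANDS[1], encode_command(1))  # forward 0001
```

Because the payload is just bits, the same selection step could drive a quadruped drone, a cursor, or any other controllable system, which is the breadth Robinson points to.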