Brain waves control robot dog’s moves

While biological dogs respond to voice commands such as “sit”, “fetch” and “stay”, robotic dogs might soon take instructions directly from a person’s brain.

CAPTION: Sergeant Chandan Rana operates a Ghost Robotics quadruped robot using a novel brain-computer interface during a demonstration at Majura Training Area. Story and photo by Sergeant Matthew Bickerton.

This technology was demonstrated when Sergeant Damian Robinson, from 5th Combat Service Support Battalion, and Sergeant Chandan Rana, from 1st/15th Royal New South Wales Lancers, commanded a robot to go to several locations using their powers of concentration at Majura Training Area, Canberra, on May 11.

Several white squares corresponding to waypoints flickered on Sergeant Robinson’s augmented reality lens at varying frequencies.

A graphene biosensor at the back of Sergeant Robinson’s head was ready to detect brainwaves from his visual cortex.

When Sergeant Robinson concentrated on a particular flicker, the biosensor detected corresponding brainwaves and signalled an amplification circuit.

An artificial intelligence decoder translated the signal into commands, which the robot dog then followed.
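The system described above is a steady-state visual-flicker interface: each waypoint flickers at its own frequency, the visual cortex produces a response at the frequency the operator attends to, and a decoder maps that frequency to a command. The sketch below is purely illustrative, not the UTS system; the sampling rate, flicker frequencies and waypoint names are all assumptions.

```python
# Illustrative sketch of flicker-frequency decoding (not the UTS decoder).
# Assumptions: 256 Hz EEG sampling, three hypothetical flicker frequencies.
import numpy as np

SAMPLE_RATE = 256  # Hz, assumed EEG sampling rate
# Hypothetical mapping: each waypoint's flicker has a distinct frequency.
FLICKER_COMMANDS = {8.0: "waypoint_1", 10.0: "waypoint_2", 12.0: "waypoint_3"}

def decode_command(eeg_signal: np.ndarray) -> str:
    """Return the command whose flicker frequency shows the
    strongest response in the signal's spectrum."""
    spectrum = np.abs(np.fft.rfft(eeg_signal))
    freqs = np.fft.rfftfreq(len(eeg_signal), d=1.0 / SAMPLE_RATE)
    # Score each candidate frequency by spectral power at the nearest FFT bin.
    best = max(FLICKER_COMMANDS,
               key=lambda f: spectrum[np.argmin(np.abs(freqs - f))])
    return FLICKER_COMMANDS[best]

# Simulate one second of EEG dominated by a 10 Hz flicker response.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
signal = np.sin(2 * np.pi * 10.0 * t) + 0.3 * np.random.randn(SAMPLE_RATE)
print(decode_command(signal))  # most likely "waypoint_2"
```

In a real system the amplification circuit and AI decoder handle far noisier signals and, per the article, discriminate up to nine commands; this toy version only shows the frequency-to-command mapping idea.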

Sergeants Robinson and Rana operated the technology using a commercial HoloLens headset running software developed by University of Technology Sydney (UTS) researchers to command a Ghost Robotics quadruped robot.

“The whole process is not difficult to master. It’s very intuitive. It only took a couple of sessions,” Sergeant Robinson said.

CAPTION: Australian Army soldier Sergeant Chandan Rana (right) from the 1st/15th Royal New South Wales Lancers during a novel brain-computer interface demonstration at Majura Training Area, Canberra. Photo by Sergeant Matthew Bickerton.

The robot is typically controlled with a hand-held console, but in this case, the operator’s brainwaves initiate the commands. This allows the operator to maintain weapon readiness or use their hands for other tasks.

Sergeant Robinson joined the program in April and completed eight two-hour sessions with the system.

During the demonstration, Sergeant Robinson could command the robot to visit any of six pre-determined locations by concentrating on the corresponding flicker.

“You don’t have to think anything specific to operate the robot, but you do need to focus on that flicker,” he said.

“It’s more of a visual concentration thing.”

The purpose of the demonstration was to get soldiers thinking about how Army might integrate this technology into the tactical environment.

Researchers at UTS and Army’s Robotic and Autonomous Implementation and Coordination Office (RICO) have worked together since December 2020 to explore brain-computer interfaces and their tactical applications.

This exploration was a four-way collaboration between the Defence Innovation Hub, RICO, UTS, and Defence Science and Technology Group.

Distinguished Professor Chin-Teng Lin and Professor Francesca Iacopi, from UTS, have made several breakthroughs in brain-computer interfaces.

Professor Lin figured out how to minimise noise from the body and environment to get a clearer signal from an operator’s brain.

Another advancement was increasing the number of commands the decoder can deliver in a fixed period.

“We have nine different kinds of commands and the operator can select one from those nine within that time period,” Professor Lin said.

Professor Iacopi developed a replacement for older biosensors used to detect brainwaves, overcoming issues of corrosion, durability and skin contact resistance through cutting-edge graphene material.

“We’ve been able to combine the best of graphene, which is very biocompatible and very conductive, with the best of silicon technology, which makes our biosensor very resilient and robust to use,” Professor Iacopi said.

Defence has provided $1.2 million in research funding to UTS through the Defence Innovation Hub.


 