Have you heard of biosensor technology?
Scientists at the University of Technology Sydney (UTS) have developed biosensor technology that allows devices such as robots and machines to be controlled by thought alone. The advanced brain-computer interface was developed by Distinguished Professor Chin-Teng Lin and Professor Francesca Iacopi from the UTS Faculty of Engineering and IT, in collaboration with the Australian Army and the Defence Innovation Hub.
In addition to defense applications, this technology has significant potential in areas such as advanced manufacturing, aerospace, and healthcare, for example, enabling people with disabilities to control a wheelchair or operate prosthetics. "The hands-free, voice-free technology works outside a laboratory setting, anytime, anywhere. It makes interfaces such as consoles, keyboards, touchscreens and hand-gesture recognition redundant," said Professor Iacopi.
"By using a high-end graphene material combined with silicon, we were able to overcome issues of corrosion, durability and skin-contact resistance, and develop wearable dry sensors," she said.
A new study describing the technology has just been published in the peer-reviewed journal ACS Applied Nano Materials. It shows that the graphene sensors developed at UTS are highly conductive, easy to use and robust.
Hexagonal patterned sensors are placed over the back of the scalp to detect brain waves from the visual cortex. The sensors are resistant to harsh conditions, so they can be used in extreme operating environments.
Four-legged Ghost Robotics robot using a brain-machine interface
The user wears a head-mounted augmented reality lens that displays white flashing squares. When the operator focuses on a specific square, the biosensor captures the resulting brainwaves and a decoder converts the signal into commands. The technology was recently demonstrated by the Australian Army, with soldiers operating a four-legged Ghost Robotics robot through the brain-machine interface.
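The article does not disclose how the decoder works, but flashing-square interfaces of this kind typically rely on each square flickering at a distinct frequency that shows up in the visual cortex signal. The sketch below is a minimal, illustrative frequency-detection decoder; the command names, flicker frequencies, and sampling rate are all assumptions, not details from the UTS system.

```python
import numpy as np

# Hypothetical flicker frequencies (Hz) mapped to commands; the actual
# UTS/Army configuration is not described in the article.
COMMANDS = {7.0: "forward", 9.0: "back", 11.0: "left", 13.0: "right"}

def decode_command(eeg, fs=256):
    """Pick the command whose flicker frequency dominates the EEG spectrum."""
    spectrum = np.abs(np.fft.rfft(eeg - eeg.mean()))
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    # Score each candidate by spectral power in a narrow band around it
    scores = {f: spectrum[(freqs > f - 0.5) & (freqs < f + 0.5)].sum()
              for f in COMMANDS}
    return COMMANDS[max(scores, key=scores.get)]

# Synthetic 2-second recording dominated by an 11 Hz flicker response
fs = 256
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 11.0 * t) + 0.3 * rng.standard_normal(t.size)
print(decode_command(eeg, fs))  # -> "left" for this synthetic signal
```

Looking at a square flickering at, say, 11 Hz drives an 11 Hz component in the recorded signal, so picking the strongest candidate frequency selects the square the operator is watching.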
The device made it possible to control the robotic dog hands-free with an accuracy of up to 94%. "Our technology can issue at least nine commands in two seconds. That means we have nine different kinds of commands, and the operator can choose one of those nine during that time period," Professor Lin said. "We also explored how to minimize body and environmental noise to get a clearer signal from the operator's brain," he said.
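Professor Lin's noise-minimization remark can be illustrated with a simple band-limiting step: slow drift from body movement sits below the useful EEG band, while mains hum and muscle artefacts sit above it. This is only an assumed, generic cleanup, not the actual UTS pipeline, and the sampling rate and band edges are illustrative.

```python
import numpy as np

def suppress_noise(eeg, fs=256, band=(5.0, 40.0)):
    """Band-limit a recording: removes slow drift below the band
    (e.g. movement, sweat) and mains hum above it (50/60 Hz)."""
    spectrum = np.fft.rfft(eeg)
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    keep = (freqs >= band[0]) & (freqs <= band[1])
    return np.fft.irfft(spectrum * keep, n=len(eeg))

# A 2-second signal: 11 Hz response plus 50 Hz mains hum and a DC offset
fs = 256
t = np.arange(0, 2, 1 / fs)
noisy = np.sin(2 * np.pi * 11.0 * t) + np.sin(2 * np.pi * 50.0 * t) + 0.5
clean = suppress_noise(noisy, fs)
```

After filtering, the 11 Hz component survives while the 50 Hz hum and DC offset are removed, which is the kind of cleanup that makes frequency-based decoding reliable.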
The researchers believe the technology will be of interest to the scientific community, industry and government, and hope to continue to advance brain-computer interface systems.