Robot avatar lets people see and feel things remotely through VR

iCub 3 is a humanoid avatar that people can embody remotely (Image: Istituto Italiano di Tecnologia)

A humanoid robot can relay video and touch sensations to a person wearing haptic feedback gloves and a virtual reality (VR) headset hundreds of kilometres away, offering a way for people to attend events without travelling.

The iCub 3 is a 52-kilogram, 125-centimetre-tall humanoid with 54 points of articulation across its aluminium alloy and plastic body. Its head contains two cameras where a human’s eyes would be and an internet-connected computer where the brain would go. Along with the cameras, sensors covering its body send data to the robot’s “brain”. These sensations are then replicated on a suit and VR headset worn by a remote human operator.
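
The team's software is not published in this article, but the pipeline described above, in which camera frames and skin-sensor readings are streamed to a remote headset and haptic suit, can be sketched roughly as follows. The message fields, function names and the queue standing in for the network link are illustrative assumptions, not the actual iCub 3 interface.

```python
# Rough sketch of a teleoperation relay. A simple in-process queue stands in
# for the real network link; all names and fields are illustrative.
import queue
import time
from dataclasses import dataclass, field


@dataclass
class SensorFrame:
    """One snapshot of what the robot senses, sent to the operator."""
    timestamp: float
    camera_images: list                                   # e.g. stereo frames for the VR headset
    skin_pressures: dict = field(default_factory=dict)    # sensor id -> pressure reading


def robot_side(link: queue.Queue, n_frames: int = 3) -> None:
    """Package camera and skin-sensor data and push it over the link."""
    for i in range(n_frames):
        frame = SensorFrame(
            timestamp=time.time(),
            camera_images=[f"left_frame_{i}", f"right_frame_{i}"],  # placeholders
            skin_pressures={"right_hand": 0.4 + 0.1 * i},
        )
        link.put(frame)


def operator_side(link: queue.Queue) -> None:
    """Replay the robot's sensations on the headset and haptic suit."""
    while not link.empty():
        frame = link.get()
        # In the real system these would drive VR rendering and the
        # actuators in the gloves and suit rather than printing.
        print(f"render to headset: {frame.camera_images}")
        print(f"drive haptics:     {frame.skin_pressures}")


if __name__ == "__main__":
    network = queue.Queue()   # stand-in for the internet connection
    robot_side(network)
    operator_side(network)
```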

A person wearing a VR headset and haptic feedback gloves can see and feel what the robot touches (Image: Istituto Italiano di Tecnologia)

When the operator reacts to what they see and feel, the suit’s sensors pick up the movements and the robot matches them. “The key is to translate every signal and bit of numeric data that can be sent through the network,” says Stefano Dafarra at the Italian Institute of Technology, who was part of the iCub 3 team. There can be a small delay of up to 100 milliseconds to capture and transmit the visual footage, but the operator can mitigate this by moving slightly slower than normal.
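
To see why moving slower helps, note that the positional error between operator and robot grows with both hand speed and transmission delay. The sketch below works through that arithmetic for the roughly 100-millisecond delay mentioned above; the hand speeds and the `lag_distance` helper are assumptions for illustration, not measurements from the iCub 3 system.

```python
# Illustrative calculation: how far the robot lags behind the operator's hand
# for a given motion speed and transmission delay. Speeds are assumed values.

LATENCY_S = 0.100   # ~100 ms to capture and transmit the footage (from the article)


def lag_distance(hand_speed_m_per_s: float, latency_s: float = LATENCY_S) -> float:
    """Positional error that builds up while a command is in transit."""
    return hand_speed_m_per_s * latency_s


if __name__ == "__main__":
    for speed in (1.0, 0.5):   # normal vs. deliberately slowed movement
        print(f"hand speed {speed:.1f} m/s -> robot lags ~{lag_distance(speed) * 100:.0f} cm")
```

Halving the hand speed halves the distance the robot trails behind, which is why a slightly slower operator barely notices the delay.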

The team has demonstrated the robot at the Venice Biennale, where it wandered through an exhibition while its operator stood 290 kilometres away in Genoa.

Dafarra hopes people will use the iCub 3 to attend events remotely, reducing the need to travel. But at present, a fall could be hugely damaging to the robot, and it’s uncertain whether it could stand up again on its own, he says.

“iCub 3 is an interesting robot and offers clear advantages from the previous iteration,” says Jonathan Aitken at the University of Sheffield, UK, whose laboratory owns a prior version of the robot. However, he is disappointed that the team wasn’t clear in its research about the data transmission requirements of the new version of the robot. “It would be good to know just how much data was required, and what the upper and lower bounds were,” he says.
