New research helps robots combine language and gestures to find objects in cluttered spaces, improving how they understand human intent.
Popular depictions of future robots are easy to find. What comes to my mind is Rosie the robot maid from the animated sitcom “The Jetsons.” Ever faithful and dependable, she is as much a ...
By incorporating insights from canine companions, researchers enable robots to use both language and gesture as inputs to help fetch the right objects.
Researchers built a POMDP-based AI framework, inspired by how dogs respond to human cues, that allows robots to combine human gestures and language to find objects with 89% accuracy.
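The fusion described above can be pictured as Bayesian belief updating over candidate objects, where each observation (a spoken phrase, a pointing gesture) reweights the robot's belief. The sketch below is purely illustrative, assuming hypothetical object names and made-up likelihood values; it is not the researchers' implementation.

```python
# Illustrative sketch (not the paper's system): a POMDP-style belief
# update that fuses a language cue and a pointing gesture to rank
# candidate objects in a scene.

def normalize(weights):
    """Scale a dict of nonnegative weights so they sum to 1."""
    total = sum(weights.values())
    return {k: v / total for k, v in weights.items()}

def update_belief(belief, likelihood):
    """One Bayesian filtering step: posterior ∝ prior × likelihood."""
    return normalize({obj: belief[obj] * likelihood.get(obj, 1e-6)
                      for obj in belief})

# Uniform prior over three candidate objects (hypothetical scene).
belief = normalize({"red_cup": 1.0, "blue_cup": 1.0, "red_bowl": 1.0})

# Language observation: "the red one" favors the red objects.
language_likelihood = {"red_cup": 0.45, "blue_cup": 0.10, "red_bowl": 0.45}
belief = update_belief(belief, language_likelihood)

# Gesture observation: the pointing ray passes closest to the red cup.
gesture_likelihood = {"red_cup": 0.60, "blue_cup": 0.30, "red_bowl": 0.10}
belief = update_belief(belief, gesture_likelihood)

best = max(belief, key=belief.get)
print(best)  # red_cup
```

Neither cue alone disambiguates here (the language matches two red objects, the gesture points toward two cups), but their product singles out one object, which is the intuition behind combining modalities.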
Basic Human–Robot Interaction by Dr David O Johnson offers practical guidance and insights for conducting experiments in Human–Robot Interaction (HRI) and publishing the results in scientific journals ...
Social by nature, humans interact in multiple ways—through voice, vision and touch. Reflecting these humanistic qualities, robotic capabilities are improving, and as such, human-robot interaction will ...
Asa Whitney Professor of Mechanical Engineering Mark Yim has completed the second phase of a human-robot interaction research project at Penn’s ModLab. The project primarily aims to develop Quori, a ...
And a new study, published in the journal Emotion, suggests that a robot that mimics human breathing can also pass on frightened feelings. The researchers developed round fluffy robots with motorized ...
Researchers at Kyoto University have developed Buddharoid, an AI-powered humanoid robot designed to act as a Buddhist priest amid Japan’s priest shortage.
For decades, humanoid robots have lived behind safety cages in factories or deep inside research labs. Fauna Robotics, a New York-based robotics startup, says that era is ending. The company has ...