“Talking Ally”, a robot made by researchers at Toyohashi University of Technology’s Interactions and Communication Design lab, can follow a human’s gaze and respond accordingly. For example, it can make a fuss if the person stops talking to it.
The robot’s small body looks a bit like a stylized version of the Pixar lamp, a pedestal with a bendy column and an oblong face. In the middle of the face sits a single unblinking eye, so that Talking Ally can maintain eye contact with its person. A second, tucked-away camera tracks the human’s face. Servomotors and springs give the robot a surprising range of movement.
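The researchers haven’t published Talking Ally’s control code, but the camera-plus-servo loop the hardware implies is easy to picture. Below is a minimal, hypothetical sketch using OpenCV’s stock face detector to estimate where the person’s face sits in the tracking camera’s frame and nudge a pan servo so the single eye keeps facing them. The `set_pan_angle` helper and the proportional gain are stand-ins, not part of the real robot.

```python
import cv2

# Hypothetical stand-in for whatever servo interface the robot actually uses.
def set_pan_angle(degrees: float) -> None:
    print(f"pan servo -> {degrees:+.1f} degrees")

# OpenCV ships a pretrained frontal-face Haar cascade we can reuse here.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
camera = cv2.VideoCapture(0)   # the "tucked-away" tracking camera
pan = 0.0                      # current pan angle, in degrees

while True:
    ok, frame = camera.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) > 0:
        # Track the largest detected face and steer the eye toward its center.
        x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
        error = (x + w / 2) - frame.shape[1] / 2   # pixels off-center
        pan += 0.05 * error                        # crude proportional step
        set_pan_angle(pan)
```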
Depending on where the human looks, Talking Ally can engage them in conversation about their focus. If the human is watching sports, Talking Ally can respond with sports news pulled from an RSS feed. If the human isn’t paying attention, the robot tries to get them to notice it. And if the human turns to look at and talk to Talking Ally, the robot nods its head vigorously to show it is engaged in the conversation.
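The details of the behavior engine aren’t public, so the following is only a rough sketch of the kind of gaze-to-response mapping described above. The feed URL, the gaze labels, and the nod/fuss helpers are all hypothetical placeholders rather than Talking Ally’s actual API.

```python
import feedparser  # third-party: pip install feedparser

SPORTS_FEED = "https://example.com/sports.rss"  # placeholder RSS source

def nod() -> None:
    print("(robot nods vigorously)")

def make_a_fuss() -> None:
    print("(robot waves its body to regain attention)")

def say(text: str) -> None:
    print(f"Talking Ally: {text}")

def respond_to_gaze(gaze_target: str) -> None:
    """Pick an utterance or gesture based on where the human is looking."""
    if gaze_target == "tv_sports":
        # The human is watching sports: comment on the latest headline.
        feed = feedparser.parse(SPORTS_FEED)
        if feed.entries:
            say(feed.entries[0].title)
    elif gaze_target == "robot":
        # The human is looking at and talking to the robot: show engagement.
        nod()
    else:
        # The human isn't attending to the robot or a shared focus.
        make_a_fuss()

# Example: the gaze tracker reports the human has looked away.
respond_to_gaze("window")
```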
Reading and demonstrating body language will help robots better understand and communicate with humans in the future. This is probably a good thing, even if it means robots will become more annoying.