Remember when everyone thought that in the 21st century people would be living and working side by side with robots? Well, that day is rapidly approaching. Robots are already working closely with humans: vacuuming our homes, teaching our children, and even performing surgery. In an effort to streamline human-robot communication, engineers have been looking for ways to make robots appear more human by creating more realistic skin, hair, and eyes. Body language is one of the most often overlooked aspects of robot behavior, but researchers are working hard to remedy that.
Without realizing it, we continually send subtle signals to one another while we talk. The catalog of nonverbal communication is vast and varied; it includes vocal tonality, gestures, facial expressions, posture, eye contact, and more. Robotics experts are tasked not only with replicating this voluminous repertoire of nonverbal cues, but also with programming robots to read human signals and respond with the correct cues at the appropriate times. Recently, researchers have taken the first tentative steps down the difficult road to seamless human-robot nonverbal communication.
Bilge Mutlu, associate professor of computer sciences, psychology, and industrial and systems engineering at the University of Wisconsin-Madison, is a pioneer in human-robot communication. Mutlu created a kind of "robot 20 questions" experiment to determine whether nonverbal cues could help humans understand a robot's intentions. In the experiment, two groups of participants were tasked with discovering which of the objects on a desk a lifelike robot had selected by asking it a series of questions. When answering, the robot sent surreptitious eye flashes indicating the object of interest to one group, but not the other. The group that received these subtle nonverbal cues discovered the object of interest after significantly fewer questions than the control group. Furthermore, the eye flashes didn't even consciously register for three-quarters of the experimental group.
More recently, Mutlu has been working with NASA and General Motors to develop a robot to work alongside astronauts. "Robonaut2" is outfitted with a variety of skin sensors to detect tactile cues and cameras to form a three-dimensional picture of its surroundings. Although it's still in the prototype stage, Robonaut2 is able to track human head movements and communicate with astronauts nonverbally by tilting its head to indicate where its attention is focused (something we do without thinking). The goal of the Robonaut2 project is to produce a humanoid robot that can work in harmony with its fellow human team members.
Researchers have already shown that rudimentary robot nonverbal communication enhances robot-human work efficiency, but what about human-human work efficiency? You may have already been "programmed" with plenty of nonverbal communication skills, but are you always using them to your advantage? Are you sometimes sending mixed signals? Are you accurately reading the signs from your co-workers? Considering how difficult it is to construct a full body language toolset, and how much researchers gain from even rudimentary cues, reveals just how important body language is to your professional life. Make sure you're using all of the nonverbal communication skills in your toolbox, or soon you could wind up replaced by a robot!