The FACE robot is capable of mimicking facial expressions. The basic expressions used for the generation of more complex facial states are inspired by the six basic expressions identified by Ekman (Anger, Disgust, Fear, Happiness, Sadness, Surprise).
Along this direction, we are trying to improve FACE's credibility, focusing on a field largely explored by the Japanese robotics community under the umbrella of the "uncanny valley" paradigm.
This video shows the robot during preliminary tests of the control library, in which the basic facial expressions were tested and the face-tracking algorithm was calibrated.
The human face has a complex physical structure: more than 100 muscles of very different shapes and functions are situated between the skin surface and the skull. They allow us to control even minimal muscular movements and to generate a myriad of different facial expressions.
The positions of the servo motors reflect those of the facial muscles in order to simulate expressions in a realistic way. The servo motors actuate the skin along vectors corresponding to the natural muscle actions of the face, enabling intuitive animation control according to Ekman's FACS (Facial Action Coding System). Paul Ekman developed the FACS as a comprehensive language for coding facial expressions in terms of atomic muscle movements, named facial action units (AUs). A single AU can include more than one muscle when the changes in appearance produced by the individual muscles cannot be distinguished. During the execution of an expression, the active AUs can be seen as estimates of the underlying muscle activations that caused the performed expression.
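The idea of coding an expression as a set of AUs, each driving one or more actuators, can be sketched as follows. The AU combinations are prototypes commonly cited in the FACS literature; the AU-to-servo channel mapping and the `servos_for` helper are purely illustrative, not FACE's actual wiring.

```python
# Sketch: Ekman's basic expressions as FACS action-unit (AU) sets,
# and AUs as groups of servo channels. One AU may drive several
# servos, mirroring how one AU can involve several muscles.

# Prototypical AU sets for the six basic expressions (FACS literature).
BASIC_EXPRESSIONS = {
    "happiness": {6, 12},            # cheek raiser, lip corner puller
    "sadness":   {1, 4, 15},         # inner brow raiser, brow lowerer, lip corner depressor
    "surprise":  {1, 2, 5, 26},      # brow raisers, upper lid raiser, jaw drop
    "fear":      {1, 2, 4, 5, 20, 26},
    "anger":     {4, 5, 7, 23},
    "disgust":   {9, 15, 16},        # nose wrinkler, lip corner depressor, lower lip depressor
}

# Hypothetical mapping from AUs to servo channels.
AU_TO_SERVOS = {
    1: [0, 1], 2: [2, 3], 4: [4], 5: [5, 6], 6: [7, 8],
    7: [9], 9: [10], 12: [11, 12], 15: [13, 14], 16: [15],
    20: [16, 17], 23: [18], 26: [19],
}

def servos_for(expression: str) -> set[int]:
    """Return the set of servo channels active for a basic expression."""
    return {ch for au in BASIC_EXPRESSIONS[expression] for ch in AU_TO_SERVOS[au]}

print(sorted(servos_for("happiness")))  # channels for AU6 + AU12: [7, 8, 11, 12]
```

Keeping the expression layer in AU terms and the hardware layer in servo terms means the same expression library can be retargeted to a different actuator layout by changing only the mapping table.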
However, there is an important difference in the way the human muscles and the robot servo motors actuate the face. In contrast to human muscles, the robot servos can produce only linear contractions, whereas human orbicular muscles, like the Orbicularis oculi and the Orbicularis oris, produce circular contractions. The movement of these muscles is reproduced as realistically as possible by using more than one servo motor.
FACE (Facial Automaton for Conveying Emotions) is a humanoid android used in a structured therapeutic environment, currently for therapies with autistic children.
FACE consists of a passive articulated body equipped with a believable facial display system. The head of the robot is an artificial skull covered by a special skin made of FrubberTM, a proprietary material developed by Hanson Robotics. FrubberTM is a patented silicone elastomer containing up to 70% air by volume, whose mechanical properties allow complex facial movements. Controlling the size and distribution of the open and closed air cells in the FrubberTM skin is what allows it to move much like human skin. Because it can be moved by small servos with little force, it is well suited to humanoid robot faces.
The actuating system of the robot is controlled by an SSC-32 serial servo controller driving 32 servo motors, all integrated in the android's skull and neck.