
The Acquisition Platform

HIPOP is an innovative acquisition platform developed to collect data from subjects involved in emotional and social interaction tasks.

HIPOP has been designed to be modular and configurable. It is based on Robotics4.NET, a framework for robotic programming whose purpose is to provide a robust communication infrastructure between software modules, called roblets, and the body map, which is the main module of the platform. Each roblet performs a particular function and is automatically disconnected and re-connected when it crashes, without compromising the whole system. Conversely, if the body map crashes, all the roblets continue their tasks, collecting, storing and visualizing data; when the body map becomes active again, the exchange of messages between the body map and the roblets resumes. Each module runs as an autonomous task but can also be controlled by the body map.
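
As a rough illustration of this pattern, the sketch below shows a roblet skeleton that keeps acquiring and storing data on its own and retries the body-map connection in the background. IBodyMapChannel and RobletSketch are hypothetical names introduced for illustration; they are not the actual Robotics4.NET API.

    using System;
    using System.Threading;

    // Hypothetical channel to the body map; not the actual Robotics4.NET API.
    public interface IBodyMapChannel
    {
        bool Connect();
        void Send(string message);
    }

    // Skeleton of a roblet: it keeps acquiring data on its own and only
    // forwards messages while the body map connection is alive, retrying
    // the connection in the background when it is lost.
    public abstract class RobletSketch
    {
        private readonly IBodyMapChannel channel;
        private volatile bool connected;

        protected RobletSketch(IBodyMapChannel channel) => this.channel = channel;

        public void Run(CancellationToken token)
        {
            new Thread(() => ReconnectLoop(token)) { IsBackground = true }.Start();

            while (!token.IsCancellationRequested)
            {
                string sample = Acquire();          // roblet-specific acquisition
                StoreLocally(sample);               // always store, even when offline

                if (connected)
                {
                    try { channel.Send(sample); }
                    catch { connected = false; }    // body map down: keep working locally
                }
            }
        }

        private void ReconnectLoop(CancellationToken token)
        {
            while (!token.IsCancellationRequested)
            {
                if (!connected)
                    connected = channel.Connect();
                Thread.Sleep(1000);
            }
        }

        protected abstract string Acquire();
        protected abstract void StoreLocally(string sample);
    }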

At the moment HIPOP is used as the acquisition system during robot-based autism therapy (see the therapy description article), and four types of roblets have been developed and integrated in the platform:

  • Physio roblet. It is a roblet developed for collecting data using a wearable physiological signal acquisition system. It consists of a graphical user interface which allows the user to configure the Bluetooth connection with the electronic device of the sensorized t-shirt and to display the subject’s vital signals and charts. The subject is constantly monitored through the sensorized t-shirt, an unobtrusive garment based on e-textiles with sensors fully integrated into the fabric, which gathers, computes and transmits the electrocardiogram (ECG) and the respiratory signal (Resp); these are also used to derive additional signals such as heart rate (HR), heart rate variability (HRV), breathing rate (BR) and breathing amplitude (BA), all of which are known bodily correlates of emotional states (a derivation sketch follows this list). The wearable electronic unit is also equipped with a 3D accelerometer.
  • Video Roblet. Psychologists need to analyse and interpret the children’s reactions to the changes of the robot, also off-line. Each Video Roblet runs a video recorder program which captures video and audio streams from a variety of input devices. The program allows the user to fully configure the video and audio devices, preview the video stream of the cameras during the recording, and take snapshots from the capture device while recording. The GUI has been developed in C# and is based on Microsoft DirectShow, a multimedia architecture for the Microsoft Windows platform, and DirectShow.NET, a library that gives .NET applications access to Microsoft DirectShow functionality (a capture-graph sketch follows this list).
  • EDA Roblet. Electrodermal activity (EDA) describes changes in the skin’s ability to conduct electricity; it is linked to changes of hydration in the sweat glands, which are controlled by the sympathetic nervous system, and the EDA signal gives an indication of the psychological or physiological arousal of the patient. The EDA roblet communicates with an XBee ADC module integrated into a dedicated wearable unit which acquires EDA, body temperature and battery level with a variable sample rate, dependent on the XBee signal strength and packet loss (a conversion sketch follows this list).
  • FACE Roblet. The software program run by the FACE roblet allows the user to control the robot FACE. It is based on a C# library developed for controlling the servo motors that produce the facial expressions of the android. Through the GUI the user can decide which expression has to be performed by the robot and control single movements such as the blink of the eyes, the turn and tilt of the neck, or the different parts of the face (an expression-control sketch follows this list).
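
For the Physio roblet, the following minimal C# sketch shows the kind of derivation involved: heart rate and a common HRV index (RMSSD) computed from a series of R-R intervals. The interval values and the choice of RMSSD are assumptions for illustration, not necessarily the metrics HIPOP computes.

    using System;
    using System.Linq;

    static class VitalSignsSketch
    {
        // Mean heart rate in beats per minute from R-R intervals in seconds.
        public static double HeartRate(double[] rrSeconds) =>
            60.0 / rrSeconds.Average();

        // RMSSD, a common time-domain HRV index, in milliseconds.
        public static double Rmssd(double[] rrSeconds)
        {
            var diffsMs = rrSeconds.Zip(rrSeconds.Skip(1), (a, b) => (b - a) * 1000.0);
            return Math.Sqrt(diffsMs.Select(d => d * d).Average());
        }

        static void Main()
        {
            // The R-R intervals below are made-up example values, not recorded data.
            double[] rr = { 0.82, 0.80, 0.85, 0.83, 0.81 };
            Console.WriteLine($"HR    = {HeartRate(rr):F1} bpm");
            Console.WriteLine($"RMSSD = {Rmssd(rr):F1} ms");
        }
    }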
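
For the Video Roblet, the sketch below shows how a preview-only DirectShow graph for the first available camera could be built with the DirectShow.NET (DirectShowLib) library. The device selection and the preview-only setup are simplifying assumptions; this is not the actual recorder program.

    using System;
    using DirectShowLib;

    class CapturePreviewSketch
    {
        [STAThread]
        static void Main()
        {
            DsDevice[] cameras = DsDevice.GetDevicesOfCat(FilterCategory.VideoInputDevice);
            if (cameras.Length == 0)
            {
                Console.WriteLine("No video input device found.");
                return;
            }

            // Filter graph plus the capture helper that wires pins for us.
            var graph = (IFilterGraph2)new FilterGraph();
            var builder = (ICaptureGraphBuilder2)new CaptureGraphBuilder2();
            builder.SetFiltergraph((IGraphBuilder)graph);

            // Add the first camera as a source filter.
            IBaseFilter camera;
            graph.AddSourceFilterForMoniker(cameras[0].Mon, null, cameras[0].Name, out camera);

            // Render the preview pin to the default video renderer.
            builder.RenderStream(PinCategory.Preview, MediaType.Video, camera, null, null);

            var control = (IMediaControl)graph;
            control.Run();

            Console.WriteLine("Previewing, press Enter to stop.");
            Console.ReadLine();
            control.Stop();
        }
    }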
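
For the EDA roblet, converting raw XBee ADC counts into physical quantities would look something like the sketch below. The 1.2 V reference, the sensing-resistor value and the linear temperature calibration are illustrative assumptions, not the actual HIPOP calibration.

    using System;

    static class EdaConversionSketch
    {
        const double VRef = 1.2;               // assumed ADC reference voltage (V)
        const double SenseResistor = 100000.0; // assumed series resistor (ohm)

        // Raw 10-bit sample (0..1023) to volts.
        public static double ToVolts(int raw) => raw * VRef / 1023.0;

        // Hypothetical skin conductance in microsiemens, assuming the skin sits on
        // the low side of a voltage divider and the ADC reads the drop across it.
        public static double ToMicroSiemens(int raw)
        {
            double v = ToVolts(raw);
            double skinResistance = Math.Max(SenseResistor * v / Math.Max(VRef - v, 1e-6), 1.0);
            return 1e6 / skinResistance;
        }

        // Hypothetical linear calibration for the temperature channel.
        public static double ToCelsius(int raw) => ToVolts(raw) * 100.0;

        static void Main() =>
            Console.WriteLine($"{ToMicroSiemens(512):F2} uS, {ToCelsius(300):F1} C");
    }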
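
For the FACE roblet, the sketch below illustrates how an expression could be represented as a set of servo targets and sent to the motors. IServoBoard, FacialExpression and the channel numbers are hypothetical names introduced for illustration; they are not the actual FACE control library.

    using System;
    using System.Collections.Generic;

    // Hypothetical servo-board abstraction; not the actual FACE control library.
    public interface IServoBoard
    {
        void MoveServo(int channel, double position);   // position normalized to 0.0 .. 1.0
    }

    // Stand-in board that only prints the commands it would send to the motors.
    public sealed class ConsoleServoBoard : IServoBoard
    {
        public void MoveServo(int channel, double position) =>
            Console.WriteLine($"servo {channel} -> {position:F2}");
    }

    // An expression is a named map from servo channels to target positions;
    // single movements (an eye blink, a neck turn) are expressions that
    // touch only a few channels.
    public sealed class FacialExpression
    {
        private readonly Dictionary<int, double> targets;
        public string Name { get; }

        public FacialExpression(string name, Dictionary<int, double> targets)
        {
            Name = name;
            this.targets = targets;
        }

        public void Perform(IServoBoard board)
        {
            foreach (var kv in targets)
                board.MoveServo(kv.Key, kv.Value);
        }
    }

    public static class ExpressionDemo
    {
        public static void Main()
        {
            // Channel numbers and positions are made up for illustration.
            var happiness = new FacialExpression("happiness", new Dictionary<int, double>
            {
                { 0, 0.8 },   // left mouth corner
                { 1, 0.8 },   // right mouth corner
                { 4, 0.3 },   // upper eyelids
            });

            Console.WriteLine($"Performing '{happiness.Name}'");
            happiness.Perform(new ConsoleServoBoard());
        }
    }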

Future roblets will be developed and integrated with the platform:

  • EyeTracker Roblet. The Eye Tracker system consists of a child-sized cap or headband with a brim, on which a small rectangular mirror is fixed, directed towards the wearer’s eyes. An opening in the brim directs the reflection from the mirror to a small video camera attached to the top of the cap. In this way the direction of the pupils with respect to the subject’s head is constantly monitored and recorded. Furthermore, a 3-axis inertial platform provides information on the orientation of the head.
  • Operator Roblets. A series of roblets allows therapists and assistants to control different aspects of the therapy. These roblets include:
    o    Therapist GUI. A user interface developed for a tablet device allows therapists to take control of the robot during the therapy. Using a portable device, therapists are able to manually select the behaviour of the robot, including expressions, micro-movements of the neck and eyes, and separate movements of each motor.
    o    Operator GUI. A second level of control is given to an operator, who manages the behaviour of the robot through a set of parameters that will be fed into the algorithm deciding the final “mood” to be sent to the robot. The GUI allows the operator to command the robot to perform entire expressions or single movements, such as eye direction, neck movements or lip motions, through dedicated software controls.
    o    Assistant GUI. An additional GUI has been developed for therapists’ assistants, who follow the therapy sessions from outside the room. They observe the therapy from a different point of view than the psychologist conducting it, and they may need to annotate additional details or particular notes.
  • WIIRemote roblet. A Wii Remote combines built-in accelerometers and infrared detection to sense its position in 3D space when pointed at the LEDs of the Sensor Bar. It can be used to control the robot through one of the user interfaces described above. The WIIRemote roblet monitors and logs the current state and all activities of the remote controller (see the logging sketch below).
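
A logging loop of the kind the WIIRemote roblet would run could look like the sketch below, assuming the open-source WiimoteLib library; the choice of report type and of logged fields is an assumption for illustration, not the roblet’s actual implementation.

    using System;
    using WiimoteLib;

    // Minimal sketch of Wii Remote state logging with WiimoteLib (assumed library).
    class WiimoteLoggerSketch
    {
        static void Main()
        {
            var wiimote = new Wiimote();
            wiimote.WiimoteChanged += OnWiimoteChanged;

            wiimote.Connect();
            wiimote.SetReportType(InputReport.IRAccel, true);  // accelerometer + IR reports

            Console.WriteLine("Logging Wii Remote state, press Enter to stop.");
            Console.ReadLine();
            wiimote.Disconnect();
        }

        // Log every state change: one button, the acceleration vector and the first IR point.
        static void OnWiimoteChanged(object sender, WiimoteChangedEventArgs e)
        {
            var s = e.WiimoteState;
            Console.WriteLine(
                $"{DateTime.Now:HH:mm:ss.fff}  A={s.ButtonState.A}  " +
                $"acc=({s.AccelState.Values.X:F2},{s.AccelState.Values.Y:F2},{s.AccelState.Values.Z:F2})  " +
                $"ir0={(s.IRState.IRSensors[0].Found ? s.IRState.IRSensors[0].Position.ToString() : "n/a")}");
        }
    }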