Project OPZUID EmoRadar

Mentech Innovation applied for an EFRO OPZUID project grant. The grant was used to successfully develop the EmoRadar platform. This project brief describes the motivation, objectives and results of the project.

The project was financially supported by OPZUID and EFRO.

Introduction

People with dementia or an intellectual disability are often unable to express their emotional feelings and related unmet needs. Their communication skills are typically limited because of their disability, or have degraded over time. As a result, these vulnerable people are often misunderstood and therefore not treated and helped adequately. An objective determination of stress or mental imbalance in these vulnerable people will improve their quality of life and happiness, and will reduce the cost of care.

For instance, a client suffering from pain may be unable to express his discomfort to the caregiver. The caregiver notices somewhat unusual behavior but does not directly relate this behavior to pain. If the client were wearing an emotion detection system capable of measuring the presence or development of stress or pain via physiological parameters, the caregiver could take action to improve the client's situation: removing the source of the pain, distracting the client with activities (reading, sports, etc.), or starting a medication treatment.

People with dementia gradually lose their communication skills as the condition progresses. An objective determination of their mental wellbeing will help the caregiver provide the required care. A proper understanding of stress development could also improve the social cohesion between clients and caregivers. Additionally, measurements over a longer period of time could give valuable insight into the development of dementia, making the emotion detector an analytics and reference tool.

A better understanding of the emotions of people with communication restrictions will strengthen the relationship of trust between client and caregiver, increase happiness and quality of life, and lead to better care. To address this societal problem, Mentech Innovation developed the emotion detection system HUME. This system is based on the detection of physiological parameters, such as heart rate, arousal, activity, skin temperature, skin conductance and voice characteristics.

Emotions can be determined from physiological reactions: activation or arousal (for instance, an increase in heart rate), changes in activity of the autonomic nervous system (ANS), or biofeedback signals such as blood pressure, skin responses, pupillary responses, brain waves and heart responses. The emotion sensing capability is based on a wearable device that captures physiological data (such as heart rate, arousal, breathing rate, skin conductance and voice) and derives patterns and emotions via artificial intelligence algorithms.
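As a minimal illustration of the idea, a single-signal arousal proxy can be derived from heart-rate elevation above a personal resting baseline. The function name, baseline samples and scaling factor below are illustrative assumptions, not the HUME's actual model, which combines several signals:

```python
import statistics

def arousal_from_heart_rate(baseline_bpm, current_bpm, scale=30.0):
    """Simplified arousal proxy: relative heart-rate elevation above a
    personal resting baseline, clipped to the range [0, 1].
    (Illustrative only -- a real system fuses multiple bio-signals.)"""
    delta = current_bpm - baseline_bpm
    return max(0.0, min(1.0, delta / scale))

# Hypothetical resting heart-rate samples for one client.
resting = statistics.mean([62, 64, 63, 61])

# A reading of 92 bpm then maps to a high arousal score.
arousal = arousal_from_heart_rate(resting, 92)
```

A deflection below the baseline simply clips to zero, so the proxy only reacts to activation, in line with arousal being treated as an elevation above rest.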

The main objective of the EmoRadar project is to increase the quality of life and happiness of mentally disabled and demented people, in combination with the reduction of healthcare costs, by developing an emotion detection and pattern recognition toolkit with which the emotional well-being of this vulnerable target group can be determined.

The related goals were defined as:

  1.  The development of an emotion detection and pattern recognition toolkit (EmoRadar) for monitoring emotional well-being of people with intellectual disabilities and people with dementia, in order to provide better and cheaper care and increase the happiness of these vulnerable target groups in our society.
  2. The development of a software platform that can be used for a number of important market segments of Mentech Innovation, including E-health (emotion detection for mental health care, elderly care, etc.), E-commerce, Entertainment, Safety and security and IoT. The link with a device for active feedback and interaction also offers new opportunities for product development in other markets, such as serious games for increasing social coherence between users.
  3. Drawing attention to innovation and renewal for a vulnerable target group in our society.
  4. Creating a cross-over innovation and knowledge network, between Mentech Innovation on the one hand, and healthcare institutions like Severinus on the other, for strengthening the innovative power of the region.

The intended results of the EmoRadar project included:

  1. An advanced software platform for emotion detection and pattern recognition (EmoRadar), based on biofeedback and facial recognition for detecting and analyzing emotions.
  2. A working platform integrated in a test setup in which a number of sensors are built in and with which the principle of emotion detection and pattern recognition is demonstrated. This platform consists of a television display showing images with different emotional content to artificially generate emotions, an emotion detection console for measuring the generated biofeedback signals, hardware for data acquisition, and a computer running the software.
  3. Validation in a test environment of the developed signal-reduction and analysis methodology (EmoRadar). In this test environment, the software platform is tested by exposing a number of users to selected emotional content (displayed video images) for a number of elementary emotions (sad, scared, enthusiastic, etc.). During these validation tests, the body's response is measured, analyzed and translated into emotion signals.
  4. An emotion content database with results from a number of reference cases that are representative of care for people with intellectual disabilities and elderly people with dementia.

Project results

Figure 1: EmoRadar visualisation

The EmoRadar® is designed as a software platform that converts physiological data into emotions via mathematical models of human-based patterns (Figure 1). The first set of specifications for the EmoRadar® technology development was derived from the healthcare application for people with intellectual disabilities.

The EmoRadar analytics are based on real-time analysis of a number of bio-signals, analyzed via frequency and time-resolved analysis, auto- and cross-correlation functions and deep learning. The software determines a number of emotion content signals (the emotion radar) and uses metadata and reference cases for pattern recognition. In addition, a visualization tool for the real-time presentation of emotions was designed using modern web technologies.
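To make the correlation-based part of such analytics concrete, the sketch below computes a normalized autocorrelation of a synthetic heart-rate signal, one of the building blocks the text names. The signal, period and function are hypothetical examples, not the platform's actual pipeline:

```python
import numpy as np

def autocorr(x, lag):
    """Normalized autocorrelation of a 1-D signal at a given lag.
    Returns 1.0 at lag 0 by definition."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    if lag == 0:
        return 1.0
    return float(np.dot(x[:-lag], x[lag:]) / np.dot(x, x))

# Synthetic "heart rate" signal with a known periodicity of 10 samples.
t = np.arange(200)
hr = 70 + 5 * np.sin(2 * np.pi * t / 10)

peak = autocorr(hr, 10)  # near +1 at the true period
off = autocorr(hr, 5)    # near -1 at half the period (anti-phase)
```

Peaks in such correlation functions reveal the periodic structure of a bio-signal, which is one way time-resolved features can be extracted before classification.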

Figure 2: Model for emotion sensing

A model was made for the identification of emotions and stress based on physiological measurements (Figure 2). Plutchik's emotion wheel was combined with the valence-arousal methodology. Stress is identified as a state of high arousal. The first target of the software platform is to identify stress via increased arousal. In a more advanced configuration, the emotion radar will be used to identify selective emotions from the measured physiological data.
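A minimal sketch of this model: a point in the valence-arousal plane is mapped to a coarse emotion label, with stress flagged as high arousal. The threshold, the quadrant labels and the restriction of "stress" to negative valence are illustrative assumptions, not the project's actual classifier:

```python
from dataclasses import dataclass

# Illustrative threshold; the project's actual value is not published here.
AROUSAL_STRESS_THRESHOLD = 0.6

@dataclass
class EmotionState:
    valence: float  # -1 (negative) .. +1 (positive)
    arousal: float  # -1 (calm) .. +1 (excited)

def classify(state):
    """Map a valence-arousal point to a coarse emotion label.
    Stress is treated as high arousal (here combined with negative
    valence, an assumption for this sketch)."""
    if state.arousal >= AROUSAL_STRESS_THRESHOLD and state.valence < 0:
        return "stress"
    if state.valence >= 0 and state.arousal >= 0:
        return "enthusiastic"
    if state.valence < 0 and state.arousal >= 0:
        return "agitated"
    if state.valence < 0:
        return "sad"
    return "calm"
```

In this framing, the "first target" of the platform corresponds to the single arousal-threshold test, while the full quadrant mapping hints at the more advanced selective-emotion configuration.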

To improve the sensitivity and resolving capability of emotion sensing and regulation, deep learning and artificial intelligence models were developed. Models for enhanced emotion detection were based on the redundancy and plurality of body signals, to increase the reliability and selectivity of emotion detection. With smart algorithms, time-resolved information and correlations were derived from the data and used to identify and classify different emotional states. These models were integrated in the software modules for data acquisition and analysis.
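The redundancy idea can be sketched as a reliability-weighted fusion of per-sensor arousal estimates: a noisy sensor is down-weighted rather than discarded. This weighted average is a stand-in for the project's actual fusion models, and the sensor names and weights are invented for illustration:

```python
def fuse_arousal(estimates):
    """Combine per-sensor (arousal, reliability-weight) pairs into one
    score via a weighted average -- a simple stand-in for the project's
    redundancy-based fusion models."""
    total_w = sum(w for _, w in estimates)
    if total_w == 0:
        raise ValueError("no reliable sensor input")
    return sum(v * w for v, w in estimates) / total_w

# Hypothetical per-sensor estimates (value, reliability weight).
readings = [
    (0.8, 0.9),  # heart rate: high arousal, sensor well attached
    (0.7, 0.8),  # skin conductance: agrees with heart rate
    (0.1, 0.1),  # skin temperature: noisy reading, down-weighted
]
score = fuse_arousal(readings)
```

Because two reliable sensors agree on high arousal, the fused score stays high despite the conflicting low-weight temperature reading, which is exactly the robustness the redundancy of body signals is meant to provide.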

The software architecture, including the emotion detection models, was integrated in the EmoKit hardware to deliver a first version of the emotion sensing kit, the HUME. The HUME was tested in a specifically designed set-up in which sensor calibration and model validation were performed. During this validation, the user was exposed to video content with different emotional classifications to trigger a range of elementary emotions. During the course of the project, improved sensors and emotion detection models were integrated into the baseline to further improve the selectivity and sensitivity of emotion sensing.

Figure 3: Implementation of EmoRadar

From a number of measured emotion content signals, the HUME determines the emotional status of a human (or animal) subject and derives an emotion content control signal. Based on this control signal, the emotional status of the person is visualized via a traffic-light display function. A first visualization of arousal detection is shown in Figure 3.
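The traffic-light function can be sketched as a simple threshold mapping from an arousal score to a display colour. The thresholds below are illustrative assumptions, not the HUME's calibrated values:

```python
def traffic_light(arousal, amber=0.4, red=0.7):
    """Map an arousal score in [0, 1] to a traffic-light colour,
    mirroring the HUME's display function (thresholds are illustrative):
    green = calm, amber = elevated, red = high arousal / possible stress."""
    if arousal >= red:
        return "red"
    if arousal >= amber:
        return "amber"
    return "green"
```

A caregiver only needs to glance at the colour, so the design choice is to collapse the underlying control signal into three actionable states rather than expose raw numbers.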

A perceived advantage of this device is the objective quantification of the emotional status of a person. This is advantageous when 1) the need for care cannot be expressed verbally or observed from human interaction (such as facial expressions or body language), or when 2) the quantification of the emotional status augments a person's feedback interaction with the environment, for instance increased feedback interaction with a second device such as a computer or a game.

Copyright © 2018-2019, Mentech Innovation