The first version of the HUME measures stress and stress build-up in real time, using two wearables and validated models.
We develop and train the models with competent test subjects. In this setting, the test subjects experience positive and negative emotions (induced through VR, video, sound, and pain), which we annotate. We then capture the corresponding physiological responses with the HUME.
We validate the models in a national pilot study together with care partners.
The HUME is a cloud-based streaming platform that reads the wearables in real time and translates the measured physiology, using the trained and validated models, into a stress indication. This interpretation is (1) presented as a time-dependent emotion radar with which the healthcare professional can follow the stress build-up, and (2) implemented as an early-warning system in the form of an acoustic or visual signal (traffic light).
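The traffic-light step above can be sketched as follows. This is a minimal illustration, not the HUME's actual implementation: the thresholds, the `traffic_light` and `should_alert` names, and the window size are all assumptions made for the example; the real system derives its stress score from trained and validated models.

```python
# Illustrative sketch: map a model's stress score (assumed to lie in
# [0, 1]) onto a traffic-light level, and raise an early warning when
# stress keeps building up over consecutive readings. All thresholds
# are assumptions for this example, not the HUME's calibration.

def traffic_light(stress_score: float) -> str:
    """Translate a single stress score into a traffic-light level."""
    if not 0.0 <= stress_score <= 1.0:
        raise ValueError("stress score must lie in [0, 1]")
    if stress_score < 0.4:   # assumed 'calm' band
        return "green"
    if stress_score < 0.7:   # assumed 'stress building up' band
        return "amber"
    return "red"             # assumed 'acute stress' band

def should_alert(scores: list[float], window: int = 3) -> bool:
    """Early warning: the last `window` readings are all amber or red,
    i.e. stress is building up rather than spiking once."""
    recent = scores[-window:]
    return len(recent) == window and all(
        traffic_light(s) != "green" for s in recent
    )
```

For a stream of readings such as `[0.2, 0.5, 0.6, 0.8]`, the last three values are all above the calm band, so the sketch would trigger the acoustic or visual warning; a single elevated reading would not.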
Future versions of the HUME will also use voice and facial recognition to identify specific emotions, and built-in sensors to measure physiological and personal characteristics.