The goal of this project is to develop next-generation smart perception sensors and enhance the distributed intelligence paradigm to build versatile, secure, reliable, and proactive human monitoring solutions for the health, wellbeing, and automotive domains.
This project falls under the ECSEL Joint Undertaking and is co-funded by the EU H2020 programme under grant agreement 876487 and by national funding agencies in Belgium, the Czech Republic, Finland, Germany, Italy, the Netherlands, and Spain.
The NextPerception project intends to make a leap beyond the current state of the art in sensing and to achieve a higher level of services based on information obtained by observing people and their environment. We envision that this leap entails paradigm shifts at the following three conceptual levels of sensor system technology development:
Smart Perception Sensors: Advanced Radar, LiDAR and Time of Flight (ToF) sensors form the eyes and ears of the system. They will be specifically enhanced in this project to observe human behaviour and health parameters and will be combined with complementary sensors to provide all the information needed in the use cases. Their smart features enable easy integration as part of a distributed intelligent system.
Distributed Intelligence: this emerging paradigm allows for the distribution of the analytics and decision making processes across the system to optimise efficiency, performance and reliability of the system. We will develop smart sensors with embedded intelligence and facilities for communicating and synchronising with other sensors and provide the most important missing toolsets, i.e., those for programming networks of embedded systems, for explainable AI and for distributed decision making.
Proactive Behaviour and Physiological Monitoring: smart sensors and distributed intelligence will be applied to understand human behaviour and provide desired predictive and supportive functionality in the contexts of interest, while preserving users’ privacy.
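The distributed intelligence level described above can be illustrated with a minimal sketch: each smart sensor node runs its own local inference and transmits only a compact decision with a confidence value, and a lightweight fusion step combines these into a system-level decision. All names, node types, and values below are illustrative assumptions, not part of the project's actual toolset.

```python
from dataclasses import dataclass

@dataclass
class LocalDecision:
    """Inference result produced at a single smart sensor node (hypothetical)."""
    label: str         # e.g. "drowsy" or "alert"
    confidence: float  # value in [0, 1]

def fuse_decisions(decisions: list[LocalDecision]) -> str:
    """Confidence-weighted vote over the local decisions of all nodes."""
    scores: dict[str, float] = {}
    for d in decisions:
        scores[d.label] = scores.get(d.label, 0.0) + d.confidence
    return max(scores, key=scores.get)

# Each node (e.g. radar, camera, wearable) analyses its own raw data
# locally and only shares a small decision record, which keeps network
# load low and raw sensor data private to the node.
nodes = [
    LocalDecision("drowsy", 0.7),  # radar-based breathing analysis
    LocalDecision("alert", 0.4),   # camera-based eye tracking
    LocalDecision("drowsy", 0.6),  # wearable heart-rate variability
]
print(fuse_decisions(nodes))  # -> drowsy
```

The design choice illustrated here, deciding at the edge and fusing only compact results centrally, is one common way to obtain the efficiency and privacy benefits the paradigm aims for.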
- Accurate and unobtrusive sensing of human behaviour and physiological parameters by means of innovative perception and complementary sensors
- Support for proactive decision making ensuring Health, Wellbeing, and Traffic Safety by means of predictive analytics and explainable AI
- Provision of a reference platform to support the design, implementation, and management of distributed sensing and intelligence solutions
- Demonstration and validation of proactive monitoring solutions in the Health and Wellbeing and Automotive domains, including cross-sector applications
This use case plans to investigate and define smart solutions intended to mitigate the risk of accidents and their consequences for individuals and, indirectly, for society (e.g., reducing the costs of fatal and disabling injuries due to road crashes). In particular, this use case (UC) focuses on the definition of innovative sensing and identification solutions to monitor driver status and behaviour. In addition, the emotional state of the driver/user will be addressed in the specific application scenario for this UC, which considers private vehicles.
All these applications require video and audio acquisition and processing, as well as data collection, to detect the relevant information. Moreover, distributed low-power sensors can be used to improve detection accuracy and to overcome the limitations of a video-only system.
The main goal is to develop a Driver Monitoring System (DMS), which can classify both the driver’s cognitive states and the driver’s emotional states, as well as the activities and positions of occupants (including the driver) inside the vehicle cockpit. Examples of cognitive states are distraction (in all its forms), fatigue, workload, and drowsiness, while examples of emotions include anxiety, panic attack, anger/aggressiveness, and so on.
In order to achieve this, several sensors and sources of information will be considered:
Image and audio processing, computer vision: such as eye tracking, face and expression recognition, etc.
Physiological and biometric driver signals: such as heart rate, blood pressure, ECG/EEG, skin conductance, and so on.
Vehicle and driving data: such as speed, yaw rate, steering wheel, accelerator and braking behaviour, etc., acquired by ad hoc sensors and from data already available on the vehicle (CAN bus).
Traffic data: such as the speed and position of surrounding obstacles, the position of the ego-vehicle in the lane, and so forth, possibly supplied by a simulator.
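As a concrete illustration of how a DMS could fuse these heterogeneous sources, the sketch below combines three indicators, one per sensor family, into a single driver-state label using simple voting. The indicator names and thresholds are illustrative assumptions for this sketch, not validated values from the project.

```python
def classify_driver_state(perclos: float, hrv_rmssd_ms: float,
                          steering_reversals_per_min: float) -> str:
    """Toy rule-based fusion of three drowsiness indicators.

    Assumed (hypothetical) inputs:
      - perclos: fraction of time the eyes are mostly closed (camera/eye tracking)
      - hrv_rmssd_ms: heart-rate variability, RMSSD in ms (biometric sensor)
      - steering_reversals_per_min: micro-corrections per minute (CAN bus)
    Thresholds below are illustrative, not calibrated values.
    """
    drowsy_votes = 0
    if perclos > 0.15:                      # eyes closed unusually often
        drowsy_votes += 1
    if hrv_rmssd_ms < 20.0:                 # low heart-rate variability
        drowsy_votes += 1
    if steering_reversals_per_min > 12.0:   # erratic steering corrections
        drowsy_votes += 1
    # Majority vote across the three independent sensor channels.
    return "drowsy" if drowsy_votes >= 2 else "alert"

print(classify_driver_state(0.22, 15.0, 8.0))  # -> drowsy
```

In a real DMS the hand-written thresholds would typically be replaced by a trained classifier, but the structure, independent per-channel features fused into one state estimate, is the same.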