Analysis of Human Expressive Behaviour – My Placement with BLUESKEYE AI

post by Iris Jestin (2023 cohort)

Introduction

I began my placement with BLUESKEYE AI over the summer of 2024, as a Human Factors Engineering Research Assistant, working as part of their Research and Development Team, guided by my industry supervisor Michel Valstar. The placement felt like a welcome switch-up, a return to industry after a year in academia, and I was excited to get back to the familiar fast pace of working towards business goals and deliverables. Over the course of the placement, I worked on several projects that let me make valuable contributions and further my learning.

What The Company Does And Relevance To My PhD

BLUESKEYE AI is a spin-out from the University of Nottingham, based in the Sir Colin Campbell building on Jubilee Campus. They specialise in machine understanding of facial and eye behaviour, using machine learning and computer vision to detect a user's expressed emotional states. While they have different product offerings in the health and wellbeing space, I had the brilliant opportunity to work on their projects in the automotive space. Their product, B-Automotive, lets automotive customers integrate the technology into vehicles to support safe driving by detecting drivers' expressed emotional states that might be undesirable in a driving context. This felt particularly relevant to my PhD, which explores advanced vehicle technologies, including driver state monitoring systems for the ageing population in future vehicles. What I looked forward to most was the understanding I would gain of how the area my PhD explores is applied in industry. While I cannot go into the specific details and findings of the projects I carried out, as they are commercially sensitive, I hope to give a brief overview of my placement research activities and learnings in this reflection.

Research Activities

i) Familiarising myself with the company's automotive product offering, SDK, and ways of working – The first couple of weeks involved getting to know the company's ways of working and getting familiarised with their B-Auto software development kit (SDK), which analyses facial expressions to estimate dynamic expressed emotional states. I soon learnt about the Facial Action Coding System (FACS), a comprehensive, anatomically based system for describing all visually discernible facial movement. It breaks facial expressions down into individual components of muscle movement, called Action Units (AUs). A facial emotional expression (FEE), at its most basic level, is a combination of AUs: each AU is a distinct muscle movement, and combinations of these units represent different expressed emotions. BLUESKEYE worked to understand which AUs tend to co-occur, what the combinations signify, and how to label them accurately. This involves identifying combinations of active AUs that correspond to an expressed emotion, which in turn may indicate a felt emotion.
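To make the idea concrete, here is a minimal illustrative sketch in Python (my own simplification, not BLUESKEYE's implementation): a handful of textbook AU combinations are checked against the set of AUs detected as active in a frame to suggest a candidate expressed emotion. The specific AU numbers are standard FACS examples rather than anything from the company's system.

```python
# Illustrative sketch only: a simplified AU-to-emotion lookup, not BLUESKEYE's model.
# Each expressed emotion is represented by a set of Action Units (AUs) commonly
# associated with it in the FACS literature.
EMOTION_AU_PATTERNS = {
    "happiness": {6, 12},        # cheek raiser + lip corner puller
    "sadness":   {1, 4, 15},     # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  {1, 2, 5, 26},  # brow raisers + upper lid raiser + jaw drop
}

def candidate_emotions(active_aus: set[int]) -> list[str]:
    """Return the emotions whose defining AU combination is fully active."""
    return [emotion for emotion, pattern in EMOTION_AU_PATTERNS.items()
            if pattern <= active_aus]

# Example: AUs detected as active in a single video frame.
print(candidate_emotions({6, 12, 25}))  # -> ['happiness']
```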

ii) Familiarising myself with existing data – During the first part of the placement, I had the opportunity to play around with their existing in-car video data of drivers, which captures facial behaviour, eye gaze behaviour, head movement and yawns. Each driver had a driving video in the morning and one in the evening after work. I carried out some early analysis in Python to explore trends in the behavioural data, morning versus evening, that might indicate fatigue. The behavioural data included eye blink rate, eye blink velocity, eye saccade velocity (the speed at which the eye moves between fixation points), head movement angle and head movement velocity. This phase mostly prompted me to get stuck back into Python after not having used it in a while, and helped me familiarise myself with the type of data the company worked with. The analysis also helped me understand the importance of accounting for individual differences before drawing conclusions from behavioural data.
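As a flavour of what that exploration looked like, the sketch below is my own simplified reconstruction, with hypothetical column names and made-up values rather than the company's actual data schema. It compares each driver's mean blink rate between the morning and evening sessions, baselining against that driver's own morning values so that individual differences are not mistaken for fatigue.

```python
import pandas as pd

# Hypothetical schema: one row per analysis window, with the driver ID,
# the session ("morning" or "evening") and a blink-rate measurement.
df = pd.DataFrame({
    "driver":     ["d1", "d1", "d1", "d1", "d2", "d2", "d2", "d2"],
    "session":    ["morning", "morning", "evening", "evening"] * 2,
    "blink_rate": [14.0, 15.0, 19.0, 21.0, 10.0, 11.0, 12.0, 13.0],
})

# Mean blink rate per driver per session.
per_driver = df.groupby(["driver", "session"])["blink_rate"].mean().unstack()

# Within-driver change from morning to evening: comparing each driver against
# their own morning baseline accounts for individual differences.
per_driver["evening_minus_morning"] = per_driver["evening"] - per_driver["morning"]
print(per_driver)
```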

iii) Driving fatigue study – As part of a study evaluating the BLUESKEYE technology that detects driving fatigue, I created the information sheet and consent form used in the study. I also conducted a literature review to define sleepiness and fatigue. This involved identifying, from the literature, facial, eye-behavioural and physiological indicators of fatigue that BLUESKEYE may not yet have explored, documenting how these indicators have been measured, and noting their typical rates when a person is fatigued. In keeping with my PhD's area of focus, I was also able to find literature exploring how some of these indicators of fatigue vary between older and younger drivers.

iv) Data analysis of in-house data – The last phase of the placement involved data analysis in Python on in-house data, to find trends within a specific parameter, the driver's head turn, that may indicate fatigue while driving. These were checked against a questionnaire the driver filled in to self-report their fatigue. Different types of head turn were identified from the video data, and the speed and angle of each head turn were then compared across head-turn types for low versus high self-reported fatigue. This was used to identify patterns in the head-turn data corresponding to high or low fatigue that could potentially be built into the technology's detection.
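The comparison itself boils down to a grouped summary. The sketch below is a simplified illustration with hypothetical column names and synthetic values, not the in-house data or BLUESKEYE's pipeline: head-turn events are grouped by turn type and by low versus high self-reported fatigue, and the mean speed and angle are compared across groups.

```python
import pandas as pd

# Hypothetical schema: one row per detected head-turn event.
events = pd.DataFrame({
    "turn_type":     ["mirror_check", "mirror_check", "side_glance", "side_glance"] * 2,
    "fatigue_level": ["low", "high"] * 4,  # binned from the self-report questionnaire
    "speed_deg_s":   [80.0, 55.0, 95.0, 60.0, 85.0, 50.0, 100.0, 65.0],
    "angle_deg":     [35.0, 30.0, 60.0, 45.0, 38.0, 28.0, 62.0, 48.0],
})

# Mean head-turn speed and angle for each turn type, split by self-reported fatigue.
summary = events.groupby(["turn_type", "fatigue_level"])[["speed_deg_s", "angle_deg"]].mean()
print(summary)
```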

Conclusion And Next Steps For Continued Collaboration

This placement has been an invaluable experience, giving me a snapshot of what working in industry in the area of my PhD research would entail. Given the close alignment between the PhD and some of the company's projects, we decided to meet with the R&D team every couple of months to identify opportunities for continued collaboration. Further, the experimental study in my PhD may use BLUESKEYE's technology in some capacity. I close with a special thank you to Mani and Adrian from BLUESKEYE's R&D team, without whom much of my learning would not have been possible.