Real-time head-mounted eye tracking interactions

ArGaze enabled a cognitive assistant to support a pilot's situation awareness.

The following use case integrates the ArUco marker pipeline to map the pilot's gaze onto multiple cockpit instruments in real time, and the gaze analysis pipeline to match the resulting fixations with AOIs.
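As a rough illustration of the AOI fixation matching principle (plain Python, not ArGaze's actual API), a fixation centroid can be tested against each instrument's polygon in scene camera pixel coordinates; the AOI polygons below are made-up placeholders.

from matplotlib.path import Path

# Hypothetical AOI polygons in scene camera pixel coordinates (x, y).
aois = {
    "PFD": Path([(100, 200), (300, 200), (300, 400), (100, 400)]),
    "ECAM": Path([(350, 200), (550, 200), (550, 400), (350, 400)]),
}

def match_fixation(centroid):
    """Return the name of the AOI containing the fixation centroid, if any."""
    for name, polygon in aois.items():
        if polygon.contains_point(centroid):
            return name
    return None

print(match_fixation((200, 300)))  # -> PFD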

Background

The HAIKU project aims to pave the way for human-centric intelligent assistants in the aviation domain by supporting, among other things, the pilot during startle or surprise events. One of the features the assistant provides through ArGaze is situation awareness support, which ensures the pilot updates his awareness of the aircraft state by monitoring his gaze and the flight parameters. When this support is active, relevant information is highlighted on the Primary Flight Display (PFD) and the Electronic Centralized Aircraft Monitor (ECAM).

Figure: SA alert

Environment

Due to the complexity of the cockpit simulator's geometry, the pilot's eyes were tracked with a head-mounted eye tracker (Tobii Pro Glasses 2). Gaze data and the scene camera video were captured through the Tobii SDK and processed in real time on an NVIDIA Jetson Xavier computer. ArUco markers were placed at various locations within the cockpit simulator so that several of them were always visible in the field of view of the eye tracker's scene camera.
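As an aside, the Glasses 2 unit also exposes its scene camera as an RTSP stream that OpenCV can read directly; the following minimal sketch shows this alternative capture path rather than the Tobii SDK capture used in the project, and the stream address is only an assumed default.

import cv2

# Assumed default address of the Glasses 2 scene camera stream; adjust to your network.
SCENE_STREAM = "rtsp://192.168.71.50:8554/live/scene"

capture = cv2.VideoCapture(SCENE_STREAM)

while capture.isOpened():
    ok, frame = capture.read()
    if not ok:
        break
    # Each frame would be handed to the ArUco marker pipeline here.
    cv2.imshow("scene camera", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

capture.release()
cv2.destroyAllWindows()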

Figure: SimOne cockpit

The ArUco marker pipeline enabled real-time gaze mapping onto the multiple screens and panels around the pilot-in-command position, while the gaze analysis pipeline identified fixations and matched them with the dynamic AOIs related to each instrument. To identify the relevant AOIs, a 3D model of the cockpit describing the AOIs and the marker positions was built.
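The sketch below illustrates the principle behind this mapping with plain OpenCV rather than ArGaze's API: markers whose 3D positions are known from the cockpit model give the scene camera pose, and AOI vertices are then projected into the image. The calibration values, marker places and PFD polygon are made-up placeholders.

import cv2
import numpy as np

# Placeholder camera calibration and cockpit model data.
camera_matrix = np.array([[900., 0., 640.], [0., 900., 360.], [0., 0., 1.]])
dist_coeffs = np.zeros(5)
marker_corners_3d = {  # marker id -> its 4 corners in cockpit coordinates (meters)
    0: np.array([[0., 0., 0.], [.05, 0., 0.], [.05, .05, 0.], [0., .05, 0.]]),
}
pfd_aoi_3d = np.array([[.1, 0., 0.], [.3, 0., 0.], [.3, .15, 0.], [.1, .15, 0.]])

detector = cv2.aruco.ArucoDetector(
    cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50),
    cv2.aruco.DetectorParameters())

def project_pfd_aoi(frame):
    """Detect markers, estimate the camera pose and project the PFD AOI."""
    corners, ids, _ = detector.detectMarkers(frame)
    if ids is None:
        return None
    # Gather 3D/2D correspondences from every detected marker of the model.
    object_points, image_points = [], []
    for marker_id, marker_corners in zip(ids.flatten(), corners):
        if marker_id in marker_corners_3d:
            object_points.append(marker_corners_3d[marker_id])
            image_points.append(marker_corners.reshape(4, 2))
    if not object_points:
        return None
    ok, rvec, tvec = cv2.solvePnP(
        np.concatenate(object_points), np.concatenate(image_points),
        camera_matrix, dist_coeffs)
    if not ok:
        return None
    # AOI corners in scene camera pixel coordinates, ready for fixation matching.
    projected, _ = cv2.projectPoints(
        pfd_aoi_3d, rvec, tvec, camera_matrix, dist_coeffs)
    return projected.reshape(4, 2)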

Figure: ArUco markers and AOI scene

Finally, fixation events were sent in real time through the Ivy bus middleware to the situation awareness software in charge of displaying attention getters on the PFD screen.
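A minimal sketch of this step with the ivy-python package follows; the agent name, bus address and message format are assumptions, not the project's actual protocol.

import time

from ivy.std_api import IvyInit, IvyStart, IvySendMsg

IvyInit("ArGazeFixationSender", "ArGazeFixationSender READY")  # hypothetical agent name
IvyStart("127.255.255.255:2010")  # assumed broadcast address of the Ivy bus
time.sleep(1.0)  # give subscribers time to connect before sending

def send_fixation(aoi_name, duration_ms):
    """Broadcast a matched fixation so the SA support can react to it."""
    IvySendMsg(f"FixationEvent aoi={aoi_name} duration={duration_ms}")

send_fixation("PFD", 320)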

Setup

The setup to integrate ArGaze into the experiment is defined by three main files, detailed in the next chapters.

As with any ArGaze setup, it is loaded by executing the load command:

python -m argaze load live_streaming_context.json

This command opens a GUI window that allows the user to start gaze calibration, launch recording, and monitor gaze mapping. Another window displays gaze mapping onto the PFD screen.

Figure: ArGaze load GUI for HAIKU