
Adaptation

Eye Interaction

EYE tracking systems for incapacity robustness in aeronautics (EYE-INTERACTION)

Although aviation is one of the safest means of transportation, maximum safety and performance can only be achieved with pilot interfaces that are coherent with the human cognitive system, so-called “neuropsychologically compatible” interfaces. The EYE-INTERACTION project aims to advance the development of products that combat incapacitation, a state during which an individual’s cognitive faculties are temporarily impaired, for example due to the significant psychological stress associated with a dangerous situation. To this end, the project builds on the results of the ANR ASTRID NEUROERGO project, which identified physiological markers of incapacitation, notably based on the combination of functional Near-Infrared Spectroscopy (fNIRS) and heart rate measurement. The project’s main objectives are:

  • propose a standardized battery of incapacitation tasks and cognitive tests
  • characterize incapacitation from a psychophysiological point of view
  • develop an ocular replay tool

The EYE-INTERACTION partners are ISAE-SUPAERO, INUC (Albi University), Safetyn, CEAM, and ENAC.



This project has received funding from the ANR (Agence Nationale de la Recherche) as part of the Accompagnement Spécifique des Travaux de Recherches et d’Innovation Défense (ASTRID) Maturation programme.


STRESS

Human Performance neurometrics toolbox for highly automated systems design

The STRESS project supports the transition to higher levels of automation in aviation by addressing, analysing, and mitigating its impact on the Human Performance aspects associated with the future role of Air Traffic Controllers (ATCOs).


Project website


This project has received funding from the SESAR Joint Undertaking under grant agreement No 699381, under the European Union’s Horizon 2020 research and innovation programme.


NINA

Neurometrics Indicators for ATM

Can ATCOs’ cognitive states be described and classified by monitoring relevant neurometric and neurophysiological parameters? Can this classification be used to design adaptive interfaces?

Project website
