Live processing pipeline

The pipeline processes camera images and gaze data to enable gaze mapping and gaze analysis.

live_processing_pipeline.json

For this use case, ArUco markers need to be detected to enable gaze mapping: ArGaze provides the ArUcoCamera class to set up an ArUco marker pipeline.

{
    "argaze.ArUcoMarker.ArUcoCamera.ArUcoCamera": {
        "name": "Camera",
        "size": [1920, 1080],
        "aruco_detector": {
            "dictionary": "DICT_APRILTAG_16h5",
            "optic_parameters": "optic_parameters.json",
            "parameters": "detector_parameters.json"
        },
        "gaze_movement_identifier": {
            "argaze.GazeAnalysis.DispersionThresholdIdentification.GazeMovementIdentifier": {
                "deviation_max_threshold": 25,
                "duration_min_threshold": 150
            }
        },
        "filter_in_progress_identification": false,
        "scenes": {
            "Cockpit": {
                "aruco_markers_group": "aruco_scene.obj",
                "layers": {
                    "Main" : {
                        "aoi_scene": "Cockpit.obj"
                    }
                },
                "frames": {
                    "PIC_PFD": {
                        "size": [960, 1080],
                        "background": "PIC_PFD.png",
                        "layers": {
                            "Main": {
                                "aoi_scene": "PIC_PFD.svg"
                            }
                        },
                        "image_parameters": {
                            "background_weight": 1,
                            "draw_gaze_positions": {
                                "color": [0, 255, 255],
                                "size": 15
                            }
                        }
                    }
                }
            }
        },
        "layers": {
            "Main": {
                "aoi_matcher": {
                    "argaze.GazeAnalysis.DeviationCircleCoverage.AOIMatcher": {
                        "coverage_threshold": 0.25
                    }
                }        
            }
        },
        "image_parameters": {
            "background_weight": 1,
            "draw_gaze_positions": {
                "color": [0, 255, 255],
                "size": 4
            },
            "draw_detected_markers": {
                "color": [0, 255, 0],
                "draw_axes": {
                    "thickness": 4
                }
            },
            "draw_fixations": {
                "deviation_circle_color": [255, 127, 255],
                "duration_border_color": [127, 0, 127],
                "duration_factor": 1e-2
            },
            "draw_layers": {
                "Main": {
                    "draw_aoi_scene": {
                        "draw_aoi": {
                            "color": [0, 255, 255],
                            "border_size": 1
                        }
                    },
                    "draw_aoi_matching": {
                        "update_looked_aoi": true,
                        "draw_matched_fixation": {
                            "deviation_circle_color": [255, 255, 255],
                            "draw_positions": {
                                "position_color": [0, 255, 0],
                                "line_color": [0, 0, 0]
                            }
                        },
                        "draw_matched_region": {
                            "color": [0, 255, 0],
                            "border_size": 4
                        },
                        "draw_looked_aoi": {
                            "color": [0, 255, 0],
                            "border_size": 2
                        },
                        "looked_aoi_name_color": [255, 255, 255],
                        "looked_aoi_name_offset": [0, -10]
                    }
                }
            }
        },
        "observers": {
            "observers.ArUcoCameraLogger": {},
            "argaze.utils.UtilsFeatures.LookPerformanceRecorder": {
                "path": "_export/look_performance.csv"
            },
            "argaze.utils.UtilsFeatures.WatchPerformanceRecorder": {
                "path": "_export/watch_performance.csv"
            }
        }
    }
}
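
To give an idea of how this configuration is used at runtime, here is a minimal Python sketch. It is only an illustration: the argaze.load helper, the exact method signatures and the image and gaze sources are assumptions to be checked against the ArGaze pipeline loading documentation; watch and look are the ArUcoCamera methods whose performance is logged at the end of this chapter.

# Minimal sketch, not the actual use case code: the loading helper, the method
# signatures and the data sources below are assumptions.
import argaze
from argaze import GazeFeatures

import numpy

# Load the ArUcoCamera pipeline from its JSON description (assumed helper name).
aruco_camera = argaze.load('live_processing_pipeline.json')

# Stand-ins for the Tobii Pro Glasses 2 live streams.
timestamp = 0                                              # milliseconds
image = numpy.zeros((1080, 1920, 3), dtype=numpy.uint8)    # scene camera frame
gaze_position = GazeFeatures.GazePosition((960, 540))      # gaze in image pixels

# Detect ArUco markers and project the scene AOI into the camera frame.
aruco_camera.watch(image, timestamp=timestamp)

# Map the gaze position into the camera frame, its scene frames and their AOI.
aruco_camera.look(gaze_position, timestamp=timestamp)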

All the files mentioned above are described below.

The ArUcoCameraLogger observer object is defined in the observers.py file, which is described in the next chapter.

optic_parameters.json

This file defines the Tobii Pro Glasses 2 scene camera optic parameters, which have been calculated as explained in the camera calibration chapter.

{
    "rms": 0.6688921504088245,
    "dimensions": [
        1920,
        1080
    ],
    "K": [
        [
            1135.6524381415752,
            0.0,
            956.0685325355497
        ],
        [
            0.0,
            1135.9272506869524,
            560.059099810324
        ],
        [
            0.0,
            0.0,
            1.0
        ]
    ],
    "D": [
        0.01655492265003404,
        0.1985524264972037,
        0.002129965902489484,
        -0.0019528582922179365,
        -0.5792910353639452
    ]
}
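
These fields follow OpenCV's camera model conventions: K is the 3x3 camera matrix, D the distortion coefficients and rms the reprojection error reported by the calibration. As a standalone illustration (the pipeline reads this file itself), the values can be used directly with OpenCV, for instance to undistort a scene camera image:

# Illustration only: how the optic parameters map to OpenCV's camera model.
import json

import cv2
import numpy

with open('optic_parameters.json') as file:
    optic = json.load(file)

K = numpy.array(optic['K'])    # 3x3 camera matrix (fx, fy, cx, cy)
D = numpy.array(optic['D'])    # distortion coefficients (k1, k2, p1, p2, k3)
width, height = optic['dimensions']

# Undistort a (here synthetic) scene camera image with these parameters.
image = numpy.zeros((height, width, 3), dtype=numpy.uint8)
undistorted = cv2.undistort(image, K, D)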

detector_parameters.json

This file defines the ArUco detector parameters, as explained in the detection improvement chapter.

{
    "adaptiveThreshConstant": 7,
    "useAruco3Detection": true
}
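
Each key mirrors an attribute of OpenCV's cv2.aruco.DetectorParameters, which the ArUco detection relies on. The snippet below is only an equivalence illustration, not ArGaze code (it assumes OpenCV 4.7 or later for useAruco3Detection and ArucoDetector):

# Illustration only: the JSON keys mirror cv2.aruco.DetectorParameters attributes.
import cv2

parameters = cv2.aruco.DetectorParameters()
parameters.adaptiveThreshConstant = 7    # adaptive thresholding constant
parameters.useAruco3Detection = True     # enable the faster ArUco3 detection path

detector = cv2.aruco.ArucoDetector(
    cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_APRILTAG_16h5),
    parameters)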

aruco_scene.obj

This file defines where the ArUco markers are placed in the cockpit geometry. Marker positions have been edited in Blender from a 3D scan of the cockpit, then exported in OBJ format.

# Blender v3.0.1 OBJ File: 'scene.blend'
# www.blender.org
o DICT_APRILTAG_16h5#2_Marker
v -2.300000 18.573788 -49.271420
v 2.700000 18.573788 -49.271420
v -2.300000 23.028820 -51.541370
v 2.700000 23.028820 -51.541370
s off
f 1 2 4 3
o DICT_APRILTAG_16h5#3_Marker
v 37.993317 9.909389 -42.172752
v 42.993317 9.909389 -42.172752
v 37.993317 14.364422 -44.442703
v 42.993317 14.364422 -44.442703
s off
f 5 6 8 7
o DICT_APRILTAG_16h5#11_Marker
v -27.600000 29.075905 -51.042164
v -24.400000 29.075905 -51.042164
v -27.600000 31.927124 -52.494930
v -24.400000 31.927124 -52.494930
s off
f 9 10 12 11
o DICT_APRILTAG_16h5#14_Marker
v -27.280746 14.890414 -43.814297
v -24.080746 14.890414 -43.814297
v -27.280746 17.741634 -45.267063
v -24.080746 17.741634 -45.267063
s off
f 13 14 16 15
o DICT_APRILTAG_16h5#21_Marker
v 8.939880 28.459042 -50.445347
v 12.139881 28.459042 -50.445347
v 8.939880 31.310265 -51.898113
v 12.139881 31.310265 -51.898113
s off
f 17 18 20 19
o DICT_APRILTAG_16h5#22_Marker
v 8.939880 21.949581 -47.128613
v 12.139881 21.949581 -47.128613
v 8.939880 24.800800 -48.581379
v 12.139881 24.800800 -48.581379
s off
f 21 22 24 23
o DICT_APRILTAG_16h5#13_Marker
v -12.126360 14.872046 -43.804939
v -8.926359 14.872046 -43.804939
v -12.126360 17.723267 -45.257706
v -8.926359 17.723267 -45.257706
s off
f 25 26 28 27
o DICT_APRILTAG_16h5#12_Marker
v -43.079227 14.890414 -43.814297
v -39.879230 14.890414 -43.814297
v -43.079227 17.741634 -45.267063
v -39.879230 17.741634 -45.267063
s off
f 29 30 32 31
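
Each o entry names the marker it describes, following a DICTIONARY#IDENTIFIER_Marker pattern, and the four v lines below it give the marker corners in the cockpit 3D coordinate system. The standalone snippet below (ArGaze parses this file itself) shows how the marker identifiers can be read back from the object names:

# Illustration only: list the dictionary and identifier of each marker
# declared in aruco_scene.obj (ArGaze parses this file by itself).
import re

MARKER_NAME = re.compile(r'^o (?P<dictionary>.+)#(?P<identifier>\d+)_Marker$')

with open('aruco_scene.obj') as file:
    for line in file:
        match = MARKER_NAME.match(line.strip())
        if match:
            print(match.group('dictionary'), int(match.group('identifier')))

# Expected output: DICT_APRILTAG_16h5 markers 2, 3, 11, 14, 21, 22, 13 and 12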

Cockpit.obj

This file defines where the AOI are placed in the cockpit geometry. AOI positions have been edited in Blender from a 3D scan of the cockpit, then exported in OBJ format.

# Blender v3.0.1 OBJ File: 'scene.blend'
# www.blender.org
o PIC_PFD
v -43.208000 32.020378 -52.542446
v -26.000000 32.020378 -52.542446
v -43.208000 14.779404 -43.757732
v -26.000000 14.779404 -43.757732
s off
f 3 4 2 1
o ECAM_Engine_Fuel_Flaps
v 8.657453 16.194618 -44.196308
v 27.672760 16.055838 -44.125595
v 8.657453 31.527327 -52.008713
v 27.672760 31.441055 -51.964756
s off
f 5 6 8 7
o AP_ATHR_Plan.033
v 16.653587 46.982643 -32.403645
v 21.580402 46.974689 -32.399593
v 16.653587 52.562916 -35.246937
v 21.580402 52.554958 -35.242882
s off
f 9 10 12 11
o Exterior_Left
v -69.756531 46.523575 -40.193161
v 18.876167 46.523575 -55.821495
v -69.756531 87.247131 -40.193161
v 18.876167 87.247131 -55.821495
s off
f 13 14 16 15

PIC_PFD.png

This file is a screenshot of the PFD screen, used as the frame background to monitor where the gaze is projected after gaze mapping.

PFD frame background

PIC_PFD.svg

This file defines where the AOI are placed in the PFD frame. AOI positions have been edited with Inkscape from a screenshot of the PFD screen, then exported in SVG format.

<svg>
    <rect id="PIC_PFD_Air_Speed" x="93.228" y="193.217" width="135.445" height="571.812"/>
    <rect id="PIC_PFD_Altitude" x="686.079" y="193.217" width="133.834" height="571.812"/>
    <rect id="PIC_PFD_FMA_Mode" x="93.228" y="85.231" width="772.943" height="107.986"/>
    <rect id="PIC_PFD_Heading" x="228.673" y="765.029" width="480.462" height="139.255"/>
    <rect id="PIC_PFD_Attitude" x="228.673" y="193.217" width="457.406" height="571.812"/>
    <rect id="PIC_PFD_Vertical_Speed" x="819.913" y="193.217" width="85.185" height="609.09"/>
</svg>
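
Each rect element defines a rectangular AOI in the 960x1080 PIC_PFD frame, identified by its id attribute. As a standalone illustration (the frame layer loads this file itself), the AOI rectangles can be read back with a few lines of Python:

# Illustration only: read the AOI rectangles declared in PIC_PFD.svg
# (ArGaze loads this file through the PIC_PFD frame layer configuration).
import xml.etree.ElementTree as ElementTree

for rect in ElementTree.parse('PIC_PFD.svg').getroot().iter('rect'):
    x, y = float(rect.get('x')), float(rect.get('y'))
    width, height = float(rect.get('width')), float(rect.get('height'))
    print(f"{rect.get('id')}: ({x}, {y}) {width}x{height}")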

look_performance.csv

This file contains logs of the ArUcoCamera.look method execution information. It is saved into an _export folder in the directory from which the load command is launched.

On a Jetson Xavier computer, the look method execution time is 5.7ms and it is called ~100 times per second.

watch_performance.csv

This file contains logs of the ArUcoCamera.watch method execution information. It is saved into an _export folder in the directory from which the load command is launched.

On a Jetson Xavier computer with CUDA acceleration, the watch method execution time is 46.5ms and it is called more than 12 times per second.