Virtual-Eye - Learning from human eye scanpaths for optimal autonomous search

Alternative title: Virtual-Eye

Awarded: NOK 11.8 mill.

Novel search algorithms inspired by eye-gaze trajectories have a range of potential applications in autonomous robotic search where search time is critical, such as target detection in surveillance and visual scanning for self-driving cars. Mathematical research tends to overlook the fact that the eyes are intrinsically linked with cognitive processes; eye movements are often treated as a simple means of foraging for visual information. That is, however, a crude simplification: human scan-paths reflect more than a strategy for foraging visual information, and they appear to be more complex than intermittent search processes and other search strategies such as Lévy flights. Both of these mathematical processes seem universal in explaining foraging behaviour in systems as diverse as the hunting movements of albatrosses and sharks, the motion of swimming bacteria, and the exploration of the Walt Disney Resort by children. Yet, paradoxically, the debate about their general form is still ongoing. Our hypothesis in this "Virtual-Eye" project is that some of the added complexity of gaze trajectories, compared to Lévy flights or intermittent processes, is designed to optimize the search for visual information. Thus, the ability to model, understand and mimic the intrinsic dynamics of human eye-gaze trajectories holds great promise for revolutionising the field of autonomous search. This would mark a new era in autonomous-search research, which has been dominated by search algorithms based on Lévy flights; these are crude simplifications of human eye movements and are not optimal in real-life settings. To model gaze trajectories, we will follow two main families of approaches: stochastic modelling and Artificial Intelligence (AI) based modelling.
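
As a concrete illustration of the Lévy-flight search strategy referred to above, the short Python sketch below generates a two-dimensional Lévy-flight trajectory with power-law-distributed step lengths and uniformly random step directions. The function name, the tail exponent mu and the unit minimum step length are illustrative choices for this sketch, not part of the project description.

import numpy as np

def levy_flight_2d(n_steps, mu=2.0, x0=(0.0, 0.0), rng=None):
    # 2D Levy flight: step lengths l >= 1 follow a power law p(l) ~ l**(-mu)
    # (with 1 < mu <= 3); step directions are uniform on [0, 2*pi).
    rng = np.random.default_rng() if rng is None else rng
    # Inverse-transform sampling of the Pareto-distributed step lengths.
    lengths = (1.0 - rng.random(n_steps)) ** (-1.0 / (mu - 1.0))
    angles = rng.uniform(0.0, 2.0 * np.pi, n_steps)
    steps = np.column_stack((lengths * np.cos(angles),
                             lengths * np.sin(angles)))
    positions = np.asarray(x0) + np.cumsum(steps, axis=0)
    return np.vstack((np.asarray(x0), positions))

# Example: a 1000-step trajectory with mu = 2, the exponent often reported
# as optimal for sparse, revisitable targets.
trajectory = levy_flight_2d(1000, mu=2.0)

An intermittent-search variant would alternate such ballistic relocations with slow local scanning phases, which is the other classical baseline against which gaze trajectories are compared.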

Funding scheme:

FRINATEK – Independent projects: mathematics, natural sciences and technology