MP: SentiSystems - the brains of autonomous operation

Alternative title (Norwegian): SentiSystems - lillehjernen til autonom drift ("the cerebellum of autonomous operation")

Awarded: NOK 0.49 mill.

Project Number:


Project Period:

2020 - 2021

Funding received from:

SentiSystems is particularly useful for automated, unmanned and autonomous platforms such as mobile robots, drones, and unmanned underwater and surface vessels. Example use cases include drone-based photogrammetry, airborne laser-based 3D mapping and modelling, robotic agriculture, autonomous cars, air- and ship-based hyperspectral image surveying, robot and machine-vision integration tasks, and drone and underwater inspection.

The NTNU innovation SentiBoard (hardware), together with the SentiFusion software, enables highly accurate time synchronization of data from different sensors (imaging, motion, position and velocity sensors from a variety of manufacturers) with micro- to millisecond accuracy. By assigning a specific time-of-validity to each sensor measurement and packing the measurements into a common format, SentiBoard mitigates the largest error source for sensor fusion in fast-moving, highly dynamic applications. To realize the full value of high-resolution, high-accuracy sensors, the data from a multi-sensor system must be processed and synchronized with very high accuracy and speed. SentiSystems delivers a homogeneous and comprehensive data set from multi-sensor systems as a starting point for autonomy intelligence.
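The core idea, stamping each measurement with a time-of-validity and merging all sensor streams into one common, time-ordered format, can be illustrated with a minimal sketch. This is not SentiSystems code; the `Measurement` type, sensor names, and sample rates below are illustrative assumptions.

```python
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class Measurement:
    # Time-of-validity in seconds: the instant the physical quantity was
    # actually sampled, not when the packet arrived at the host.
    time_of_validity: float
    sensor_id: str = field(compare=False)
    payload: tuple = field(compare=False)

def merge_streams(*streams):
    """Merge per-sensor streams (each already sorted by time-of-validity)
    into a single time-ordered stream that a fusion filter can consume."""
    return list(heapq.merge(*streams, key=lambda m: m.time_of_validity))

# Hypothetical example: an IMU at 100 Hz and a camera at 10 Hz,
# both timestamped against the same synchronized clock.
imu = [Measurement(t / 100.0, "imu", (0.0, 0.0, 9.81)) for t in range(5)]
cam = [Measurement(t / 10.0, "cam", ("frame",)) for t in range(2)]

merged = merge_streams(imu, cam)
```

On a fast-moving platform, fusing measurements by arrival time instead of time-of-validity misattributes each sample to the wrong vehicle pose, which is the error source the hardware timestamping addresses.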