

OceanEye - All-weather, high-precision intelligent payload for sea surface object detection

Alternative title: OceanEye - intelligent presisjonssensor for deteksjon av objekter på havoverflaten

Awarded: NOK 8.0 mill.

The OceanEye project has researched and developed a sensor and data acquisition system for Unmanned Aircraft Systems (UAS) that gives the ability to find and identify small floating objects on the sea surface. The system enables systematic search, data capture and automatic data analysis from UAS in harsh Northern weather conditions. The system is primarily motivated by an industrial need from the seismic industry, but a system that is able to find small objects floating on the sea surface will have a wide range of market opportunities in applications such as search-and-rescue, ship surveillance/military, oil-spill detection and ice-management operations in the high north.

Autonomous systems are outlined as a decisive capacity for monitoring and controlling the ocean space. From a flying drone, it is typically floating or sailing objects in the ocean that are of interest. In the search phase we would like the sensor to cover every area of the surface, but the big challenge is that almost all of the imagery is likely to contain nothing but sea. For a 24-hour surveillance service, manually sifting through such image material is an inhumane task. In practice, low-bandwidth SatCom at high prices (e.g., Iridium) will also prevent transmission of raw sensor data. We therefore need an autonomous sensor system that is economically sound to integrate on UAS, that itself detects specific objects, and that prioritizes what should be sent back to the operator station. In areas where communication bandwidth is limited, the UAS could benefit from a higher level of autonomy, and this sensor system could be used to guide the UAS into, for example, a circling pattern over a sea-surface object of interest.

In this project, Maritime Robotics, SINTEF, NTNU, NORCE and PGS have been working together on the technical design and prototyping of the UAS sensor system. In 2019, initial flight tests were done to start obtaining system requirements. Data from these flights was used in 2020 for initial explorations into machine learning methods for object detection at sea (Arnegaard et al., 2021). In early 2020, a workshop was held to bring together all project partners to discuss sensor selection and architecture. Thereafter, the design phase was taken further by creating documents describing the overall architecture and the plans for sensor configurations. A set of sensors was acquired and combined with NTNU's Sentiboard to create a sensor pod for UAS. Maritime Robotics worked on putting this new sensor pod together, also designing hardware for mounting it on the UAS. In the meantime, SINTEF and NTNU worked on detection and classification methods, developing a first approach to object detection and to presenting the data to operators. In 2021, field-test preparation was done by all project partners, and Maritime Robotics executed more field tests, now with the new sensor pod. Initial data analysis was done by SINTEF, NTNU and Maritime Robotics, in preparation for further analysis and testing. SINTEF also worked further on methods for presenting the results to UAS operators. In summer 2021, Maritime Robotics and SINTEF also started developing anomaly detection approaches for sea-surface object detection. This work will be taken further to compare different approaches in terms of detection performance and real-time processing performance. In preparation for field testing and on-board processing, NORCE is contributing with Cryocore to enable effective strategies for sharing data from the UAS to the operator.
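As a concrete illustration of the on-board prioritisation idea, the sketch below shows, in Python, how detections could be ranked by confidence and selected to fit an assumed SatCom byte budget before anything is transmitted. The Detection structure, byte budget and chip sizes are hypothetical choices for the example and are not taken from the project's Cryocore interface.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    confidence: float   # detector score in [0, 1]
    lat: float          # estimated position of the object
    lon: float
    thumbnail: bytes    # cropped image chip around the detection

def prioritise_downlink(detections, budget_bytes):
    """Select detections to transmit, highest confidence first, until the
    assumed per-pass byte budget of the SatCom link is used up."""
    selected, used = [], 0
    for det in sorted(detections, key=lambda d: d.confidence, reverse=True):
        if used + len(det.thumbnail) > budget_bytes:
            continue  # this chip no longer fits; smaller items later in the list may still fit
        selected.append(det)
        used += len(det.thumbnail)
    return selected

# Example: with room for only two 8 kB chips, the two most confident detections are sent.
chip = bytes(8_000)
dets = [Detection(0.92, 70.10, 19.20, chip),
        Detection(0.40, 70.22, 19.31, chip),
        Detection(0.75, 70.31, 19.44, chip)]
for d in prioritise_downlink(dets, budget_bytes=16_000):
    print(f"downlink object at ({d.lat:.2f}, {d.lon:.2f}), confidence {d.confidence:.2f}")
```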
NORCE has set up a dedicated ground segment (Cryorack) for communicating with the drones using Cryocore. A flexible solution for rolling out multiple deployments has been developed using the Red Hat Ansible automation platform. During 2022 and 2023, further testing and development was done. Field testing was also done using simpler gimbal systems as payloads, testing the use of off-the-shelf payloads instead of the custom-developed payload. This allows the hardware and software development to be separated, so that the software can accept data from multiple payloads. It also enables the use of the processing pipeline and anomaly detection on more platforms, such as other UAS and some USVs.
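The anomaly detection work mentioned above builds on the observation that open-water imagery is highly uniform, so floating objects can be flagged as statistical outliers against the sea background. The following is a minimal sketch of that idea, assuming a grey-scale frame tiled into fixed-size patches; the tile size, threshold and NumPy-based statistics are illustrative choices, not the project's actual method.

```python
import numpy as np

def anomalous_tiles(image, tile=32, z_threshold=4.0):
    """Flag tiles whose mean brightness deviates strongly from the sea background.

    image: 2-D float array (grey-scale frame); tile: tile size in pixels.
    Returns a list of (row, col) tile indices that look anomalous.
    """
    h, w = image.shape
    rows, cols = h // tile, w // tile
    # Mean brightness per tile.
    means = image[:rows * tile, :cols * tile].reshape(rows, tile, cols, tile).mean(axis=(1, 3))
    # Robust background statistics over all tiles: median and MAD.
    med = np.median(means)
    mad = np.median(np.abs(means - med)) + 1e-6
    z = np.abs(means - med) / (1.4826 * mad)
    return [(int(r), int(c)) for r, c in zip(*np.nonzero(z > z_threshold))]

# Example: a synthetic sea frame with one small bright object stands out as a single tile.
rng = np.random.default_rng(0)
frame = rng.normal(0.3, 0.02, size=(256, 256))
frame[100:110, 180:190] += 0.5            # small bright floating object
print(anomalous_tiles(frame))              # -> e.g. [(3, 5)]
```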

The project has contributed to researching technologies that tie together sensor technology, machine learning and data transfer, making it possible to develop future systems for situational awareness for flying and floating vessels. Knowledge from the project has already been put to use in the further development of products in the Maritime Robotics portfolio.

This project will create a sensor and data acquisition system for Uncrewed Aircraft Systems (UAS) that gives the ability to find and identify small floating objects on the sea surface. The system will do systematic search, data capture and automatic data analysis from UAS in harsh Northern weather conditions. The system is primarily motivated by an industrial need from project partner PGS, but a system that is able to find small objects floating on the sea surface will have a wide range of market opportunities in applications such as search and rescue, ship surveillance, military, oil-spill detection and ice-management operations in the high north. Autonomous systems are outlined as a decisive capacity for monitoring and controlling the ocean space. From a flying drone, it is typically floating or sailing objects in the ocean that are of interest. In the search phase we would like the sensor to cover every area of the surface, but the big challenge is that almost all of the imagery is likely to contain nothing but sea. For a 24-hour surveillance service, manually sifting through such image material is an inhumane task. In practice, low-bandwidth SatCom at high prices (e.g., Iridium) will also prevent transmission of raw sensor data. We therefore need an autonomous sensor system that is economically sound to integrate on UAS, that itself detects specific objects, and that prioritizes what should be sent back to the operator station. In areas where communication bandwidth is limited, the UAS could benefit from a higher level of autonomy, and this sensor system could be used to guide the UAS into, for example, a circling pattern over a sea-surface object of interest.
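To make the last point concrete, the short sketch below illustrates one way detection output could be turned into a circling (loiter) pattern: waypoints placed on a circle around a detected object's position. The flat-earth offset approximation, loiter radius and waypoint count are assumptions for illustration and are not taken from the project.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius, metres

def loiter_waypoints(obj_lat, obj_lon, radius_m=300.0, n_points=8):
    """Generate waypoints on a circle of radius_m around a detected object.

    Uses a simple local flat-earth approximation, which is adequate for
    loiter radii of a few hundred metres."""
    waypoints = []
    for k in range(n_points):
        bearing = 2.0 * math.pi * k / n_points
        d_north = radius_m * math.cos(bearing)
        d_east = radius_m * math.sin(bearing)
        lat = obj_lat + math.degrees(d_north / EARTH_RADIUS_M)
        lon = obj_lon + math.degrees(d_east / (EARTH_RADIUS_M * math.cos(math.radians(obj_lat))))
        waypoints.append((lat, lon))
    return waypoints

# Example: an 8-point, 300 m loiter circle around an object detected at 70.10 N, 19.20 E.
for lat, lon in loiter_waypoints(70.10, 19.20):
    print(f"{lat:.5f}, {lon:.5f}")
```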

Funding scheme:

MAROFF-2-Maritim virksomhet og offsh-2