You are required to read and agree to the following before accessing a full-text version of an article in the IDE article repository.

The full-text document you are about to access is subject to national and international copyright laws. In most cases (but not necessarily all), personal use is allowed, provided that the copyright owner is duly acknowledged and respected. All other use typically requires explicit permission (often in writing) from the copyright owner.

For the reports in this repository, we specifically note that:

  • the use of articles under IEEE copyright is governed by the IEEE copyright policy (available at http://www.ieee.org/web/publications/rights/copyrightpolicy.html)
  • the use of articles under ACM copyright is governed by the ACM copyright policy (available at http://www.acm.org/pubs/copyright_policy/)
  • technical reports and other articles issued by Mälardalen University are free for personal use. For other use, the explicit consent of the authors is required
  • in other cases, please contact the copyright owner for detailed information

By accepting I agree to acknowledge and respect the rights of the copyright owner of the document I am about to access.

If you are in doubt, feel free to contact webmaster@ide.mdh.se

Enhancing Drone Surveillance with NeRF: Real-World Applications and Simulated Environments


Authors:

Joakim Lindén, Giovanni Burresi, Håkan Forsberg, Masoud Daneshtalab, Ingemar Söderquist

Publication Type:

Conference/Workshop Paper

Venue:

43rd Digital Avionics Systems Conference (DASC)


Abstract

Machine Learning (ML) systems require representative and diverse datasets to accurately learn the objective task. In supervised learning, data needs to be accurately annotated, which is an expensive and error-prone process. We present a method for generating synthetic data tailored to the use case, achieving excellent performance in a real-world application. We provide a method for producing automatically annotated synthetic visual data of multirotor unmanned aerial vehicles (UAVs) and other airborne objects in a simulated environment with a high degree of scene diversity, from the collection of 3D models to the generation of annotated synthetic datasets (synthsets). In our data generation framework SynRender, we introduce a novel method of using Neural Radiance Fields (NeRF) to capture photorealistic, high-fidelity 3D models of multirotor UAVs in order to automate data generation for an object detection task in diverse environments. By producing data tailored to the real-world setting, our NeRF-derived results show an advantage over methods based on generic 3D asset collections, where the domain gap between the simulated and real-world domains is unacceptably large. In the spirit of keeping research open and accessible to the research community, we release the VISER DroneDiversity dataset used in this project, in which visual images, annotated bounding boxes, instance segmentation masks and depth maps are all generated for each image sample.
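As a concrete illustration of what "all annotation types per image sample" implies for downstream users, the sketch below shows how one such sample might be loaded. The file names (rgb.png, annotations.json, instances.png, depth.png), the box format and the depth encoding are assumptions made for illustration only; the actual layout of VISER DroneDiversity is not documented on this page.

# Minimal sketch of consuming one synthetic sample. Assumes (hypothetically) that
# each sample ships as an RGB image, a JSON annotation file with bounding boxes,
# an instance-segmentation PNG and a 16-bit depth PNG; the real dataset layout
# may differ.
import json
from pathlib import Path

import numpy as np
from PIL import Image


def load_sample(sample_dir: Path):
    """Load image, annotations, instance masks and depth map for one sample."""
    image = np.array(Image.open(sample_dir / "rgb.png").convert("RGB"))

    # Bounding boxes assumed as [x, y, width, height] in pixel coordinates,
    # with a class label per object (e.g. "multirotor_uav").
    with open(sample_dir / "annotations.json") as f:
        annotations = json.load(f)

    # Instance segmentation: one integer id per pixel (0 = background).
    instances = np.array(Image.open(sample_dir / "instances.png"))

    # Depth map: assumed 16-bit PNG encoding millimetres, converted to metres.
    depth_mm = np.array(Image.open(sample_dir / "depth.png"), dtype=np.uint16)
    depth_m = depth_mm.astype(np.float32) / 1000.0

    return image, annotations, instances, depth_m


if __name__ == "__main__":
    img, ann, inst, depth = load_sample(Path("DroneDiversity/sample_000000"))
    print(img.shape, len(ann.get("objects", [])), int(inst.max()), float(depth.max()))

The point of such a per-sample bundle is that detection, instance segmentation and depth estimation tasks can all be trained from the same synthetically rendered frames without any manual labelling.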

Bibtex

@inproceedings{Linden6972,
  author    = {Joakim Lind{\'e}n and Giovanni Burresi and H{\aa}kan Forsberg and Masoud Daneshtalab and Ingemar S{\"o}derquist},
  title     = {Enhancing Drone Surveillance with NeRF: Real-World Applications and Simulated Environments},
  month     = {October},
  year      = {2024},
  booktitle = {43rd Digital Avionics Systems Conference (DASC)},
  url       = {http://www.es.mdu.se/publications/6972-}
}