The GRIFFIN Perception Dataset: Bridging the Gap Between Flapping-Wing Flight and Robotic Perception

01/25/2021
by   J. P. Rodríguez-Gómez, et al.

The development of automatic perception systems and techniques for bio-inspired flapping-wing robots is severely hampered by the high technical complexity of these platforms and of installing onboard sensors and electronics. In addition, flapping-wing robot perception suffers from high vibration levels and abrupt movements during flight, which cause motion blur and strong changes in lighting conditions. This paper presents a perception dataset for bird-scale flapping-wing robots as a tool to help alleviate these problems. The data include measurements from onboard sensors widely used in aerial robotics and well suited to the perception challenges of flapping-wing robots, such as an event camera, a conventional camera, and two Inertial Measurement Units (IMUs), as well as ground truth measurements from a laser tracker or a motion capture system. A total of 21 datasets covering different types of flights were collected in three different scenarios (one indoor and two outdoor). To the best of the authors' knowledge, this is the first dataset for flapping-wing robot perception.
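The abstract does not describe a loading API for the dataset, but working with a multi-sensor suite like this one (event camera, conventional camera, two IMUs, and laser-tracker or motion-capture ground truth) always requires associating streams recorded at different rates by timestamp. The sketch below illustrates one common approach, pairing each camera frame with the nearest IMU sample in time; the function name, data layout, and sample rates are hypothetical, not taken from the GRIFFIN dataset.

```python
import numpy as np

def nearest_imu(frame_ts, imu_ts):
    """Return, for each camera-frame timestamp, the index of the
    nearest IMU sample in time.

    frame_ts, imu_ts: 1-D sorted arrays of timestamps in seconds.
    Hypothetical helper: the GRIFFIN dataset's actual file layout is
    not specified here, so this only illustrates timestamp association.
    """
    idx = np.searchsorted(imu_ts, frame_ts)      # right-hand neighbour
    idx = np.clip(idx, 1, len(imu_ts) - 1)       # stay inside the array
    left, right = imu_ts[idx - 1], imu_ts[idx]
    # Pick whichever neighbour is closer in time to the frame.
    choose_left = (frame_ts - left) < (right - frame_ts)
    return np.where(choose_left, idx - 1, idx)

# Example: a 30 Hz camera stream against a 200 Hz IMU stream.
frames = np.arange(0.0, 1.0, 1 / 30)
imu = np.arange(0.0, 1.0, 1 / 200)
pairs = nearest_imu(frames, imu)
```

With a nearest-neighbour match at these rates, every frame is paired with an IMU sample at most half an IMU period (2.5 ms) away; tighter fusion (e.g. IMU pre-integration between frames) would use all samples in the interval rather than one.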


Related research

10/03/2018
The Blackbird Dataset: A large-scale dataset for UAV perception in aggressive flight
The Blackbird unmanned aerial vehicle (UAV) dataset is a large-scale, ag...

09/11/2023
A Comparison between Frame-based and Event-based Cameras for Flapping-Wing Robot Perception
Perception systems for ornithopters face severe challenges. The harsh vi...

07/15/2021
CMU-GPR Dataset: Ground Penetrating Radar Dataset for Robot Localization and Mapping
There has been exciting recent progress in using radar as a sensor for r...

02/01/2022
MoCap-less Quantitative Evaluation of Ego-Pose Estimation Without Ground Truth Measurements
The emergence of data-driven approaches for control and planning in robo...

04/04/2023
USTC FLICAR: A Multisensor Fusion Dataset of LiDAR-Inertial-Camera for Heavy-duty Autonomous Aerial Work Robots
In this paper, we present the USTC FLICAR Dataset, which is dedicated to...

02/01/2022
NTU VIRAL: A Visual-Inertial-Ranging-Lidar Dataset, From an Aerial Vehicle Viewpoint
In recent years, autonomous robots have become ubiquitous in research an...

02/22/2019
Acting Is Seeing: Navigating Tight Space Using Flapping Wings
Wings of flying animals not only can generate lift and control torque bu...
