Low-viewpoint forest depth dataset for sparse rover swarms

03/09/2020
by Chaoyue Niu, et al.

Rapid progress in embedded computing hardware increasingly enables on-board image processing on small robots. This development opens the path to replacing costly sensors with sophisticated computer vision techniques. A case in point is the prediction of scene depth information from a monocular camera for autonomous navigation. Motivated by the aim to develop a robot swarm suitable for sensing, monitoring, and search applications in forests, we have collected a set of RGB images and corresponding depth maps. Over 100k images were recorded with a custom rig from the perspective of a small ground rover moving through a forest. Taken under different weather and lighting conditions, the images include scenes with grass, bushes, standing and fallen trees, tree branches, leaves, and dirt. In addition, GPS, IMU, and wheel encoder data were recorded. From the calibrated, synchronized, aligned, and timestamped frames, about 9,700 image-depth map pairs were selected for sharpness and variety. We provide this dataset to the community to fill a need identified in our own research and hope it will accelerate progress in robots navigating the challenging forest environment. This paper describes our custom hardware and methodology to collect the data, the subsequent processing and quality of the data, and how to access it.

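The curation step described above selects sharp image-depth pairs from the calibrated, synchronized frames. The sketch below is only an illustration of how such a selection pass might look, not the authors' actual pipeline: the directory layout, file naming, and threshold are assumptions, and the variance of the Laplacian stands in for whatever sharpness criterion the dataset curation used.

# Minimal sketch of a sharpness-based selection pass over aligned RGB/depth pairs.
# Directory layout, file naming, and the threshold value are illustrative assumptions.
from pathlib import Path

import cv2
import numpy as np


def laplacian_sharpness(gray: np.ndarray) -> float:
    """Variance of the Laplacian: higher values indicate a sharper image."""
    return float(cv2.Laplacian(gray, cv2.CV_64F).var())


def select_sharp_pairs(rgb_dir: Path, depth_dir: Path, threshold: float = 100.0):
    """Yield (rgb_path, depth_path) pairs whose RGB frame passes the sharpness test.

    Assumes each RGB frame <stem>.png has a depth map with the same file name
    in depth_dir (a hypothetical layout, not the dataset's documented one).
    """
    for rgb_path in sorted(rgb_dir.glob("*.png")):
        depth_path = depth_dir / rgb_path.name
        if not depth_path.exists():
            continue  # skip frames without a matching depth map
        gray = cv2.imread(str(rgb_path), cv2.IMREAD_GRAYSCALE)
        if gray is None:
            continue  # unreadable or corrupt frame
        if laplacian_sharpness(gray) >= threshold:
            yield rgb_path, depth_path


if __name__ == "__main__":
    pairs = list(select_sharp_pairs(Path("rgb"), Path("depth")))
    print(f"kept {len(pairs)} image-depth pairs")

In practice the threshold would be tuned per camera and lighting condition; the point of the sketch is simply that blur filtering on the RGB frame, combined with a strict pairing check against the depth maps, yields the kind of curated subset the paper describes.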

