Towards a Robust Aerial Cinematography Platform: Localizing and Tracking Moving Targets in Unstructured Environments
The use of drones for aerial cinematography has revolutionized several applications and industries that require live, dynamic camera viewpoints, such as entertainment, sports, and security. However, safely controlling a drone while filming a moving target usually requires multiple expert human operators; hence the need for an autonomous cinematographer. Current approaches have severe real-life limitations: they require scripted scenes that can be solved offline, high-precision motion-capture systems or GPS tags to localize targets, and prior maps of the environment to avoid obstacles and reason about occlusion. In this work, we overcome these limitations and propose a complete system for aerial cinematography that combines: (1) a visual pose estimation algorithm for target localization; (2) a real-time incremental 3D signed-distance map algorithm for occlusion and safety computation; and (3) a real-time camera motion planner that jointly optimizes for smoothness, collision avoidance, occlusion avoidance, and artistic guidelines. We evaluate robustness and real-time performance in a series of field experiments and simulations by tracking dynamic targets moving through unknown, unstructured environments. Finally, we verify that despite removing previous limitations, our system still matches state-of-the-art performance. Videos of the system in action can be seen at https://youtu.be/ZE9MnCVmumc
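To make the planner's objective concrete, the sketch below illustrates the kind of trajectory cost described in the abstract: a smoothness term, a collision term evaluated against a signed-distance field, an occlusion term along the camera-to-actor line of sight, and an artistic shot prior. This is a minimal illustration only, not the authors' implementation; all function names, weights, and the toy spherical-obstacle SDF are assumptions.

```python
# Hypothetical sketch of a cinematography trajectory cost (not the paper's code).
import numpy as np

def smoothness_cost(traj):
    """Penalize large second differences (jerky camera motion)."""
    accel = traj[2:] - 2.0 * traj[1:-1] + traj[:-2]
    return float(np.sum(accel ** 2))

def collision_cost(traj, sdf, margin=1.0):
    """Penalize waypoints whose signed distance to obstacles falls below a margin."""
    d = np.array([sdf(p) for p in traj])
    return float(np.sum(np.maximum(margin - d, 0.0) ** 2))

def occlusion_cost(traj, target, sdf, samples=10):
    """Penalize obstacles along the camera-to-target line of sight."""
    cost = 0.0
    for p in traj:
        for s in np.linspace(0.0, 1.0, samples):
            q = (1.0 - s) * p + s * target
            cost += max(1.0 - sdf(q), 0.0)
    return cost

def artistic_cost(traj, target, desired_dist=5.0):
    """Penalize deviation from a desired shot distance to the actor."""
    d = np.linalg.norm(traj - target, axis=1)
    return float(np.sum((d - desired_dist) ** 2))

def total_cost(traj, target, sdf, w=(1.0, 10.0, 5.0, 1.0)):
    """Weighted sum of the four terms; weights here are arbitrary placeholders."""
    return (w[0] * smoothness_cost(traj)
            + w[1] * collision_cost(traj, sdf)
            + w[2] * occlusion_cost(traj, target, sdf)
            + w[3] * artistic_cost(traj, target))

if __name__ == "__main__":
    # Toy example: a straight-line camera path, a static target, and a single
    # spherical obstacle standing in for the incremental 3D signed-distance map.
    obstacle, radius = np.array([2.0, 2.0, 1.0]), 1.0
    sdf = lambda p: np.linalg.norm(p - obstacle) - radius
    traj = np.linspace([0.0, 0.0, 2.0], [4.0, 4.0, 2.0], 12)
    target = np.array([4.0, 0.0, 0.0])
    print("total cost:", total_cost(traj, target, sdf))
```

In a real planner this cost would be minimized online over a receding horizon as the signed-distance map is updated incrementally, rather than evaluated once as in this toy example.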