SharinGAN: Combining Synthetic and Real Data for Unsupervised Geometry Estimation

06/07/2020
by   Koutilya PNVR, et al.

We propose a novel method for combining synthetic and real images when training networks to estimate geometric information from a single image. Both image types are mapped into a single, shared domain by a network that is connected to a primary task network and trained end to end. Ideally, images from the two domains then present the same information to the primary network. Our experiments demonstrate significant improvements over the state of the art on two important tasks, surface-normal estimation for human faces and monocular depth estimation for outdoor scenes, both in an unsupervised setting.
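The shared-domain idea described above can be sketched in a few lines: a single generator maps both synthetic and real images into one domain, and a primary task network (here, monocular depth) consumes the generator's output, so the task loss shapes the shared domain end to end. This is a minimal illustrative sketch; the module names, layer sizes, and residual design are assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class SharedDomainGenerator(nn.Module):
    """Maps an image (synthetic or real) into a shared domain.

    Hypothetical sketch: a small residual CNN keeps the shared-domain
    image close to its input.
    """
    def __init__(self, channels: int = 3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, channels, 3, padding=1),
        )

    def forward(self, x):
        return x + self.net(x)  # residual mapping

class DepthNet(nn.Module):
    """Primary task network: predicts a one-channel depth map."""
    def __init__(self, channels: int = 3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

G = SharedDomainGenerator()
T = DepthNet()

synthetic = torch.randn(2, 3, 64, 64)  # synthetic batch (has ground truth)
real = torch.randn(2, 3, 64, 64)       # real batch (no ground truth)

# Both domains pass through the SAME generator before the primary network,
# so gradients from the task loss flow back and shape the shared domain.
depth_syn = T(G(synthetic))
depth_real = T(G(real))
```

In a full training loop, a supervised loss on `depth_syn` (using synthetic ground truth) would be combined with adversarial and reconstruction terms that encourage the two mapped distributions to coincide.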

Related research

- Geometry-Aware Symmetric Domain Adaptation for Monocular Depth Estimation (04/03/2019): Supervised depth estimation has achieved high accuracy due to the advanc...
- T2Net: Synthetic-to-Realistic Translation for Solving Single-Image Depth Estimation Tasks (08/04/2018): Current methods for single-image depth estimation use training datasets ...
- 4K-HAZE: A Dehazing Benchmark with 4K Resolution Hazy and Haze-Free Images (03/28/2023): Currently, mobile and IoT devices are in dire need of a series of method...
- Self-Supervised Learning of Domain Invariant Features for Depth Estimation (06/04/2021): We tackle the problem of unsupervised synthetic-to-realistic domain adap...
- Adaptive Surface Normal Constraint for Depth Estimation (03/29/2021): We present a novel method for single image depth estimation using surfac...
- 3D Hand Pose Estimation using Simulation and Partial-Supervision with a Shared Latent Space (07/14/2018): Tremendous amounts of expensive annotated data are a vital ingredient fo...
- Learning Good Features to Transfer Across Tasks and Domains (01/26/2023): Availability of labelled data is the major obstacle to the deployment of...
