Predictive Inequity in Object Detection

02/21/2019
by Benjamin Wilson et al.

In this work, we investigate whether state-of-the-art object detection systems have equitable predictive performance on pedestrians with different skin tones. This work is motivated by many recent examples of ML and vision systems displaying higher error rates for certain demographic groups than others. We annotate pedestrians in an existing large-scale dataset, BDD100K, with Fitzpatrick skin tones in the ranges [1-3] or [4-6]. We then provide an in-depth comparative analysis of performance between these two skin-tone groupings, finding that standard detection models exhibit higher error rates on pedestrians in the [4-6] range, and that neither time of day nor occlusion explains this gap, suggesting the disparity is not merely the result of pedestrians in the [4-6] range appearing in scenes that are more difficult for detection. Finally, we investigate to what extent time of day, occlusion, and reweighting the supervised loss during training affect this predictive bias.
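To make the per-group comparison concrete, below is a minimal sketch (not the authors' evaluation code) of the kind of analysis the abstract describes: ground-truth pedestrian boxes are partitioned by their annotated Fitzpatrick grouping, and detections are scored against each partition separately. The IoU threshold, greedy matching, and recall metric are common conventions assumed here for illustration, not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class Box:
    x1: float
    y1: float
    x2: float
    y2: float

def area(b: Box) -> float:
    return (b.x2 - b.x1) * (b.y2 - b.y1)

def iou(a: Box, b: Box) -> float:
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a.x1, b.x1), max(a.y1, b.y1)
    ix2, iy2 = min(a.x2, b.x2), min(a.y2, b.y2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = area(a) + area(b) - inter
    return inter / union if union > 0 else 0.0

def recall_for_group(gt_boxes, det_boxes, iou_thresh=0.5):
    """Fraction of ground-truth boxes greedily matched by a detection."""
    matched, remaining = 0, list(det_boxes)
    for gt in gt_boxes:
        hit = next((d for d in remaining if iou(gt, d) >= iou_thresh), None)
        if hit is not None:
            matched += 1
            remaining.remove(hit)
    return matched / len(gt_boxes) if gt_boxes else float("nan")

# Toy data: ground truth partitioned by annotated Fitzpatrick grouping.
gt_by_group = {
    "fitzpatrick_1_3": [Box(10, 10, 50, 120), Box(200, 30, 240, 150)],
    "fitzpatrick_4_6": [Box(300, 20, 340, 140)],
}
detections = [Box(12, 12, 49, 118), Box(205, 28, 238, 149)]

for group, gts in gt_by_group.items():
    print(group, recall_for_group(gts, detections))
```

A gap between the two printed recall values, after controlling for confounders such as time of day and occlusion, is the kind of disparity the paper measures.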
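The loss-reweighting intervention mentioned at the end of the abstract can be sketched in the same hedged spirit. The `GROUP_WEIGHTS` values and the `reweighted_loss` helper below are hypothetical, assuming a detector that exposes a per-example loss; they are not the paper's implementation.

```python
import torch

# Hypothetical per-group weights; upweighting the under-performing group
# is one standard reweighting scheme. These values are illustrative only.
GROUP_WEIGHTS = {"fitzpatrick_1_3": 1.0, "fitzpatrick_4_6": 2.0}

def reweighted_loss(per_example_losses: torch.Tensor,
                    groups: list[str]) -> torch.Tensor:
    """Scale each example's detection loss by its skin-tone group weight."""
    weights = torch.tensor([GROUP_WEIGHTS[g] for g in groups],
                           dtype=per_example_losses.dtype,
                           device=per_example_losses.device)
    return (weights * per_example_losses).mean()

# Example: four per-example losses from any detector, with group labels.
losses = torch.tensor([0.8, 1.1, 0.6, 0.9])
groups = ["fitzpatrick_1_3", "fitzpatrick_4_6",
          "fitzpatrick_4_6", "fitzpatrick_1_3"]
print(reweighted_loss(losses, groups))  # weighted mean of the batch losses
```

Upweighting the group with higher error trades a little aggregate loss for reduced inter-group disparity, which is the trade-off the paper's reweighting experiments probe.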

Related research

06/07/2023 · ICON^2: Reliably Benchmarking Predictive Inequity in Object Detection
As computer vision systems are being increasingly deployed at scale in h...

02/06/2022 · Detecting Melanoma Fairly: Skin Tone Detection and Debiasing for Skin Lesion Classification
Convolutional Neural Networks have demonstrated human-level performance ...

05/24/2019 · Uncertainty Estimation in One-Stage Object Detection
Environment perception is the task for intelligent vehicles on which all...

01/21/2021 · Occlusion Handling in Generic Object Detection: A Review
The significant power of deep learning networks has led to enormous deve...

04/16/2023 · Handling Heavy Occlusion in Dense Crowd Tracking by Focusing on the Heads
With the rapid development of deep learning, object detection and tracki...

03/11/2021 · Privacy-preserving Object Detection
Privacy considerations and bias in datasets are quickly becoming high-pr...
