De-biasing facial detection system using VAE

04/16/2022
by Vedant V. Kandge, et al.

Bias in AI/ML-based systems is a ubiquitous problem, and biased systems can negatively impact society. A system can become biased for many reasons: the bias may stem from the algorithm itself, or from the training dataset when some features are over-represented in it. In facial detection systems, dataset bias is the most common cause. Models often learn only the features that are over-represented in the data and ignore rare ones, which makes them biased toward the over-represented features. Deployed in the real world, such biased systems are dangerous to society. The proposed approach uses generative models, which are well suited to learning the underlying features (latent variables) of a dataset, and leverages these learned features to reduce the threats that bias in the system poses. With the help of this algorithm, the bias present in the dataset can be removed. We then train models on the two datasets and compare the results.
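As a rough illustration of how latent variables learned by a VAE can drive de-biasing, the sketch below shows one common scheme: estimate the latent-space density of each face image and up-weight rare samples when forming training batches. The encoder architecture, latent size, histogram binning, and smoothing constant are illustrative assumptions, not details taken from the paper.

# A minimal sketch of latent-density-based resampling for de-biasing.
# All hyperparameters and the network itself are assumptions for illustration.
import numpy as np
import torch
import torch.nn as nn


class Encoder(nn.Module):
    """Toy convolutional encoder returning the mean of q(z|x)."""

    def __init__(self, latent_dim: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Flatten(),
        )
        self.fc_mu = nn.LazyLinear(latent_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.fc_mu(self.net(x))


def resampling_weights(mu: np.ndarray, bins: int = 10, alpha: float = 1e-3) -> np.ndarray:
    """Up-weight samples that fall in sparse regions of the latent space.

    For each latent dimension, a histogram approximates the marginal density.
    A sample's overall density is the product over dimensions, and its
    sampling weight is inversely proportional to that density.
    """
    n, d = mu.shape
    density = np.ones(n)
    for j in range(d):
        hist, edges = np.histogram(mu[:, j], bins=bins, density=True)
        idx = np.clip(np.digitize(mu[:, j], edges[1:-1]), 0, bins - 1)
        density *= hist[idx] + alpha  # alpha avoids zero-density bins
    weights = 1.0 / density
    return weights / weights.sum()


if __name__ == "__main__":
    encoder = Encoder()
    images = torch.rand(256, 3, 64, 64)  # stand-in for face crops
    with torch.no_grad():
        mu = encoder(images).numpy()
    w = resampling_weights(mu)
    # Faces with rare latent features are now drawn more often per batch.
    batch_idx = np.random.choice(len(w), size=32, p=w)

In practice the weights would be recomputed periodically as the encoder improves, so the resampling tracks the current latent representation rather than a stale one.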
