Evaluating Fairness in the Presence of Spatial Autocorrelation

01/05/2021
by Cheryl Flynn, et al.

Fairness considerations for spatial data are often confounded by the underlying spatial autocorrelation. We propose a hypothesis-testing methodology to detect the presence and strength of this effect, and then mitigate it using a spatial filtering-based approach, in order to enable the application of existing bias detection metrics.
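The abstract does not specify the exact test or filter, so the sketch below assumes two standard choices: a Moran's I permutation test to detect spatial autocorrelation, and a Griffith-style eigenvector spatial filter to remove it. These are illustrative substitutes, not necessarily the authors' method; the data, variable names (coords, errors, k), and helper functions are hypothetical.

```python
# Minimal sketch (not the paper's implementation): test per-location model
# errors for spatial autocorrelation, then filter out the dominant spatial
# pattern so ordinary (non-spatial) bias metrics can be applied afterwards.
import numpy as np

def knn_weights(coords, k=8):
    """Row-standardized k-nearest-neighbor spatial weight matrix."""
    n = len(coords)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                        # exclude self-links
    W = np.zeros((n, n))
    W[np.arange(n)[:, None], np.argsort(d, axis=1)[:, :k]] = 1.0
    return W / W.sum(axis=1, keepdims=True)

def morans_i(x, W):
    """Moran's I statistic: (n / sum of weights) * (z'Wz) / (z'z)."""
    z = x - x.mean()
    return (len(x) / W.sum()) * (z @ W @ z) / (z @ z)

def moran_permutation_test(x, W, n_perm=999, seed=0):
    """One-sided permutation p-value for positive spatial autocorrelation."""
    rng = np.random.default_rng(seed)
    observed = morans_i(x, W)
    null = np.array([morans_i(rng.permutation(x), W) for _ in range(n_perm)])
    return observed, (1 + np.sum(null >= observed)) / (n_perm + 1)

def eigenvector_filter(x, W, n_vectors=10):
    """Griffith-style filtering: regress out the leading eigenvectors of the
    doubly-centered weight matrix, which carry the positive spatial pattern."""
    n = len(x)
    M = np.eye(n) - np.ones((n, n)) / n                # centering projector
    eigvals, eigvecs = np.linalg.eigh(M @ ((W + W.T) / 2) @ M)
    E = eigvecs[:, np.argsort(eigvals)[::-1][:n_vectors]]
    coef, *_ = np.linalg.lstsq(np.column_stack([np.ones(n), E]), x, rcond=None)
    return x - E @ coef[1:]                            # spatially filtered values

# Synthetic example: errors that trend with longitude are spatially clustered.
rng = np.random.default_rng(1)
coords = rng.uniform(size=(200, 2))
errors = coords[:, 0] + 0.1 * rng.normal(size=200)
W = knn_weights(coords, k=8)
I_raw, p_raw = moran_permutation_test(errors, W)
filtered = eigenvector_filter(errors, W)
I_filt, p_filt = moran_permutation_test(filtered, W)
print(f"before filtering: I={I_raw:.3f}, p={p_raw:.3f}")
print(f"after filtering:  I={I_filt:.3f}, p={p_filt:.3f}")
```

Under these assumptions, group-level error comparisons (e.g., disparate error rates across protected groups) would then be computed on the filtered values, so spatial clustering does not inflate the apparent disparity.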

Related research

Evaluating Fairness Metrics in the Presence of Dataset Bias (09/24/2018)
Data-driven algorithms play a large role in decision making across a var...

Evaluating Fairness Using Permutation Tests (07/10/2020)
Machine learning models are central to people's lives and impact society...

What Is Fairness? Implications For FairML (05/19/2022)
A growing body of literature in fairness-aware ML (fairML) aspires to mi...

New Fairness Metrics for Recommendation that Embrace Differences (06/29/2017)
We study fairness in collaborative-filtering recommender systems, which ...

Measure Twice, Cut Once: Quantifying Bias and Fairness in Deep Neural Networks (10/08/2021)
Algorithmic bias is of increasing concern, both to the research communit...

Beyond Parity: Fairness Objectives for Collaborative Filtering (05/24/2017)
We study fairness in collaborative-filtering recommender systems, which ...

A nonparametric spatial test to identify factors that shape a microbiome (06/16/2018)
The advent of high-throughput sequencing technologies has made data from...