Correct-N-Contrast: A Contrastive Approach for Improving Robustness to Spurious Correlations

03/03/2022 · by Michael Zhang, et al.
Spurious correlations pose a major challenge for robust machine learning. Models trained with empirical risk minimization (ERM) may learn to rely on correlations between class labels and spurious attributes, leading to poor performance on data groups where these correlations do not hold. This is particularly challenging to address when spurious attribute labels are unavailable. To improve worst-group performance on spuriously correlated data without training attribute labels, we propose Correct-N-Contrast (CNC), a contrastive approach that directly learns representations robust to spurious correlations. Because ERM models can be good spurious attribute predictors, CNC works by (1) using a trained ERM model's outputs to identify samples with the same class but dissimilar spurious features, and (2) training a robust model with contrastive learning to learn similar representations for same-class samples. To support CNC, we introduce new connections between worst-group error and a representation alignment loss that CNC aims to minimize. We empirically observe that worst-group error closely tracks alignment loss, and prove that the alignment loss over a class helps upper-bound the gap between that class's worst-group and average error. On popular benchmarks, CNC reduces alignment loss drastically and achieves state-of-the-art worst-group accuracy with a 3.6% absolute lift. CNC is also competitive with oracle methods that require group labels.
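The two steps of CNC described above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it assumes the ERM model's hard predictions serve as a proxy for the spurious attribute, samples a positive (same class, different ERM prediction, so likely a different spurious feature) and a negative (different class, same ERM prediction) for each anchor, and computes the alignment loss on normalized representations of a same-class cross-group pair. Function names and signatures here are hypothetical.

```python
import numpy as np

def sample_contrastive_pair(labels, erm_preds, anchor_idx, rng):
    """Pick a positive and a negative for the anchor, using ERM
    predictions as a stand-in for unobserved spurious attributes.

    Positive: same class, different ERM prediction (likely a
    different spurious feature). Negative: different class, same
    ERM prediction (likely the same spurious feature)."""
    y, p = labels[anchor_idx], erm_preds[anchor_idx]
    pos_pool = np.where((labels == y) & (erm_preds != p))[0]
    neg_pool = np.where((labels != y) & (erm_preds == p))[0]
    pos = int(rng.choice(pos_pool)) if len(pos_pool) else None
    neg = int(rng.choice(neg_pool)) if len(neg_pool) else None
    return pos, neg

def alignment_loss(z_anchor, z_pos):
    """Squared L2 distance between L2-normalized representations of a
    same-class, cross-group pair -- the quantity CNC aims to shrink."""
    a = z_anchor / np.linalg.norm(z_anchor)
    b = z_pos / np.linalg.norm(z_pos)
    return float(np.sum((a - b) ** 2))
```

In practice the sampled anchors, positives, and negatives would feed a supervised contrastive loss on the robust model's representations; this sketch only shows how ERM outputs select the pairs and what the alignment term measures.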


Related research:

- Avoiding spurious correlations via logit correction (12/02/2022): Empirical studies suggest that machine learning models trained with empi...
- Is Last Layer Re-Training Truly Sufficient for Robustness to Spurious Correlations? (08/01/2023): Models trained with empirical risk minimization (ERM) are known to learn...
- Spread Spurious Attribute: Improving Worst-group Accuracy with Spurious Attribute Estimation (04/05/2022): The paradigm of worst-group loss minimization has shown its promise in a...
- Towards Group Robustness in the presence of Partial Group Labels (01/10/2022): Learning invariant representations is an important requirement when trai...
- Contrastive Adapters for Foundation Model Group Robustness (07/14/2022): While large pretrained foundation models (FMs) have shown remarkable zer...
- Distributionally Robust Optimization with Probabilistic Group (03/10/2023): Modern machine learning models may be susceptible to learning spurious c...
- Perfectly Balanced: Improving Transfer and Robustness of Supervised Contrastive Learning (04/15/2022): An ideal learned representation should display transferability and robus...
