Class-Imbalanced Complementary-Label Learning via Weighted Loss

by Meng Wei et al.
China University of Mining and Technology

Complementary-label learning (CLL) is a common weakly supervised learning setting in which each training instance is annotated with a class it does not belong to. However, real-world datasets often present CLL with class-imbalanced training samples, where the number of samples in one class is significantly lower than in the others. Unfortunately, existing CLL approaches have not yet addressed class imbalance, which reduces prediction accuracy, especially on the minority classes. In this paper, we propose a novel problem setting that allows learning from class-imbalanced complementarily labeled samples for multi-class classification. To address this problem, we propose a new CLL approach called Weighted Complementary-Label Learning (WCLL). The proposed method models a weighted empirical risk minimization loss using the class-imbalanced complementarily labeled information, and it is also applicable to multi-class imbalanced training samples. Furthermore, we derive an estimation error bound for the proposed method to provide a theoretical guarantee. Finally, we conduct extensive experiments on widely used benchmark datasets to validate the superiority of our method over existing state-of-the-art methods.
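To make the idea concrete, the following is a minimal sketch of a weighted complementary-label loss. It uses a standard CLL surrogate, -log(1 - p_ybar(x)), penalizing probability mass placed on the complementary class, and scales each sample by a class-dependent weight (e.g., inverse class frequency) to counteract imbalance. The function name and the weighting scheme are illustrative assumptions, not the paper's exact WCLL formulation.

```python
import numpy as np

def softmax(z):
    # Numerically stable row-wise softmax.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def weighted_complementary_loss(logits, comp_labels, class_weights):
    """Sketch of a weighted complementary-label loss (hypothetical form).

    logits        : (n, K) raw model outputs
    comp_labels   : (n,) complementary labels ("instance is NOT this class")
    class_weights : (K,) per-class weights, e.g. inverse class frequency
    """
    p = softmax(logits)
    # Probability assigned to the complementary (forbidden) class.
    p_bar = p[np.arange(len(comp_labels)), comp_labels]
    # Surrogate loss: push p_bar toward 0; clip to avoid log(0).
    per_sample = -np.log(np.clip(1.0 - p_bar, 1e-12, None))
    # Reweight samples by the weight of their complementary class.
    return float(np.mean(class_weights[comp_labels] * per_sample))
```

With uniform weights this reduces to an unweighted complementary loss; increasing the weight of a minority class amplifies the penalty for assigning probability to it when it is the complementary label, which is one simple way to bias training toward imbalanced classes.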


