Angular Learning: Toward Discriminative Embedded Features

12/17/2019
by JT Wu, et al.

Margin-based softmax loss functions greatly enhance intra-class compactness and perform well on face recognition and object classification tasks. Their strong performance, however, depends on careful hyperparameter selection, and the hard angular restriction they impose increases the risk of overfitting. In this paper, we propose an angular loss that maximizes the angular gradient to promote intra-class compactness while avoiding overfitting. Moreover, our method has only one adjustable constant for controlling intra-class compactness. We define three metrics to measure inter-class separability and intra-class compactness. In experiments, we evaluate our method alongside other methods on several well-known datasets. The results show that our method offers improved accuracy, more discriminative features, and lower time consumption.
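For context, the margin-based softmax family the abstract contrasts against can be summarized by a small sketch. The following is a minimal, illustrative PyTorch implementation of a generic cosine-margin softmax head, not the paper's proposed angular loss: the scale s and margin m are exactly the kind of hyperparameters whose careful tuning the abstract criticizes, and all names and values here are assumptions for illustration only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MarginSoftmax(nn.Module):
    """Illustrative cosine-margin softmax head (not the paper's loss).

    Embeddings and class weights are L2-normalized, so the logits are cosines
    of the angles between features and class centers; a margin m is subtracted
    from the target-class cosine and the result is rescaled by s before
    cross-entropy.
    """
    def __init__(self, embed_dim, num_classes, s=30.0, m=0.35):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(num_classes, embed_dim))
        self.s = s  # scale factor (assumed value, needs tuning)
        self.m = m  # additive cosine margin (assumed value, needs tuning)

    def forward(self, features, labels):
        # Cosine similarity between normalized embeddings and class weights.
        cos = F.linear(F.normalize(features), F.normalize(self.weight))
        # Subtract the margin only from the ground-truth class logits.
        onehot = F.one_hot(labels, cos.size(1)).to(cos.dtype)
        logits = self.s * (cos - self.m * onehot)
        return F.cross_entropy(logits, labels)

# Usage sketch: embeddings would normally come from a backbone network.
head = MarginSoftmax(embed_dim=512, num_classes=10)
feats = torch.randn(4, 512)
labels = torch.randint(0, 10, (4,))
loss = head(feats, labels)
```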

