Class Mean Vectors, Self Monitoring and Self Learning for Neural Classifiers

10/22/2019
by Eugene Wong, et al.

In this paper we explore the role of the sample mean in building a neural network for classification. This role is surprisingly extensive and includes direct computation of weights without training, performance monitoring for samples without known classification, and self-training on unlabeled data. Experiments on the CIFAR-10 dataset provide promising empirical evidence for the efficacy of a simple and widely applicable approach to some difficult problems.
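The abstract names three uses of class mean vectors: weights computed directly from labeled data, a confidence signal for unlabeled samples, and a self-training loop. The paper's exact construction is not reproduced here, so the following is a minimal sketch assuming the idea reduces to a nearest-class-mean classifier over feature vectors; the function names, the margin-based confidence score, and the running-mean update are illustrative assumptions, not the authors' code.

```python
import numpy as np

def class_means(X, y, num_classes):
    # One mean vector per class, computed directly from labeled samples;
    # these prototypes act as classifier weights with no gradient training.
    return np.stack([X[y == c].mean(axis=0) for c in range(num_classes)])

def nearest_mean_predict(X, means):
    # Classify each sample by its nearest class mean (Euclidean distance).
    dists = np.linalg.norm(X[:, None, :] - means[None, :, :], axis=2)
    return dists.argmin(axis=1), dists

def margin_confidence(dists):
    # Monitoring signal for samples without known labels: the gap between
    # the two smallest class-mean distances (larger gap = more confident).
    s = np.sort(dists, axis=1)
    return s[:, 1] - s[:, 0]

def self_train_step(X_unlabeled, means, class_counts, margin=1.0):
    # Pseudo-label confident unlabeled samples and fold them into the
    # per-class running means (an assumed, simple self-training update).
    preds, dists = nearest_mean_predict(X_unlabeled, means)
    keep = margin_confidence(dists) > margin
    new_means, new_counts = means.copy(), class_counts.copy()
    for c in range(means.shape[0]):
        sel = X_unlabeled[keep & (preds == c)]
        if len(sel):
            total = new_counts[c] + len(sel)
            new_means[c] = (means[c] * new_counts[c] + sel.sum(axis=0)) / total
            new_counts[c] = total
    return new_means, new_counts

# Toy demo (a stand-in for CIFAR-10 feature vectors): three Gaussian blobs.
rng = np.random.default_rng(0)
centers = rng.normal(scale=5.0, size=(3, 8))
X = np.concatenate([c + rng.normal(size=(50, 8)) for c in centers])
y = np.repeat(np.arange(3), 50)

means = class_means(X, y, num_classes=3)
preds, _ = nearest_mean_predict(X, means)
print("labeled-set accuracy:", (preds == y).mean())

X_new = np.concatenate([c + rng.normal(size=(20, 8)) for c in centers])
counts = np.bincount(y, minlength=3).astype(float)
means, counts = self_train_step(X_new, means, counts, margin=1.0)
```

In practice the feature vectors would come from a network's penultimate layer rather than raw pixels, and the margin threshold trades pseudo-label coverage against label noise.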
