Learning Dependency Structures for Weak Supervision Models

03/14/2019
by Paroma Varma, et al.

Labeling training data is a key bottleneck in the modern machine learning pipeline. Recent weak supervision approaches combine labels from multiple noisy sources by estimating their accuracies without access to ground truth labels; however, estimating the dependencies among these sources is a critical challenge. We focus on a robust PCA-based algorithm for learning these dependency structures, establish improved theoretical recovery rates, and outperform existing methods on various real-world tasks. Under certain conditions, we show that the amount of unlabeled data needed can scale sublinearly or even logarithmically with the number of sources m, improving over previous efforts that ignore the sparsity pattern in the dependency structure and scale linearly in m. We provide an information-theoretic lower bound on the minimum sample complexity of the weak supervision setting. Our method outperforms weak supervision approaches that assume conditionally-independent sources by up to 4.64 F1 points and previous structure learning approaches by up to 4.41 F1 points on real-world relation extraction and image classification tasks.
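
The abstract does not include the algorithm itself, so the following is a minimal, hypothetical sketch of the underlying idea: for binary sources voting on a latent label, the inverse covariance of the source outputs decomposes into a sparse part (the dependency structure among the sources) plus a low-rank part induced by marginalizing out the unobserved true label. The sketch recovers the sparse part with a generic robust PCA step (principal component pursuit solved by ADMM) on synthetic votes; the function names, parameter choices, and synthetic setup are illustrative assumptions, not the authors' released implementation.

```python
# Hypothetical sketch: recover the dependency structure among weak sources by
# splitting the inverse covariance of their votes into sparse + low-rank parts
# via a standard robust PCA (principal component pursuit) ADMM loop.
# The synthetic data, parameter choices, and thresholds are illustrative.
import numpy as np


def soft_threshold(X, tau):
    """Entrywise soft-thresholding (prox of the l1 norm)."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)


def svd_soft_threshold(X, tau):
    """Singular-value soft-thresholding (prox of the nuclear norm)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt


def robust_pca(M, lam=None, mu=None, n_iter=500, tol=1e-7):
    """Split M into low-rank L plus sparse S (principal component pursuit)."""
    m = M.shape[0]
    lam = 1.0 / np.sqrt(m) if lam is None else lam
    mu = m * m / (4.0 * np.abs(M).sum()) if mu is None else mu
    S = np.zeros_like(M)
    Y = np.zeros_like(M)
    for _ in range(n_iter):
        L = svd_soft_threshold(M - S + Y / mu, 1.0 / mu)
        S = soft_threshold(M - L + Y / mu, lam / mu)
        resid = M - L - S
        Y = Y + mu * resid
        if np.linalg.norm(resid) <= tol * np.linalg.norm(M):
            break
    return L, S


def learn_dependency_edges(votes, edge_tol=0.5):
    """Read dependency edges off the sparse part of the inverse covariance."""
    m = votes.shape[1]
    cov = np.cov(votes, rowvar=False) + 1e-6 * np.eye(m)
    _, S = robust_pca(np.linalg.inv(cov))
    S_off = np.abs(S - np.diag(np.diag(S)))
    return {(i, j) for i in range(m) for j in range(i + 1, m)
            if S_off[i, j] > edge_tol}


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, m = 20000, 6
    y = rng.choice([-1, 1], size=n)                       # latent true label
    # each source agrees with y 75% of the time, independently...
    votes = np.where(rng.random((n, m)) < 0.75, y[:, None], -y[:, None])
    # ...except source 1, which mostly copies source 0 (a dependency edge)
    copy = rng.random(n) < 0.9
    votes[copy, 1] = votes[copy, 0]
    print(learn_dependency_edges(votes))   # the pair (0, 1) should stand out
```

The point of reading dependencies off the sparse component, as the abstract suggests, is that the number of unlabeled samples needed can then track the sparsity of the dependency graph rather than growing linearly in the number of sources m.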


Related research

- Multi-Resolution Weak Supervision for Sequential Data (10/21/2019)
  Since manually labeling training data is slow and expensive, recent indu...
- Inferring Generative Model Structure with Static Analysis (09/07/2017)
  Obtaining enough labeled data to robustly train complex discriminative m...
- Learning the Structure of Generative Models without Labeled Data (03/02/2017)
  Curating labeled training data has become the primary bottleneck in mach...
- Weak Supervision with Incremental Source Accuracy Estimation (05/11/2022)
  Motivated by the desire to generate labels for real-time data we develop...
- WRENCH: A Comprehensive Benchmark for Weak Supervision (09/23/2021)
  Recent Weak Supervision (WS) approaches have had widespread success in e...
- Continuation Methods for Mixing Heterogenous Sources (12/12/2012)
  A number of modern learning tasks involve estimation from heterogeneous ...
- End-to-End Weak Supervision (07/05/2021)
  Aggregating multiple sources of weak supervision (WS) can ease the data-...
