Estimation from Indirect Supervision with Linear Moments

08/10/2016
by Aditi Raghunathan et al.

In structured prediction problems where we have indirect supervision of the output, maximum marginal likelihood faces two computational obstacles: non-convexity of the objective and intractability of even a single gradient computation. In this paper, we bypass both obstacles for a class of what we call linear indirectly-supervised problems. Our approach is simple: we solve a linear system to estimate sufficient statistics of the model, which we then use to estimate parameters via convex optimization. We analyze the statistical properties of our approach and show empirically that it is effective in two settings: learning with local privacy constraints and learning from low-cost count-based annotations.
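To make the two-step recipe in the abstract concrete, here is a minimal sketch in Python: recover the moment vector of sufficient statistics by solving a linear system that links it to indirectly observed measurements, then estimate exponential-family parameters by convex moment matching. The toy categorical model, the measurement matrix `A`, and all variable names are illustrative assumptions, not the paper's actual estimator.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy model: a single categorical variable y in {0, ..., K-1} with one-hot
# sufficient statistics, so the moment vector mu = E[phi(Y)] is just the
# class-probability vector.
K = 4

# Indirect supervision: instead of mu itself we observe linear measurements
# A @ mu (e.g., aggregated or privatized counts). A is an assumed-known map.
A = rng.normal(size=(K, K))
true_theta = rng.normal(size=K)
true_mu = np.exp(true_theta) / np.exp(true_theta).sum()
observed = A @ true_mu + 0.01 * rng.normal(size=K)   # noisy indirect moments

# Step 1: estimate the sufficient statistics by solving a linear system.
mu_hat, *_ = np.linalg.lstsq(A, observed, rcond=None)
mu_hat = np.clip(mu_hat, 1e-6, None)
mu_hat /= mu_hat.sum()                # guard: keep the estimate on the simplex

# Step 2: convex optimization -- maximize theta @ mu_hat - log Z(theta),
# i.e. fit exponential-family parameters whose model moments match mu_hat.
def neg_objective(theta):
    log_z = np.log(np.exp(theta).sum())
    return -(theta @ mu_hat) + log_z

theta_hat = minimize(neg_objective, np.zeros(K)).x
print("estimated moments:", np.round(mu_hat, 3))
print("true moments     :", np.round(true_mu, 3))
```

Because the second-stage objective is the (concave) exponential-family log-likelihood in natural parameters, the optimization is convex regardless of how the moments were obtained, which is the computational point the abstract makes.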
