Dictionary Learning with BLOTLESS Update

06/24/2019
by Qi Yu, et al.

Algorithms for learning a dictionary under which the data in a given set have sparse expansions typically alternate between a sparse coding stage and a dictionary update stage. Methods for dictionary update aim to minimise the expansion error by updating the dictionary vectors and expansion coefficients, given the patterns of non-zero coefficients obtained in the sparse coding stage. We propose a block total least squares (BLOTLESS) algorithm for dictionary update. BLOTLESS updates a block of dictionary elements and the corresponding sparse coefficients simultaneously. In the error-free case, three necessary conditions for exact recovery are identified. Lower bounds on the number of training data are established so that the necessary conditions hold with high probability. Numerical simulations show that the bounds closely approximate the number of training data needed for exact dictionary recovery. Numerical experiments further demonstrate several benefits of dictionary learning with BLOTLESS update compared with state-of-the-art algorithms, especially when the amount of training data is small.

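To make the alternating structure described above concrete, below is a minimal sketch of a dictionary-learning loop with a block-wise joint update of dictionary atoms and their non-zero coefficients. It is not the authors' BLOTLESS (total least squares) formulation: the block update here uses a few alternating least-squares passes over the fixed support as a simple stand-in, only to illustrate the overall sparse-coding / block-update alternation. All function names, parameters, and defaults are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import orthogonal_mp


def sparse_coding(Y, D, n_nonzero):
    """Sparse coding stage: fix D, estimate sparse coefficients column by column (OMP)."""
    X = orthogonal_mp(D, Y, n_nonzero_coefs=n_nonzero)
    return X if X.ndim == 2 else X[:, None]


def block_update(Y, D, X, block, n_inner=3):
    """Refine the atoms in `block` and their non-zero coefficients together,
    keeping the support pattern from the sparse coding stage fixed.
    (Alternating least squares stand-in for the joint block update.)"""
    Bc = np.setdiff1d(np.arange(D.shape[1]), block)
    E = Y - D[:, Bc] @ X[Bc, :]                      # residual the block must explain
    for _ in range(n_inner):
        # Coefficient pass: re-fit non-zero entries of each signal on its fixed support.
        for j in range(Y.shape[1]):
            S = block[X[block, j] != 0]
            if S.size:
                X[S, j] = np.linalg.lstsq(D[:, S], E[:, j], rcond=None)[0]
        # Atom pass: least-squares update of the used atoms in the block.
        Xb = X[block, :]
        used = np.any(Xb != 0, axis=1)
        if used.any():
            D[:, block[used]] = np.linalg.lstsq(Xb[used].T, E.T, rcond=None)[0].T
            # Renormalise atoms and rescale coefficients so the product is unchanged.
            norms = np.linalg.norm(D[:, block[used]], axis=0, keepdims=True) + 1e-12
            D[:, block[used]] /= norms
            X[block[used], :] *= norms.T
    return D, X


def dictionary_learning(Y, n_atoms=32, n_nonzero=4, block_size=8, n_iter=20, seed=0):
    """Alternate sparse coding and block-wise dictionary update."""
    rng = np.random.default_rng(seed)
    D = rng.standard_normal((Y.shape[0], n_atoms))
    D /= np.linalg.norm(D, axis=0, keepdims=True)
    for _ in range(n_iter):
        X = sparse_coding(Y, D, n_nonzero)
        for start in range(0, n_atoms, block_size):
            block = np.arange(start, min(start + block_size, n_atoms))
            D, X = block_update(Y, D, X, block)
    return D, X


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    Y = rng.standard_normal((16, 200))               # toy training data
    D, X = dictionary_learning(Y)
    print("relative reconstruction error:", np.linalg.norm(Y - D @ X) / np.linalg.norm(Y))
```

The block size trades off between per-atom updates (block size 1, as in K-SVD-style schemes) and updating the whole dictionary at once; the paper's point is that jointly updating a block of atoms and their coefficients can recover the dictionary from fewer training signals.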