Dictionary Learning with BLOTLESS Update

by Qi Yu et al.

Algorithms for learning a dictionary under which data in a given set have sparse expansions typically alternate between sparse coding and dictionary update stages. Methods for dictionary update aim to minimise the expansion error by updating dictionary vectors and expansion coefficients given the patterns of non-zero coefficients obtained in the sparse coding stage. We propose a block total least squares (BLOTLESS) algorithm for dictionary update. BLOTLESS updates a block of dictionary elements and the corresponding sparse coefficients simultaneously. In the error-free case, three necessary conditions for exact recovery are identified. Lower bounds on the number of training data are established so that the necessary conditions hold with high probability. Numerical simulations show that the bounds closely approximate the number of training data needed for exact dictionary recovery. Numerical experiments further demonstrate several benefits of dictionary learning with BLOTLESS update compared with state-of-the-art algorithms, especially when the amount of training data is small.
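The alternation described above can be sketched in a few lines. The toy code below is only illustrative: sparse coding is done by hard thresholding with a least-squares refit on the selected support, and the dictionary update is a plain least-squares fit given the support pattern, not the BLOTLESS block total least squares update from the paper. All dimensions and parameter names are made up for the example.

```python
import numpy as np

# Illustrative sketch of alternating dictionary learning (NOT the
# BLOTLESS update itself): sparse coding by hard thresholding plus a
# least-squares refit, then a least-squares dictionary update.
rng = np.random.default_rng(0)
n, m, N, k = 16, 32, 200, 3  # signal dim, #atoms, #samples, sparsity

# Synthetic training data Y generated by a ground-truth dictionary.
D_true = rng.standard_normal((n, m))
D_true /= np.linalg.norm(D_true, axis=0)
X_true = np.zeros((m, N))
for j in range(N):
    idx = rng.choice(m, k, replace=False)
    X_true[idx, j] = rng.standard_normal(k)
Y = D_true @ X_true


def sparse_code(D, Y, k):
    """Keep the k atoms most correlated with each sample, then refit
    the coefficients on that support by least squares."""
    X = np.zeros((D.shape[1], Y.shape[1]))
    C = D.T @ Y
    for j in range(Y.shape[1]):
        idx = np.argsort(-np.abs(C[:, j]))[:k]
        X[idx, j] = np.linalg.lstsq(D[:, idx], Y[:, j], rcond=None)[0]
    return X


# Random initial dictionary with unit-norm atoms.
D = rng.standard_normal((n, m))
D /= np.linalg.norm(D, axis=0)

for it in range(30):
    X = sparse_code(D, Y, k)          # sparse coding stage
    D = Y @ np.linalg.pinv(X)         # dictionary update stage
    D /= np.linalg.norm(D, axis=0) + 1e-12  # renormalise atoms

X = sparse_code(D, Y, k)              # final coding pass for evaluation
err = np.linalg.norm(Y - D @ X) / np.linalg.norm(Y)
```

The point of the sketch is the structure of the loop: each iteration fixes one set of variables and solves for the other, whereas BLOTLESS updates a block of atoms and their coefficients jointly.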


