Faster-than-fast NMF using random projections and Nesterov iterations

12/11/2018
by Farouk Yahaya, et al.

Random projections have recently been applied to Nonnegative Matrix Factorization (NMF) to speed up NMF computations, with a negligible loss of performance. In this paper, we investigate the effects of such projections when the NMF technique uses the fast Nesterov gradient descent (NeNMF). We experimentally show that randomized subspace iterations significantly speed up NeNMF.
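The abstract describes compressing the NMF data matrix with structured random projections obtained by a randomized subspace iteration before running the Nesterov-accelerated updates. Below is a minimal NumPy sketch of that compression step only; it is not the authors' code, and the names randomized_range_finder, oversample, and n_power_iter are illustrative assumptions.

```python
# Minimal sketch (assumed, not the authors' implementation): build compressed
# versions of a data matrix X (m x n) with a randomized subspace iteration,
# so that an NMF solver such as NeNMF can work on much smaller matrices.
import numpy as np

def randomized_range_finder(A, rank, oversample=10, n_power_iter=4, seed=None):
    """Return an orthonormal basis Q whose range approximates the range of A,
    using a Gaussian projection refined by a few power (subspace) iterations."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    Omega = rng.standard_normal((n, rank + oversample))  # random test matrix
    Q, _ = np.linalg.qr(A @ Omega)                       # initial range estimate
    for _ in range(n_power_iter):                        # subspace iterations
        Q, _ = np.linalg.qr(A.T @ Q)
        Q, _ = np.linalg.qr(A @ Q)
    return Q

# Hypothetical usage on toy nonnegative data: compress X on both sides.
rng = np.random.default_rng(0)
X = np.abs(rng.standard_normal((2000, 1500)))  # toy nonnegative data matrix
r = 10                                         # target NMF rank
L = randomized_range_finder(X, r)              # left basis, shape m x (r + p)
R = randomized_range_finder(X.T, r)            # right basis, shape n x (r + p)
X_L = L.T @ X                                  # small matrix for the H update
X_R = X @ R                                    # small matrix for the W update
print(X_L.shape, X_R.shape)
```

In a compressed NMF loop, the small matrices L.T @ X and X @ R would stand in for X in the H and W subproblems, which is where the speed-up over uncompressed NeNMF would come from in this kind of scheme.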


Related research:

11/10/2020 - Gaussian Compression Stream: Principle and Preliminary Results. Random projections became popular tools to process big data. In particul...

06/25/2017 - There and Back Again: A General Approach to Learning Sparse Models. We propose a simple and efficient approach to learning sparse models. Ou...

09/21/2017 - Deep Recurrent NMF for Speech Separation by Unfolding Iterative Thresholding. In this paper, we propose a novel recurrent neural network architecture ...

05/18/2015 - Compressed Nonnegative Matrix Factorization is Fast and Accurate. Nonnegative matrix factorization (NMF) has an established reputation as ...

05/19/2020 - Two-Dimensional Semi-Nonnegative Matrix Factorization for Clustering. In this paper, we propose a new Semi-Nonnegative Matrix Factorization me...

10/18/2016 - Fast L1-NMF for Multiple Parametric Model Estimation. In this work we introduce a comprehensive algorithmic pipeline for multi...

06/13/2023 - Evaluating Bias and Noise Induced by the U.S. Census Bureau's Privacy Protection Methods. The United States Census Bureau faces a difficult trade-off between the ...
