Parallel Neural Local Lossless Compression

01/13/2022
by Mingtian Zhang, et al.

The recently proposed Neural Local Lossless Compression (NeLLoC), which is based on a local autoregressive model, has achieved state-of-the-art (SOTA) out-of-distribution (OOD) generalization performance in the image compression task. Besides encouraging OOD generalization, the local model also allows parallel inference in the decoding stage. In this paper, we propose a parallelization scheme for local autoregressive models. We discuss the practicalities of implementing this scheme, and provide experimental evidence of significant gains in compression runtime compared to the previous, non-parallel implementation.
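The key observation behind such parallelization is that, when each pixel depends only on a bounded local context, pixels whose contexts are already decoded can be processed simultaneously. As a minimal sketch (the exact scheme in the paper may differ), assume pixel (i, j) depends only on pixels within a causal local window of radius h; a diagonal "wavefront" schedule then groups independent pixels into waves:

```python
# Wavefront schedule for a local autoregressive model (illustrative sketch).
# Assumption (not the paper's exact formulation): pixel (i, j) depends only
# on already-decoded pixels in a causal window of radius h, i.e. rows
# i-h..i-1 (columns j-h..j+h) and columns j-h..j-1 of row i. All pixels
# sharing the same wavefront index can then be decoded in parallel.

def wavefront_index(i, j, h):
    """Time step at which pixel (i, j) can be decoded."""
    return (h + 1) * i + j

def decoding_schedule(height, width, h):
    """Group pixels into waves; pixels within one wave are independent."""
    waves = {}
    for i in range(height):
        for j in range(width):
            waves.setdefault(wavefront_index(i, j, h), []).append((i, j))
    return [waves[t] for t in sorted(waves)]

if __name__ == "__main__":
    sched = decoding_schedule(4, 4, h=1)
    # Sanity check: every dependency of a pixel lies in an earlier wave.
    step = {p: t for t, wave in enumerate(sched) for p in wave}
    for (i, j), t in step.items():
        for (di, dj) in [(-1, -1), (-1, 0), (-1, 1), (0, -1)]:
            q = (i + di, j + dj)
            if q in step:
                assert step[q] < t
    print(f"{len(sched)} waves instead of {4 * 4} sequential steps")
```

For a fixed context radius h, the number of waves grows as O(H + W) rather than the O(H x W) steps of fully sequential raster-scan decoding, which is where the runtime gain comes from.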
