PixelSNAIL: An Improved Autoregressive Generative Model

Autoregressive generative models consistently achieve the best results in density estimation tasks involving high-dimensional data, such as images or audio. They pose density estimation as a sequence modeling task, where a recurrent neural network (RNN) models the conditional distribution over the next element conditioned on all previous elements. In this paradigm, the bottleneck is the extent to which the RNN can model long-range dependencies, and the most successful approaches rely on causal convolutions, which offer better access to earlier parts of the sequence than conventional RNNs. Taking inspiration from recent work in meta-reinforcement learning, where dealing with long-range dependencies is also essential, we introduce a new generative model architecture that combines causal convolutions with self-attention. In this note, we describe the resulting model and present state-of-the-art log-likelihood results on CIFAR-10 (2.85 bits per dim) and $32 \times 32$ ImageNet (3.80 bits per dim). Our implementation is available at https://github.com/neocxi/pixelsnail-public
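To make the two building blocks concrete, here is a minimal numpy sketch of (a) a causal 1-D convolution and (b) masked self-attention, the two components the abstract says PixelSNAIL combines. This is an illustrative toy, not the paper's actual architecture: the function names, the identity query/key/value projections, and the 1-D (rather than 2-D image) setting are all simplifications chosen for clarity. The key property both blocks share is causality: the output at position t depends only on inputs at positions <= t.

```python
import numpy as np

def causal_conv1d(x, w):
    """Causal 1-D convolution: y[t] = sum_i w[i] * x[t - i].
    Left-padding with zeros ensures no access to future positions.
    x: (T,) input sequence; w: (k,) kernel."""
    k = len(w)
    xp = np.concatenate([np.zeros(k - 1), x])  # pad on the left only
    # reversed kernel aligns w[0] with the current timestep x[t]
    return np.array([xp[t:t + k] @ w[::-1] for t in range(len(x))])

def causal_self_attention(x):
    """Masked self-attention: position t attends only to positions <= t.
    Identity projections are used for q/k/v to keep the sketch minimal.
    x: (T, d) feature matrix."""
    T, d = x.shape
    scores = x @ x.T / np.sqrt(d)                     # (T, T) similarity
    future = np.triu(np.ones((T, T), dtype=bool), k=1)
    scores[future] = -np.inf                          # mask out the future
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)     # row-wise softmax
    return weights @ x
```

A quick way to check the causal property: perturb the last element of the input, and outputs at all earlier positions are unchanged. The convolution gives each position a fixed, local receptive field, while the masked attention gives unbounded access to the entire past; combining the two is what lets this family of models handle long-range dependencies.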
by Xi Chen, Nikhil Mishra, Mostafa Rohaninejad, Pieter Abbeel
https://arxiv.org/pdf/1712.09763v1.pdf