Machine Learning

Tensor Regression Networks with various Low-Rank Tensor Approximations



  • arXiv

    Tensor regression networks achieve a high compression rate of model parameters in multilayer perceptrons (MLPs) while having only a slight impact on performance. A tensor regression layer imposes low-rank constraints on the regression weight tensor and replaces the flattening operation of a traditional MLP. We investigate tensor regression networks using various low-rank tensor approximations, aiming to leverage the multi-modal structure of high-dimensional data by enforcing efficient low-rank constraints. We provide a theoretical analysis giving insights on the choice of the rank parameters. We evaluate the performance of the proposed models against state-of-the-art deep convolutional models. On the CIFAR-10 dataset, we achieve a compression rate of 0.018 while sacrificing less than 1% of accuracy.
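    To make the idea concrete, here is a minimal sketch of a tensor regression layer whose weight tensor is constrained to low rank via a CP (rank-one sum) factorization, one of the low-rank approximations in this family. This is an illustrative numpy forward pass, not the authors' implementation; the function name, shapes, and rank `R` are assumptions for the example. The key point is that the full weight tensor is never materialized, which is where the parameter compression comes from.

    ```python
    import numpy as np

    def cp_regression_forward(x, factors, bias):
        """Forward pass of a tensor regression layer with a CP-factorized
        weight tensor (illustrative sketch, not the paper's code).

        x       : activation tensor of shape (batch, d1, d2, d3)
        factors : [A1 (d1, R), A2 (d2, R), A3 (d3, R), C (num_classes, R)]
        bias    : (num_classes,)

        The implicit full weight tensor W has shape (d1, d2, d3, num_classes)
        with W[i,j,k,c] = sum_r A1[i,r] * A2[j,r] * A3[k,r] * C[c,r].
        Instead of flattening x and storing W, we contract each mode of x
        with its factor matrix, giving R rank-one responses per sample.
        """
        a1, a2, a3, c = factors
        # contract the three data modes against their factors: (batch, R)
        z = np.einsum('bijk,ir,jr,kr->br', x, a1, a2, a3)
        # mix the R rank-one responses into class scores: (batch, num_classes)
        return z @ c.T + bias
    ```

    With this parameterization the layer stores R * (d1 + d2 + d3 + num_classes) parameters instead of the d1 * d2 * d3 * num_classes weights a flattening MLP layer would need, which is the kind of compression rate the abstract quantifies.
    
    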

    Tensor Regression Networks with various Low-Rank Tensor Approximations
    by Xingwei Cao, Guillaume Rabusseau, Joelle Pineau
    https://arxiv.org/pdf/1712.09520v1.pdf
