Energy-efficient Amortized Inference with Cascaded Deep Classifiers

Deep neural networks have been remarkably successful across many AI tasks, but they often incur high computation and energy costs, which is a problem for energy-constrained applications such as mobile sensing. We address this by proposing a novel framework that optimizes prediction accuracy and energy cost simultaneously, enabling an effective cost-accuracy trade-off at test time. In our framework, each data instance is pushed through a cascade of deep neural networks of increasing size, and a selection module sequentially determines when a sufficiently accurate classifier has been reached for that instance. The cascade of neural networks and the selection module are jointly trained end-to-end with the REINFORCE algorithm to optimize a trade-off between computational cost and predictive accuracy. Our method improves accuracy and efficiency at the same time by learning to assign easy instances to fast yet sufficiently accurate classifiers, saving computation and energy, while routing harder instances to deeper, more powerful classifiers to maintain satisfactory accuracy. In extensive experiments on several image classification datasets with cascaded ResNet classifiers, our method outperforms standard well-trained ResNets in accuracy while requiring less than 20% and 50% of the FLOPs on CIFAR-10 and CIFAR-100, respectively, and 66% on ImageNet.
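The test-time behavior described above can be sketched in a few lines. This is a minimal, hypothetical illustration only: in the paper the selection module is a learned network trained jointly with the cascade via REINFORCE, whereas here a fixed confidence threshold and toy stand-in classifiers take its place.

```python
def cascaded_predict(x, classifiers, selectors, threshold=0.5):
    """Run instance x through classifiers of increasing size; after each
    stage a selector scores the current prediction, and the cascade exits
    as soon as the score clears the threshold (or at the final stage)."""
    for i, (clf, sel) in enumerate(zip(classifiers, selectors)):
        probs = clf(x)  # class-probability list from the stage-i classifier
        # Exit early when the selector is confident, or at the last stage.
        if i == len(classifiers) - 1 or sel(x, probs) >= threshold:
            return probs.index(max(probs)), i  # (predicted class, exit stage)

# Hypothetical stand-ins: a cheap low-confidence model, a large
# high-confidence model, and max-probability as the selector score.
small = lambda x: [0.55, 0.45]
large = lambda x: [0.95, 0.05]
conf = lambda x, p: max(p)

label, exit_stage = cascaded_predict(None, [small, large], [conf, conf],
                                     threshold=0.9)
```

With threshold=0.9 the small model's 0.55 confidence does not clear the bar, so the instance falls through to the large model (exit stage 1); with threshold=0.5 it exits at stage 0, saving the larger model's compute. This is exactly the amortization the paper targets: easy instances pay for a small network, hard instances for the full cascade.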
by Jiaqi Guan, Yang Liu, Qiang Liu, Jian Peng
https://arxiv.org/pdf/1710.03368v1.pdf