Anytime Recognition with Routing Convolutional Networks.

Abstract
An automatic trade-off between accuracy and efficiency in a single deep neural network is highly desirable for time-sensitive computer vision applications. To achieve anytime prediction, existing methods embed fixed exits into neural networks and make predictions at those fixed exits for all samples. However, the latest exit reachable within a time budget does not always yield a more accurate prediction than earlier exits for test samples of varying difficulty. Motivated by this observation, we propose to improve anytime prediction accuracy by allowing each sample to adaptively select its own optimal exit within a given time budget. Specifically, we propose a new Routing Convolutional Network (RCN). For any given time budget, it adaptively selects the optimal layer as the exit for a specific test sample following the learned policy of the Q-network at that exit, considering both the potential information gain and the time cost. The exits and the Q-networks are optimized alternately so that they mutually boost each other in the cost-sensitive environment. Beyond anytime image classification, RCN can also be adapted to pixel-wise prediction tasks, e.g., scene parsing. Extensive experimental results on the CIFAR-10, CIFAR-100, ImageNet, and Cityscapes benchmarks demonstrate the efficacy of RCN for anytime recognition.
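The core mechanism the abstract describes, per-sample selection of an exit under a time budget by trading off information gain against time cost, can be illustrated with a minimal sketch. This is not the paper's learned Q-network: the exit costs, the confidence values standing in for information gain, and the trade-off weight `lam` are all illustrative assumptions, and the greedy scoring function is a hypothetical stand-in for the learned policy.

```python
# Minimal sketch of budget-aware exit selection in the spirit of RCN.
# All numbers and the reward form are illustrative assumptions, not the
# paper's actual learned Q-values or architecture.

from dataclasses import dataclass


@dataclass
class Exit:
    name: str
    cost: float        # cumulative inference time to reach this exit
    confidence: float  # per-sample confidence proxy (stand-in for info gain)


def select_exit(exits, budget, lam=1.0):
    """Greedy stand-in for a Q-network policy: among exits whose
    cumulative cost fits the time budget, pick the one maximizing
    confidence minus a time-cost penalty (lam is an assumed
    trade-off weight)."""
    feasible = [e for e in exits if e.cost <= budget]
    if not feasible:
        return None  # budget too small even for the earliest exit
    return max(feasible, key=lambda e: e.confidence - lam * e.cost)


# An "easy" sample: the first exit is already confident, so paying for
# deeper layers is not worth the extra time cost.
easy = [Exit("exit1", 0.1, 0.95), Exit("exit2", 0.3, 0.96), Exit("exit3", 0.6, 0.97)]
print(select_exit(easy, budget=1.0).name)   # -> exit1

# A "hard" sample: confidence rises steeply with depth, so a later
# exit wins even though it costs more time.
hard = [Exit("exit1", 0.1, 0.30), Exit("exit2", 0.3, 0.70), Exit("exit3", 0.6, 0.92)]
print(select_exit(hard, budget=1.0).name)   # -> exit2
```

The two examples show the key point of the abstract: under the same budget, different samples route to different exits, rather than all samples using the single latest exit the budget allows.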
Reference Key
jie2019anytimeieee
Authors Jie, Zequn; Sun, Peng; Li, Xin; Feng, Jiashi; Liu, Wei
Journal IEEE Transactions on Pattern Analysis and Machine Intelligence
Year 2019
DOI
10.1109/TPAMI.2019.2959322