Fallback Variable History NNLMs: Efficient NNLMs by precomputation and stochastic training.
2018
This paper presents a new method to reduce the computational cost of using Neural Networks as Language Models during recognition in some particular scenarios. It is based on a Neural Network that considers input contexts of different lengths in order to enable a fallback mechanism together with the precomputation of softmax normalization constants for these inputs. The proposed approach is empirically validated, showing its capability to emulate lower-order N-grams with a single Neural Network. A machine translation task shows that the proposed model is a good solution to the normalization cost of the output softmax layer of Neural Networks in some practical cases, improving system speed without a significant impact on performance.
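As a rough illustration of the idea described in the abstract, the sketch below (not the authors' code; the toy scorer, vocabulary, and function names are invented for this example) precomputes softmax normalization constants for a fixed set of histories of different lengths and falls back to a shorter history at query time when the full one has no precomputed constant.

```python
# Illustrative sketch only: a toy context scorer with precomputed softmax
# normalization constants and an n-gram-style fallback to shorter histories.
import numpy as np

rng = np.random.default_rng(0)

VOCAB = ["<s>", "the", "cat", "sat", "on", "mat", "</s>"]
V, DIM = len(VOCAB), 16

# Random embeddings and output weights stand in for a trained network.
emb = rng.normal(size=(V, DIM))
out_w = rng.normal(size=(V, DIM))
word2id = {w: i for i, w in enumerate(VOCAB)}


def hidden(history):
    """Encode a (possibly empty) history as the mean of its word embeddings."""
    if not history:
        return np.zeros(DIM)
    ids = [word2id[w] for w in history]
    return emb[ids].mean(axis=0)


def logits(history):
    return out_w @ hidden(history)


# Precompute log-normalization constants for all history suffixes (orders 0..2)
# observed in a tiny corpus; at recognition time these are simple lookups.
corpus = ["<s>", "the", "cat", "sat", "on", "the", "mat", "</s>"]
MAX_ORDER = 3
log_Z = {}
for i in range(len(corpus)):
    for n in range(MAX_ORDER):
        h = tuple(corpus[max(0, i - n):i])
        if h not in log_Z:
            s = logits(list(h))
            log_Z[h] = np.log(np.exp(s - s.max()).sum()) + s.max()


def log_prob(word, history):
    """Fall back to shorter histories until a precomputed constant is found."""
    h = tuple(history[-(MAX_ORDER - 1):])
    while h not in log_Z:   # fallback: drop the oldest word
        h = h[1:]
    s = logits(list(h))
    return s[word2id[word]] - log_Z[h]


print(log_prob("mat", ["the", "cat", "sat", "on", "the"]))
```

In this toy setup the empty history always has a precomputed constant, so the fallback loop terminates; the actual paper's training and normalization scheme should be taken from the publication itself.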
Reference Key | zamoramartnez2018fallbackplos |
---|---|
Authors | Zamora-Martínez, Francisco J.; España-Boquera, Salvador; Castro-Bleda, Maria Jose; Palacios-Corella, Adrian |
Journal | PloS one |
Year | 2018 |
DOI | DOI not found |
URL | |
Keywords | Keywords not found |