You Already Have It: A Generator-Free Low-Precision DNN Training Framework Using Stochastic Rounding

Abstract

Stochastic rounding is a critical technique in low-precision deep neural network (DNN) training for preserving model accuracy. However, it requires a large number of random numbers to be generated on the fly, which is not a trivial task on hardware platforms such as FPGAs and ASICs. The widely used solution is to add dedicated random number generators, at extra hardware cost. In this paper, we propose to exploit the stochastic nature of the DNN training process itself and extract random numbers directly from DNNs in a self-sufficient manner. We develop methods to obtain random numbers from different sources in neural networks and build a generator-free framework for low-precision DNN training on a variety of deep learning tasks. Moreover, we evaluate the quality of the extracted random numbers and find that high-quality random numbers are widely available in DNNs; their quality is even high enough to pass the NIST test suite.
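For context, below is a minimal NumPy sketch of the stochastic rounding operation the paper builds on. The function name, bit width, and use of NumPy's default generator are illustrative assumptions, not the authors' code; a generator-free implementation in the spirit of the paper would replace the explicit `rng` call with random bits harvested from the network itself.

```python
import numpy as np

def stochastic_round(x, num_bits=8, rng=None):
    """Stochastically round x to a fixed-point grid on [-1, 1).

    Each value is rounded up with probability equal to its fractional
    distance from the lower grid point, so rounding is unbiased in
    expectation (up to clipping): E[stochastic_round(x)] == x.
    """
    rng = rng or np.random.default_rng()
    scale = 2.0 ** (num_bits - 1)           # grid spacing is 1/scale
    scaled = np.clip(np.asarray(x) * scale, -scale, scale - 1)
    lower = np.floor(scaled)
    frac = scaled - lower                   # distance to the lower grid point
    # Round up with probability `frac`. This comparison is where on-the-fly
    # random numbers are consumed; the paper's framework sources these bits
    # from the DNN itself instead of a hardware generator.
    return (lower + (rng.random(scaled.shape) < frac)) / scale

# Usage: quantize a tensor of gradients to 8-bit fixed point.
grads = np.random.randn(4, 4).astype(np.float32) * 0.1
grads_q = stochastic_round(grads, num_bits=8)
```

Because the rounding decision is a simple comparison against a uniform random value, any sufficiently uniform bit source can drive it, which is what makes replacing the generator with randomness extracted from the network plausible.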

Citation (APA)

Yuan, G., Chang, S. E., Jin, Q., Lu, A., Li, Y., Wu, Y., … Wang, Y. (2022). You Already Have It: A Generator-Free Low-Precision DNN Training Framework Using Stochastic Rounding. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 13672 LNCS, pp. 34–51). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-19775-8_3
