Memristive Stochastic Computing for Deep Learning Parameter Optimization
Abstract
Stochastic Computing (SC) is a computing paradigm that allows for the low-cost and low-power computation of various arithmetic operations using stochastic bit streams and digital logic. In contrast to conventional representation schemes used within the binary domain, the sequence of bit streams in the stochastic domain is inconsequential, and computation is usually non-deterministic. In this brief, we exploit the stochasticity during switching of probabilistic Conductive Bridging RAM (CBRAM) devices to efficiently generate stochastic bit streams in order to perform Deep Learning (DL) parameter optimization, reducing the size of Multiply and Accumulate (MAC) units by 5 orders of magnitude. We demonstrate that, using a 40-nm Complementary Metal Oxide Semiconductor (CMOS) process, our scalable architecture occupies 1.55mm$^2$ and consumes approximately 167$\mu$W when optimizing parameters of a Convolutional Neural Network (CNN) while it is being trained for a character recognition task, observing no notable reduction in accuracy post-training.
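To illustrate the SC principle behind the compact MAC units mentioned above, the sketch below shows the standard unipolar SC multiplication scheme: two values in [0, 1] are encoded as independent random bit streams whose density of 1s equals the value, and a single AND gate per bit approximates their product. This is a generic, hedged illustration of the paradigm, not the paper's CBRAM-based generator; the function names and stream length are illustrative assumptions.

```python
import random

def to_stream(p, length, rng):
    # Unipolar SC encoding (illustrative): each bit is 1 with probability p,
    # so the mean of the stream approximates the value p in [0, 1].
    return [1 if rng.random() < p else 0 for _ in range(length)]

def sc_multiply(s1, s2):
    # Bitwise AND of two independent unipolar streams: a bit of the output
    # is 1 only when both input bits are 1, which happens with probability
    # p1 * p2, so the output stream's mean approximates the product.
    anded = [a & b for a, b in zip(s1, s2)]
    return sum(anded) / len(anded)

rng = random.Random(42)   # seed chosen arbitrarily for repeatability
N = 100_000               # longer streams give lower-variance estimates
p1, p2 = 0.6, 0.5
est = sc_multiply(to_stream(p1, N, rng), to_stream(p2, N, rng))
print(est)  # close to 0.6 * 0.5 = 0.30
```

Because multiplication reduces to a single logic gate per bit (rather than a full binary multiplier), SC hardware can be orders of magnitude smaller, which is the property the brief exploits; the trade-off is that accuracy grows only with stream length.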
 Publication:

arXiv e-prints
 Pub Date:
 March 2021
 arXiv:
 arXiv:2103.06506
 Bibcode:
 2021arXiv210306506L
 Keywords:

 Computer Science - Emerging Technologies;
 Computer Science - Artificial Intelligence;
 Computer Science - Hardware Architecture;
 Computer Science - Machine Learning
 E-Print:
 Accepted by IEEE Transactions on Circuits and Systems Part II: Express Briefs