Spatially Arranged Sparse Recurrent Neural Networks for Energy Efficient Associative Memory

Citations of this article: 27
Mendeley readers: 37

This article is free to access.

Abstract

The development of hardware neural networks, including neuromorphic hardware, has accelerated over the past few years. However, it is challenging to operate very large-scale neural networks with low-power hardware devices, partly due to signal transmission through a massive number of interconnections. Our aim is to address the issue of communication cost from an algorithmic viewpoint and to study learning algorithms for energy-efficient information processing. Here, we consider two approaches to finding spatially arranged sparse recurrent neural networks with a high cost-performance ratio for associative memory. In the first approach, following classical methods, we focus on sparse modular network structures inspired by biological brain networks and examine their storage capacity under an iterative learning rule. We show that incorporating long-range intermodule connections into purely modular networks can enhance the cost-performance ratio. In the second approach, we formulate for the first time an optimization problem in which network sparsity is maximized under constraints imposed by a pattern embedding condition. We show that there is a tradeoff between the interconnection cost and the computational performance in the optimized networks, and we demonstrate that the optimized networks can achieve a better cost-performance ratio than those considered in the first approach. We show the effectiveness of the optimization approach mainly using binary patterns and also apply it to gray-scale image restoration. Our results suggest that the presented approaches are useful for seeking sparser and less costly connectivity of neural networks to enhance the energy efficiency of hardware neural networks.
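The first approach described in the abstract can be illustrated with a small, self-contained sketch. The Python example below is not the authors' implementation: the module layout, the long-range rewiring probability `p_long`, and the perceptron-style iterative learning rule are assumptions chosen to match the abstract's description of sparse modular networks with long-range intermodule connections trained by an iterative rule for associative memory.

```python
# A minimal sketch (not the authors' implementation) of a spatially modular,
# sparse Hopfield-type associative memory with long-range intermodule links.
import numpy as np

rng = np.random.default_rng(0)

N = 200          # number of binary (+/-1) neurons
M = 4            # number of modules (assumed spatial arrangement)
P = 10           # number of random patterns to store
p_long = 0.05    # probability of a long-range intermodule connection (assumed)

module = np.repeat(np.arange(M), N // M)           # module label of each neuron

# Sparse structural mask: dense within modules, sparse long-range links between them.
mask = (module[:, None] == module[None, :]).astype(float)
long_range = (rng.random((N, N)) < p_long) & (mask == 0)
mask[long_range] = 1.0
mask = np.triu(mask, 1)
mask = mask + mask.T                                # symmetric, no self-connections

patterns = rng.choice([-1.0, 1.0], size=(P, N))

# Iterative (perceptron-like) learning restricted to the allowed connections:
# whenever a stored pattern is not a stable state, nudge the violating rows.
W = np.zeros((N, N))
eta = 0.1
for _ in range(200):
    stable = True
    for xi in patterns:
        h = W @ xi
        wrong = (xi * h) <= 0                       # embedding condition violated
        if wrong.any():
            stable = False
            W[wrong] += eta * np.outer(xi[wrong], xi) * mask[wrong]
    W = 0.5 * (W + W.T) * mask                      # keep symmetry and sparsity
    np.fill_diagonal(W, 0.0)
    if stable:
        break

# Recall: start from a corrupted pattern and iterate asynchronous updates.
probe = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
probe[flip] *= -1
for _ in range(5):
    for i in rng.permutation(N):
        probe[i] = 1.0 if W[i] @ probe >= 0 else -1.0

print("overlap with stored pattern:", float(probe @ patterns[0]) / N)
print("connection density:", mask.sum() / (N * (N - 1)))
```

Lowering `p_long` reduces the interconnection cost of the network but typically degrades recall, which is the kind of cost-performance tradeoff the paper examines.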
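For the second approach, the paper's exact optimization problem is not reproduced here. As a hypothetical reconstruction, one common relaxation is to minimize the L1 norm of each neuron's incoming weights (Lasso-like) subject to a margin version of the pattern embedding condition, solved row by row as a linear program. All parameter values and the margin formulation below are illustrative assumptions.

```python
# A minimal sketch, assuming the sparsity-maximization problem can be relaxed
# to per-neuron L1 minimization under a margin-kappa embedding constraint.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
N, P, kappa = 60, 6, 1.0                      # illustrative sizes and margin
patterns = rng.choice([-1.0, 1.0], size=(P, N))

W = np.zeros((N, N))
for i in range(N):
    idx = np.delete(np.arange(N), i)          # no self-connection
    X = patterns[:, idx]                      # (P, N-1) presynaptic states
    y = patterns[:, i]                        # (P,) postsynaptic target states
    n = N - 1
    # Variables: [w (n), t (n)] with |w_j| <= t_j; minimize sum(t) (L1 norm of w).
    c = np.concatenate([np.zeros(n), np.ones(n)])
    A_abs = np.block([[np.eye(n), -np.eye(n)],
                      [-np.eye(n), -np.eye(n)]])
    b_abs = np.zeros(2 * n)
    # Embedding condition with margin: y_mu * (w . x_mu) >= kappa for all patterns.
    A_emb = np.hstack([-(y[:, None] * X), np.zeros((P, n))])
    b_emb = -kappa * np.ones(P)
    res = linprog(c,
                  A_ub=np.vstack([A_abs, A_emb]),
                  b_ub=np.concatenate([b_abs, b_emb]),
                  bounds=[(None, None)] * n + [(0, None)] * n,
                  method="highs")
    if res.success:
        W[i, idx] = res.x[:n]

print("fraction of nonzero connections:", float(np.mean(np.abs(W) > 1e-6)))
print("all patterns satisfy the margin:",
      bool(np.all(patterns * (patterns @ W.T) >= kappa - 1e-6)))
```

The resulting weight matrix is typically sparse, and storing more patterns generally requires more nonzero connections, reflecting the tradeoff between interconnection cost and storage performance discussed in the abstract.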




Citation (APA)

Tanaka, G., Nakane, R., Takeuchi, T., Yamane, T., Nakano, D., Katayama, Y., & Hirose, A. (2020). Spatially Arranged Sparse Recurrent Neural Networks for Energy Efficient Associative Memory. IEEE Transactions on Neural Networks and Learning Systems, 31(1), 24–38. https://doi.org/10.1109/TNNLS.2019.2899344


Readers' Seniority

PhD / Post grad / Masters / Doc: 12 (55%)
Professor / Associate Prof.: 5 (23%)
Researcher: 5 (23%)

Readers' Discipline

Engineering: 8 (42%)
Computer Science: 7 (37%)
Physics and Astronomy: 2 (11%)
Mathematics: 2 (11%)
