Using photonic reservoirs as preprocessors for deep neural networks


Abstract

Artificial neural networks are time-consuming and energy-intensive to train, especially as the network is scaled up in an attempt to improve performance. In this paper, we propose to preprocess the input data of a deep neural network using a reservoir, a concept originally introduced in the framework of reservoir computing. The key idea is to use such a reservoir to transform the input data into a state in a higher-dimensional state space, which allows the deep neural network to process the data with improved performance. We focus on photonic reservoirs because of their fast computation times and low energy consumption. Based on numerical simulations of delay-based reservoirs using a semiconductor laser, we show that such preprocessed data improves the performance of deep neural networks. Furthermore, we show that the parameters of the preprocessing reservoir do not need to be carefully fine-tuned.
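The core idea, expanding each input sample into a higher-dimensional reservoir state before it reaches the network, can be sketched in software. The snippet below is a minimal, purely illustrative surrogate: it replaces the semiconductor-laser delay reservoir simulated in the paper with a simple tanh node plus virtual-node masking, and the function name and parameters (n_virtual, alpha, beta) are assumptions rather than the paper's actual model or settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def delay_reservoir_preprocess(u, n_virtual=50, alpha=0.8, beta=0.5):
    """Expand a 1-D input sequence into reservoir states.

    Simplified discrete-time surrogate of a delay-based reservoir:
    each input sample is multiplied by a fixed random mask and fed to
    a single nonlinear node whose delayed response defines `n_virtual`
    virtual nodes. The laser rate equations used in the paper are
    replaced here by a tanh nonlinearity for illustration only.
    """
    mask = rng.uniform(-1.0, 1.0, n_virtual)   # fixed input mask
    states = np.zeros((len(u), n_virtual))
    prev = np.zeros(n_virtual)                 # virtual-node states one delay earlier
    for t, u_t in enumerate(u):
        x = np.empty(n_virtual)
        for i in range(n_virtual):
            # coupling to the neighbouring virtual node along the delay line
            feedback = x[i - 1] if i > 0 else prev[-1]
            x[i] = np.tanh(alpha * feedback + beta * mask[i] * u_t)
        states[t] = x
        prev = x
    return states

# Example: preprocess a toy signal; the expanded states would then be
# fed to a deep neural network in place of the raw input.
u = np.sin(np.linspace(0, 8 * np.pi, 200))
features = delay_reservoir_preprocess(u)       # shape (200, 50): higher-dimensional
print(features.shape)
```

The reservoir's weights (here, the random mask and the feedback strength) stay fixed, so only the downstream network is trained; that separation is what makes the reservoir usable as a cheap, untrained preprocessing stage.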

Cite

APA

Bauwens, I., Van der Sande, G., Bienstman, P., & Verschaffelt, G. (2022). Using photonic reservoirs as preprocessors for deep neural networks. Frontiers in Physics, 10. https://doi.org/10.3389/fphy.2022.1051941
