On the effects of recursive convolutional layers in convolutional neural networks


This article is free to access.

Abstract

The Recursive Convolutional Layer (RCL) is a module that wraps a recursive feedback loop around a convolutional layer (CL). The RCL has been proposed to address some of the shortcomings of Convolutional Neural Networks (CNNs), as its unfolding increases the depth of a network without increasing the number of weights. We investigated the "naïve" substitution of CLs with RCLs on three base models (a 4-CL model, ResNet, and DenseNet) and their RCL-ized versions (C-FRPN, R-ResNet, and R-DenseNet) using five image classification datasets. We find that this one-to-one replacement significantly improves the performance of the 4-CL model, but not that of ResNet or DenseNet. This led us to investigate the implications of the RCL substitution on the 4-CL model, which reveals, among a number of properties, that RCLs are particularly efficient in shallow CNNs. We proceeded to revisit the first set of experiments by gradually transforming the 4-CL model and the C-FRPN into ResNet and R-ResNet, respectively, and find that the performance improvement is largely driven by the training regime, whereas any depth increase negatively impacts the RCL-ized version. We conclude that the replacement of CLs by RCLs shows great potential for designing high-performance shallow CNNs.
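The core idea of the abstract, that unfolding a feedback loop around a convolutional layer adds effective depth while the weights stay shared, can be sketched as follows. This is a minimal NumPy illustration, not the paper's exact formulation: the 3x3 kernel, the "same" padding, the ReLU activation, the input skip term, and the number of unfolding steps are all assumptions made for the example.

```python
# Hypothetical RCL sketch: the SAME kernel is reapplied at every unfolding
# step, so effective depth grows while the weight count stays fixed.
import numpy as np

def conv2d_same(x, kernel):
    """Naive single-channel 2D 'same' cross-correlation."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.zeros_like(x, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * kernel)
    return out

def recursive_conv_layer(x, kernel, steps=3):
    """Unfold the feedback loop: feed the output back through the same kernel.

    Each step deepens the computation, but no new weights are introduced
    because `kernel` is shared across all steps (assumed form:
    h_t = ReLU(conv(h_{t-1}) + x), with h_0 = x).
    """
    h = x
    for _ in range(steps):
        h = np.maximum(conv2d_same(h, kernel) + x, 0.0)
    return h

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8))
kernel = rng.standard_normal((3, 3)) * 0.1  # 9 weights, regardless of depth
y1 = recursive_conv_layer(x, kernel, steps=1)
y3 = recursive_conv_layer(x, kernel, steps=3)
print(y1.shape, y3.shape)  # prints: (8, 8) (8, 8)
```

Unfolding from one to three steps changes the function computed (y1 != y3 in general) without changing the nine-weight parameter count, which is the property the abstract attributes to the RCL.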


References

- He, K., Zhang, X., Ren, S., & Sun, J. (2016). Deep residual learning for image recognition. CVPR.
- Deng, J., Dong, W., Socher, R., Li, L.-J., Li, K., & Fei-Fei, L. (2009). ImageNet: A large-scale hierarchical image database. CVPR.
- Szegedy, C., et al. (2015). Going deeper with convolutions. CVPR.


Citation (APA)

Chagnon, J., Hagenbuchner, M., Tsoi, A. C., & Scarselli, F. (2024). On the effects of recursive convolutional layers in convolutional neural networks. Neurocomputing, 591. https://doi.org/10.1016/j.neucom.2024.127767

