Augmentations: An Insight into Their Effectiveness on Convolution Neural Networks

Abstract

Augmentations are a key factor in determining the performance of a neural network, as they give a model a critical edge in boosting its performance. Their ability to improve a model's robustness depends on two factors, namely the model architecture and the type of augmentations. Augmentations are highly dataset-specific, and not every kind of augmentation necessarily has a positive effect on a model's performance. Hence, there is a need to identify augmentations that perform consistently well across a variety of datasets and remain invariant to the type of architecture, the convolutions, and the number of parameters used. This paper evaluates the effect of parameters using 3 × 3 and depth-wise separable convolutions with different augmentation techniques on the MNIST, FMNIST, and CIFAR10 datasets. Statistical evidence shows that techniques such as Cutout and random horizontal flip were consistent on both parametrically low and high architectures. Depth-wise separable convolutions outperformed 3 × 3 convolutions at higher parameter counts due to their ability to create deeper networks. Augmentations bridged the accuracy gap between the 3 × 3 and depth-wise separable convolutions, thus establishing their role in model generalization. At higher parameter counts, augmentations did not produce a significant change in performance. The synergistic effect of multiple augmentations at higher parameter counts, and their antagonistic effect at lower parameter counts, was also evaluated. The work shows that a delicate balance between architectural supremacy and augmentations needs to be struck to enhance a model's performance in any given deep learning task.
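The abstract refers to random horizontal flip, Cutout, and depth-wise separable convolutions. The sketch below is a minimal PyTorch/torchvision illustration of those building blocks, not the authors' implementation: torchvision's RandomErasing is used here as a stand-in for Cutout, and all channel sizes and probabilities are assumed for illustration only.

```python
# Minimal sketch (assumed, not the paper's code): a CIFAR10-style augmentation
# pipeline with random horizontal flip and a Cutout-like occlusion, plus a
# depth-wise separable convolution block contrasted with a plain 3x3 block.
import torch
import torch.nn as nn
from torchvision import transforms

# Augmentation pipeline: RandomErasing approximates Cutout by blanking a
# random rectangular patch after the image has been converted to a tensor.
train_transforms = transforms.Compose([
    transforms.RandomHorizontalFlip(p=0.5),  # flip left-right with 50% chance
    transforms.ToTensor(),
    transforms.RandomErasing(p=0.5, scale=(0.02, 0.1), value=0),  # Cutout-style patch
])

class DepthwiseSeparableConv(nn.Module):
    """3x3 depth-wise convolution followed by a 1x1 point-wise convolution."""
    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, kernel_size=3, padding=1,
                                   groups=in_ch, bias=False)  # one filter per channel
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1, bias=False)  # mix channels
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(self.bn(self.pointwise(self.depthwise(x))))

# A standard 3x3 block with the same input/output channels, for comparison:
# it requires roughly kernel_size^2 more multiply-accumulates per output channel.
standard_3x3 = nn.Sequential(
    nn.Conv2d(32, 64, kernel_size=3, padding=1, bias=False),
    nn.BatchNorm2d(64),
    nn.ReLU(inplace=True),
)

if __name__ == "__main__":
    x = torch.randn(8, 32, 32, 32)  # batch of CIFAR10-sized feature maps
    print(DepthwiseSeparableConv(32, 64)(x).shape)  # torch.Size([8, 64, 32, 32])
    print(standard_3x3(x).shape)                    # torch.Size([8, 64, 32, 32])
```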

Citation (APA)

Ethiraj, S., & Bolla, B. K. (2022). Augmentations: An Insight into Their Effectiveness on Convolution Neural Networks. In Communications in Computer and Information Science (Vol. 1613 CCIS, pp. 309–322). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-12638-3_26
