A non-polynomial, non-sigmoidal, bounded and symmetric activation function for feed-forward artificial neural networks

Abstract

Feed-forward artificial neural networks are universal approximators of continuous functions, a property that enables these networks to solve learning tasks by casting them as function approximation problems. The universal approximation results for these networks require at least one hidden layer with non-linear nodes, and further require that the non-linearity be non-polynomial in nature. In this paper, a non-polynomial, non-sigmoidal non-linear function is proposed as a suitable activation function for these networks. The usefulness of the proposed activation function is shown on 12 function approximation tasks. The obtained results demonstrate that the proposed activation function outperforms the logistic (log-sigmoid) and hyperbolic tangent activation functions.
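The abstract does not reproduce the proposed function itself, so the sketch below does not show the authors' activation. It only illustrates, with a hypothetical candidate f(x) = x·exp(−x²), what the stated properties mean in contrast to the logistic and hyperbolic tangent baselines: bounded, symmetric (odd), non-polynomial, and non-sigmoidal (non-monotone, hence not S-shaped).

```python
import numpy as np

def logistic(x):
    # Log-sigmoid baseline: bounded in (0, 1), monotone increasing (sigmoidal).
    return 1.0 / (1.0 + np.exp(-x))

def tanh_act(x):
    # Hyperbolic tangent baseline: bounded in (-1, 1), odd, sigmoidal.
    return np.tanh(x)

def bounded_nonsigmoidal(x):
    # HYPOTHETICAL example, not the paper's function: x * exp(-x^2).
    # Bounded (|f| <= exp(-1/2)/sqrt(2) ~ 0.43), odd-symmetric,
    # non-polynomial, and non-monotone -- so it is not sigmoidal.
    return x * np.exp(-x ** 2)

if __name__ == "__main__":
    x = np.linspace(-5.0, 5.0, 1001)
    f = bounded_nonsigmoidal(x)
    print("max |f| =", np.abs(f).max())          # bounded well below 1
    print("odd?    ", np.allclose(f, -bounded_nonsigmoidal(-x)))
```

Because such a function is continuous, bounded, and non-polynomial, it still satisfies the hypotheses of the universal approximation results mentioned in the abstract, even though it is not sigmoid-shaped.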

Citation (APA)

Sood, A., Chandra, P., & Ghose, U. (2019). A non-polynomial, non-sigmoidal, bounded and symmetric activation function for feed – Forward artificial neural networks. International Journal of Innovative Technology and Exploring Engineering, 8(12), 405–410. https://doi.org/10.35940/ijitee.L3313.1081219
