A comparative analysis of activation functions in neural networks: unveiling categories
Sara Bouraya, Abdessamad Belangour
Abstract
Activation functions (AFs) play a critical role in artificial neural networks, enabling the modeling of complex, non-linear relationships in data. In this comparative review, we survey and compare the AFs most commonly used in deep learning and artificial neural networks, including the sigmoid, tanh, the rectified linear unit (ReLU) and its variants, the exponential linear unit (ELU), and softmax. Our aim is to give insight into the strengths and weaknesses of each AF and to offer guidance on selecting an appropriate AF for different types of problems. For each activation category, we discuss its properties, mathematical formulation (MF), and its benefits and drawbacks in terms of its ability to model complex, non-linear relationships in data. In conclusion, this comparative study provides a comprehensive overview of the properties and performance of different AFs and serves as a valuable resource for researchers and practitioners in deep learning and artificial neural networks.
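For reference, the following is a minimal NumPy sketch of the conventional definitions of the activation functions named in the abstract; the ELU scale parameter alpha shown here is the common default of 1.0 and is an assumption, not a value taken from this study.

```python
import numpy as np

def sigmoid(x):
    # Squashes inputs to (0, 1); saturates (and gradients vanish) for large |x|.
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Zero-centred counterpart of the sigmoid, with outputs in (-1, 1).
    return np.tanh(x)

def relu(x):
    # Rectified linear unit: identity for positive inputs, zero otherwise.
    return np.maximum(0.0, x)

def elu(x, alpha=1.0):
    # Exponential linear unit: smooth negative saturation at -alpha
    # (alpha = 1.0 is the usual default, assumed here).
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def softmax(x):
    # Normalises a vector of logits into a probability distribution.
    z = np.exp(x - np.max(x))  # subtract the max for numerical stability
    return z / np.sum(z)
```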
Keywords
Activation function; Artificial neural networks; Deep learning; Neural networks; ReLU; Sigmoid; Tanh
DOI:
https://doi.org/10.11591/eei.v13i5.7274
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
<div class="statcounter"><a title="hit counter" href="http://statcounter.com/free-hit-counter/" target="_blank"><img class="statcounter" src="http://c.statcounter.com/10241695/0/5a758c6a/0/" alt="hit counter"></a></div>
Bulletin of EEI Stats
Bulletin of Electrical Engineering and Informatics (BEEI), ISSN: 2089-3191, e-ISSN: 2302-9285. This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).