Search Results - "activation function"

  • Showing 1 - 18 results of 18
  1. THE SOLUTION OF THE APPROXIMATION PROBLEM OF NONLINEAR DEPENDANCES USING ARTIFICIAL NEURAL NETWORKS by V. N. Ageyev

    Published 2018-04-01

    The paper discusses issues connected with the use of an artificial neural network (ANN) to approximate the experimental data. One of the problems in the development of the ANN is the choice of an appropriate activation function for neurons of the hidden layer and adjusting the parameters of the func...

    Article
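
    The setup sketched in this entry, a single hidden layer whose activation function and its parameters must be chosen to approximate experimental data, can be illustrated as follows. This is a minimal NumPy sketch, not the author's model; the logistic slope parameter a, the layer width, and the random weights are assumptions for illustration only.

        import numpy as np

        def logistic(z, a=1.0):
            # Parameterized logistic activation; the slope `a` is adjusted together
            # with the weights when the network is fitted to data.
            return 1.0 / (1.0 + np.exp(-a * z))

        rng = np.random.default_rng(0)
        n_hidden = 10
        w1, b1 = rng.normal(size=n_hidden), rng.normal(size=n_hidden)   # hidden layer
        w2, b2 = rng.normal(size=n_hidden), 0.0                         # linear output

        def ann(x, a=1.0):
            # y(x) = w2 . logistic(a * (w1 * x + b1)) + b2 for scalar inputs x of shape (n,).
            h = logistic(np.outer(x, w1) + b1, a)    # (n, n_hidden)
            return h @ w2 + b2

        x = np.linspace(-1.0, 1.0, 5)
        print(ann(x))   # untrained outputs; fitting adjusts w1, b1, w2, b2 and a
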
  2. Influence of the Neural Network Hyperparameters on its Numerical Conditioning by S. V. Sholtanyuk

    Published 2020-04-01

    In this paper, the task of assessing the numerical conditioning of a multilayer perceptron that forecasts time series with the sliding window method is considered. The performance of the forecasting perceptron with various hyperparameter sets, different numbers of neurons, and various activation func...

    Article
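
    As a minimal illustration of the sliding window method this entry refers to (not the paper's actual configuration), each training sample can be taken as a window of w consecutive observations with the next value as the target; the window length and the toy sine series below are assumptions.

        import numpy as np

        def sliding_window(series, w):
            # Build (X, y) pairs: X[i] = series[i:i+w], y[i] = series[i+w].
            X = np.array([series[i:i + w] for i in range(len(series) - w)])
            y = series[w:]
            return X, y

        t = np.arange(100)
        series = np.sin(0.2 * t)        # toy time series
        X, y = sliding_window(series, w=5)
        print(X.shape, y.shape)         # (95, 5) (95,) -> inputs for a forecasting perceptron
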
  3. Optimization of CNN Activation Functions using Xception for South Sulawesi Batik Classification by Aswan Aswan, Eva Yulia Puspaningrum, Billy Eden William Asrul

    Published 2025-09-01

    Batik motifs from South Sulawesi such as the Pinisi boat, Lontara script, Tongkonan house and Toraja combinations embody rich cultural narratives but are difficult to identify automatically. Automatic classification supports cultural preservation and education and empowers tourism and digital herita...

    Subjects: convolutional neural network, activation function, xception, south sulawesi batik, classification
    Article
  4. Rectified Tangent Activation (RTA): A Novel Activation Function for Enhanced Deep Learning Performance by Gaurav Kumar Pandey, Sumit Srivastava

    Published 2025-01-01

    In deep learning, activation functions (AFs) influence a model’s performance, convergence rate, and generalization capability. Conventional activation functions such as ReLU, Swish, ELU, and Tanh have been widely utilized, each offering distinct advantages but also exhibiting intrinsic dr...

    Article
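
    For reference, the conventional activation functions named in this entry (ReLU, Swish, ELU, and Tanh) can be written out in a few lines of NumPy. The proposed RTA function itself is defined in the paper and is not reproduced here.

        import numpy as np

        def relu(x):
            return np.maximum(0.0, x)

        def swish(x, beta=1.0):
            # Swish / SiLU: x * sigmoid(beta * x)
            return x / (1.0 + np.exp(-beta * x))

        def elu(x, alpha=1.0):
            return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

        def tanh(x):
            return np.tanh(x)

        x = np.linspace(-3, 3, 7)
        for f in (relu, swish, elu, tanh):
            print(f.__name__, np.round(f(x), 3))
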
  5. Skew Logistic Distribution Applied as Activation Function in Artificial Neural Networks by Eder Silva Dos Santos, Altemir da Silva Braga, Ana Beatriz Alvarez, Thuanne Paixao

    Published 2025-01-01

    In recent years, Artificial Neural Networks (ANNs) have stood out among machine learning algorithms in many applications, such as image and video pattern recognition. Activation functions play a crucial role in the operation of these algorithms, directly influencing the effectiveness of ANNs. The lo...

    Subjects: Activation function
    Article
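
    The logistic function underlying this entry is the classic sigmoid activation. The sketch below shows it alongside an illustrative skewed variant; the exponent form is only an assumption for illustration, not the skew logistic activation actually proposed in the paper.

        import numpy as np

        def sigmoid(x):
            # Logistic CDF, the classic sigmoid activation.
            return 1.0 / (1.0 + np.exp(-x))

        def skewed_sigmoid(x, lam=2.0):
            # Hypothetical skewed variant: raising the logistic CDF to a power `lam`
            # breaks the symmetry around x = 0.
            return sigmoid(x) ** lam

        x = np.linspace(-4, 4, 9)
        print(np.round(sigmoid(x), 3))
        print(np.round(skewed_sigmoid(x), 3))
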
  6. From Sigmoid to SoftProb: A novel output activation function for multi-label learning by Khudran M. Alzhrani

    Published 2025-10-01

    Multi-label classification is a crucial machine learning task that assigns multiple labels to a single instance, making it distinct from traditional single-label classification. The sigmoid activation function, commonly used in multi-label learning, suffers from saturation and vanishing gradient iss...

    Article
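
    The saturation issue mentioned in this entry can be seen directly from the sigmoid and its derivative: for large-magnitude logits the gradient s(z)(1 - s(z)) collapses toward zero. The sketch below illustrates this for a multi-label output layer with one logit per label; the SoftProb alternative is defined in the paper and is not reproduced here.

        import numpy as np

        def sigmoid(z):
            return 1.0 / (1.0 + np.exp(-z))

        logits = np.array([-8.0, -2.0, 0.0, 2.0, 8.0])   # one logit per label
        probs = sigmoid(logits)                          # independent per-label probabilities
        grad = probs * (1.0 - probs)                     # d sigmoid / dz
        print(np.round(probs, 4))   # [0.0003 0.1192 0.5    0.8808 0.9997]
        print(np.round(grad, 4))    # [0.0003 0.105  0.25   0.105  0.0003] -> saturates at the tails
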
  7. mTanh: A Low-Cost Inkjet-Printed Vanishing Gradient Tolerant Activation Function by Shahrin Akter, Mohammad Rafiqul Haider

    Published 2025-05-01

    Inkjet-printed circuits on flexible substrates are rapidly emerging as a key technology in flexible electronics, driven by their minimal fabrication process, cost-effectiveness, and environmental sustainability. Recent advancements in inkjet-printed devices and circuits have broadened their applicat...

    Subjects: activation function
    Article
  8. Modified generative adversarial network and Pseudo-Zernike matrix features extraction for human-computer interactive gesture recognition by Jing Yu, Lu Zhao

    Published 2025-06-01

    Gesture recognition aims to understand dynamic gestures of the human body and is one of the most important interaction methods in the field of human-computer interaction. To address the fact that existing gesture recognition algorithms require large amounts of training data, this pap...

    Article
  9. Laser-Induced Breakdown Spectroscopy Quantitative Analysis Using a Bayesian Optimization-Based Tunable Softplus Backpropagation Neural Network by Xuesen Xu, Shijia Luo, Xuchen Zhang, Weiming Xu, Rong Shu, Jianyu Wang, Xiangfeng Liu, Ping Li, Changheng Li, Luning Li

    Published 2025-07-01

    Laser-induced breakdown spectroscopy (LIBS) has played a critical role in Mars exploration missions, substantially contributing to the geochemical analysis of Martian surface substances. However, the complex nonlinearity of LIBS processes can considerably limit the quantification accuracy of convent...

    Subjects: tunable Softplus activation function
    Article
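
    A common way to make Softplus tunable is to add a shape parameter beta, which is the kind of quantity a Bayesian optimizer could adjust; the exact parameterization used in the paper may differ, so the sketch below is illustrative only.

        import numpy as np

        def tunable_softplus(x, beta=1.0):
            # softplus_beta(x) = ln(1 + exp(beta * x)) / beta;
            # np.logaddexp(0, z) computes ln(1 + exp(z)) in a numerically stable way.
            return np.logaddexp(0.0, beta * x) / beta

        x = np.linspace(-3, 3, 7)
        for beta in (0.5, 1.0, 5.0):
            # Larger beta pushes the curve toward ReLU; smaller beta makes it smoother.
            print(beta, np.round(tunable_softplus(x, beta), 3))
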
  10. Implementation of a neural network model in the Statistica 12 for mudflow frequency forecasting by B. A. Ashabokov, A. A. Tashilova, L. A. Kesheva, N. V. Teunova

    Published 2025-04-01

    The article describes some principles of operation of an artificial neural network. It provides an example of implementing a neural network model by selecting its best architecture using the Statistica 12 software package. The article considers a method for neural network forecasting of a series of...

    Article
  11. EXPERIMENTAL RESEARCH AND ANALYSIS OF INFLUENCE OF PRINCIPAL PARAMETERS CONVOLUTIONAL NEURAL NETWORKS ON THE QUALITY OF THEIR TRAINING by Roman Mikhaylovich Nemkov, Oksana Stanislavovna Mezentseva

    Published 2022-05-01

    The aim is to analyze the influence of the activation function, the initialization strategy, and the kind of normalization on the quality of training and generalization.

    Article
  12. Further Analysis on Fixed/Preassigned-Time Projective Synchronization of Discontinuous Fuzzy Delayed Inertial Neural Networks by Xiao Zhou, Jing Han, Yan Li, Guodong Zhang

    Published 2025-01-01

    Our goal is to address the problems of preassigned-time projective synchronization and fixed-time projective synchronization of the system studied in our article; fuzzy terms, mixed delays, inertial terms, and discontinuous activation functions are also included in this system. And the result...

    Article
  13. The Use of Neural Networks in Distance Education Technologies for the Identification of Students by O. A. Kozlova, A. A. Protasova

    Published 2021-07-01

    Purpose of the research. The purpose of this research is to study the features of training technologies of modern artificial neural networks for carrying out the procedure of unambiguous authentication of students against a pre-formed reference base of digital biometric characte...

    Article
  14. Hypergeometric Functions as Activation Functions: The Particular Case of Bessel-Type Functions by Nelson Vieira, Felipe Freitas, Roberto Figueiredo, Petia Georgieva

    Published 2025-07-01

    The choice of activation functions in neural networks (NNs) is of paramount importance for the training process and the performance of NNs. Therefore, the machine learning community has directed its attention to the development of computationally efficient activation functions. In this paper we i...

    Subjects: activation functions
    Article
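
    As an illustration of the mechanics only (the particular Bessel-type activation studied in the paper is defined there), a Bessel function can be applied elementwise to a layer's pre-activations just like any other activation:

        import numpy as np
        from scipy.special import j0

        def bessel_activation(z):
            # Elementwise Bessel function of the first kind J0(z);
            # bounded and oscillatory, unlike ReLU-style activations.
            return j0(z)

        z = np.linspace(-5, 5, 11)
        print(np.round(bessel_activation(z), 3))
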
  15. On the Synergy of Optimizers and Activation Functions: A CNN Benchmarking Study by Khuraman Aziz Sayın, Necla Kırcalı Gürsoy, Türkay Yolcu, Arif Gürsoy

    Published 2025-06-01

    In this study, we present a comparative analysis of gradient descent-based optimizers frequently used in Convolutional Neural Networks (CNNs), including SGD, mSGD, RMSprop, Adadelta, Nadam, Adamax, Adam, and the recent EVE optimizer. To explore the interaction between optimization strategies and act...

    Article
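
    The benchmarking design described in this entry, pairing gradient descent-based optimizers with different activation functions in a CNN, can be sketched as a simple grid loop. The sketch below uses PyTorch with a dummy batch and an MNIST-like input shape as assumptions; the EVE optimizer is not part of torch.optim and is therefore omitted, and this is not the authors' benchmark code.

        import itertools
        import torch
        import torch.nn as nn

        def make_cnn(act_cls):
            # Tiny CNN for 1x28x28 inputs and 10 classes; act_cls is an activation class.
            return nn.Sequential(
                nn.Conv2d(1, 8, kernel_size=3, padding=1), act_cls(),
                nn.MaxPool2d(2),
                nn.Conv2d(8, 16, kernel_size=3, padding=1), act_cls(),
                nn.MaxPool2d(2),
                nn.Flatten(),
                nn.Linear(16 * 7 * 7, 10),
            )

        optimizers = {
            "SGD":      lambda p: torch.optim.SGD(p, lr=0.01),
            "mSGD":     lambda p: torch.optim.SGD(p, lr=0.01, momentum=0.9),
            "RMSprop":  lambda p: torch.optim.RMSprop(p, lr=0.001),
            "Adadelta": lambda p: torch.optim.Adadelta(p),
            "Adam":     lambda p: torch.optim.Adam(p, lr=0.001),
            "Adamax":   lambda p: torch.optim.Adamax(p, lr=0.002),
            "Nadam":    lambda p: torch.optim.NAdam(p, lr=0.001),
        }
        activations = {"ReLU": nn.ReLU, "Tanh": nn.Tanh, "ELU": nn.ELU, "Swish": nn.SiLU}

        x = torch.randn(32, 1, 28, 28)          # dummy batch standing in for real images
        y = torch.randint(0, 10, (32,))
        loss_fn = nn.CrossEntropyLoss()

        for (opt_name, make_opt), (act_name, act_cls) in itertools.product(
                optimizers.items(), activations.items()):
            model = make_cnn(act_cls)
            opt = make_opt(model.parameters())
            loss = loss_fn(model(x), y)         # loss before the update step
            opt.zero_grad()
            loss.backward()
            opt.step()
            print(f"{opt_name:>8} x {act_name:<5} initial loss: {loss.item():.3f}")
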
  16. Models of efficiency of functioning in trading enterprises under conditions of economic growth by Ilyash Olha, Vasyltsiv Taras, Lupak Ruslan, Get’manskiy Volodymyr

    Published 2021-03-01

    The socio-economic situation in Ukraine suggests that there is insufficient research into the applicability of the model of economic development in forecasting the economic environment in which trade enterprises function. Researchers into issues relating to the efficiency of enterprises’ functioning...

    Article
  17. Study on mechanism of influence of low temperature freeze-thaw cycle on spontaneous combustion characteristics of coal gangue by Xun Zhang, Ning Ma, Bing Lu, Fengwei Dai, Ge Huang, Huimin Liang, Chen Yu

    Published 2025-09-01

    In order to investigate the influence of the number of freeze-thaw cycles on the spontaneous combustion characteristics of coal gangue, this study simulated the alpine environment and examined the spontaneous combustion characteristics of coal gangue after undergoing multiple freeze-th...

    Article
  18. AUTOMATION OF TENSOR CALCULATIONS BASED ON NEURAL NETWORKS by A. N. Makokha, T. E. Tyshlyar

    Published 2022-08-01

    Tensor calculus is a necessary tool in various natural sciences.

    Article