Rectified Tangent Activation (RTA): A Novel Activation Function for Enhanced Deep Learning Performance

In deep learning, activation functions (AFs) influence a model's performance, convergence rate, and generalization capability. Conventional activation functions such as ReLU, Swish, ELU, and Tanh are widely used, each offering distinct advantages but also exhibiting intrinsic drawbacks...

Bibliographic Details
Main Authors: Gaurav Kumar Pandey, Sumit Srivastava
Format: Article
Language: English
Published: IEEE, 2025-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/11075670/