Parametric Leaky Tanh: A New Hybrid Activation Function for Deep Learning. (arXiv:2310.07720v1 [cs.LG])

Activation functions (AFs) are crucial components of deep neural networks
(DNNs), having a significant impact on their performance. An activation
function in a DNN is typically a smooth, nonlinear function that transforms an
input signal into an output signal for the subsequent layer. In this paper, we
propose the Parametric Leaky Tanh (PLTanh), a novel hybrid activation function
designed to combine the strengths of both the Tanh and Leaky ReLU (LReLU)
activation functions. PLTanh is differentiable at all points and addresses the
‘dying ReLU’ problem by ensuring a non-zero gradient for negative inputs,
consistent with the behavior of LReLU. By integrating the distinct advantages of
these two activation functions, PLTanh facilitates the learning of more
intricate nonlinear relationships within the network. This paper presents an
empirical evaluation of PLTanh against established activation functions, namely
ReLU, LReLU, and ALReLU, on five diverse datasets.
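The abstract does not state PLTanh's exact formula, so the sketch below is only an illustration of the kind of hybrid it describes: tanh on positive inputs blended with an LReLU-style learnable negative slope, arranged so the two branches meet at zero with matching value and slope. The class name HybridLeakyTanh, the blending form, and the initial slope value are assumptions for illustration, not the paper's definition.

```python
import torch
import torch.nn as nn


class HybridLeakyTanh(nn.Module):
    """Hypothetical tanh / leaky-linear hybrid with a learnable slope `alpha`.

    This is NOT the paper's exact PLTanh formula (the abstract does not give
    it); it only illustrates the stated design goals: differentiability at
    all points and a non-zero gradient for negative inputs.
    """

    def __init__(self, alpha_init: float = 0.1):
        super().__init__()
        # Learnable negative-side slope, initialised like LReLU's default leak.
        self.alpha = nn.Parameter(torch.tensor(alpha_init))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Positive inputs: plain tanh.
        # Negative inputs: a blend of a leaky linear term and tanh. Both
        # branches equal 0 with slope 1 at x = 0, so the function stays
        # differentiable there, and the gradient tends to alpha (non-zero)
        # for large negative inputs.
        negative = self.alpha * x + (1.0 - self.alpha) * torch.tanh(x)
        return torch.where(x >= 0, torch.tanh(x), negative)
```

In use, such a module would simply replace a ReLU/LReLU layer, e.g. nn.Sequential(nn.Linear(128, 64), HybridLeakyTanh(), nn.Linear(64, 10)), with alpha trained jointly with the network weights.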


