· Activation functions are also called Transfer Functions.
· An activation function is a function applied to the net input of a node to produce its output, which is then fed to the next layer.
· Various types of activation functions are used in neural networks. They are as follows:
Binary step function:
· Single-layer networks often use a step function to convert the net input into an output that is a binary (1 or 0) or bipolar (1 or -1) signal.
· This binary step function is also called the Threshold function or Heaviside function. It is shown in the figure below.
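In addition to the figure, here is a minimal NumPy sketch of the binary step and its bipolar variant; the default threshold of 0 and the function names are illustrative assumptions, not part of the original text.

```python
import numpy as np

def binary_step(net, threshold=0.0):
    # Binary (Heaviside) step: output 1 if the net input reaches the
    # threshold, otherwise 0.
    return np.where(net >= threshold, 1, 0)

def bipolar_step(net, threshold=0.0):
    # Bipolar step: output 1 if the net input reaches the threshold,
    # otherwise -1.
    return np.where(net >= threshold, 1, -1)

net_inputs = np.array([-0.5, 0.0, 2.3])
print(binary_step(net_inputs))   # -> [0 1 1]
print(bipolar_step(net_inputs))  # -> [-1  1  1]
```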
Binary sigmoid function:
· Sigmoid functions (S-shaped curves) are shown in the figure below.
· The logistic function and the hyperbolic tangent function are the most common ones.
· They are especially advantageous for use in neural networks trained by backpropagation, because their derivatives can be computed easily from the function values themselves.
· The logistic function is a sigmoid function with range from 0 to 1.
· It is often used as the activation function for neural networks in which the desired output values are either binary or lie in the interval between 0 and 1.
· To emphasize the range of the function, we call it the Binary sigmoid.
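To make this concrete, below is a minimal sketch of the binary (logistic) sigmoid together with its derivative, whose simple form is what makes it convenient for backpropagation; the steepness parameter is an illustrative assumption, not something stated above.

```python
import numpy as np

def binary_sigmoid(net, steepness=1.0):
    # Logistic (binary) sigmoid: squashes any real net input into (0, 1).
    return 1.0 / (1.0 + np.exp(-steepness * net))

def binary_sigmoid_derivative(net, steepness=1.0):
    # The derivative can be written in terms of the function value itself:
    # f'(x) = steepness * f(x) * (1 - f(x)), which is cheap to evaluate
    # during backpropagation.
    f = binary_sigmoid(net, steepness)
    return steepness * f * (1.0 - f)

x = np.array([-2.0, 0.0, 2.0])
print(binary_sigmoid(x))             # -> approx [0.119 0.5   0.881]
print(binary_sigmoid_derivative(x))  # -> approx [0.105 0.25  0.105]
```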
Bipolar sigmoid function:
· The logistic sigmoid function can be scaled to have any range of values that is appropriate for a given problem.
· The most common range is -1 to 1.
· We call it the Bipolar sigmoid.
· It is shown in the figure below.
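A minimal sketch of the bipolar sigmoid follows, obtained by rescaling the logistic sigmoid from (0, 1) to (-1, 1); the comparison with tanh(x/2) uses a standard identity and is included only for illustration.

```python
import numpy as np

def bipolar_sigmoid(net, steepness=1.0):
    # Bipolar sigmoid: the logistic sigmoid rescaled from (0, 1) to (-1, 1).
    logistic = 1.0 / (1.0 + np.exp(-steepness * net))
    return 2.0 * logistic - 1.0

x = np.array([-2.0, 0.0, 2.0])
print(bipolar_sigmoid(x))  # -> approx [-0.762  0.     0.762]
print(np.tanh(x / 2.0))    # -> same values: the bipolar sigmoid equals tanh(x/2)
```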