Neural networks derive much of their power from the use of non-linear activation functions,
in contrast to the purely linear functions of traditional regression models.
In most neural networks, the inputs to a node are weighted and summed,
and a non-linear activation function is then applied to the total.
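The weighted-sum-then-activation computation can be sketched as follows. This is a minimal illustration, not taken from any particular library; the logistic sigmoid is chosen here simply as one common activation:

```python
import math

def logistic(z):
    """Logistic sigmoid: monotone increasing, maps the reals to (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def node_output(inputs, weights, bias):
    """Weight and sum the inputs, then apply a non-linear activation."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return logistic(total)

# A node with zero weights and zero bias sits at the sigmoid's midpoint.
print(node_output([1.0, 2.0], [0.0, 0.0], 0.0))  # 0.5
```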
These activation functions are often sigmoidal (monotone increasing) functions,
such as the logistic function, although non-monotone alternatives such as the
Gaussian are also used. Output nodes, however, should have activation functions
suited to the distribution of the output variables.
In statistical generalized linear models, activation functions are closely
related to link functions and have been studied extensively in that context.
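The connection to link functions can be made concrete: in a binomial GLM the logit link maps a probability to the real line, and its inverse is exactly the logistic activation. A small sketch of this correspondence:

```python
import math

def logit(p):
    """Logit link from a binomial GLM: maps (0, 1) to the reals."""
    return math.log(p / (1.0 - p))

def logistic(z):
    """Inverse of the logit link -- identical to the logistic activation."""
    return 1.0 / (1.0 + math.exp(-z))

# Round trip: the activation function undoes the link function.
p = 0.8
print(logistic(logit(p)))  # recovers 0.8 (up to floating-point error)
```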
See Also:
Softmax.
~ Jai Krishna Ponnappan