AI Terms Glossary - Activation Functions


Neural networks derive much of their power from the use of non-linear activation functions, in contrast to the purely linear functions of traditional regression models.

In most neural networks, the inputs to a node are weighted and then summed.

After that, a non-linear activation function is applied to the total.
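The computation described above can be sketched in a few lines of Python. This is a minimal illustration, not a library implementation: the weights, bias, and choice of the logistic function here are assumptions made for the example.

```python
import math

def logistic(x):
    # Logistic (sigmoid) activation: maps any real number into (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def node_output(inputs, weights, bias):
    # Step 1: weight each input and sum the results (plus a bias term).
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    # Step 2: apply a non-linear activation function to the total.
    return logistic(total)

# Illustrative values: two inputs, arbitrary weights and bias.
y = node_output([1.0, 2.0], [0.5, -0.25], 0.1)
```

Without the non-linear step, stacking such nodes would collapse to a single linear transformation, which is why the activation function matters.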

Hidden-node activation functions are often sigmoidal (monotonically increasing) functions such as the logistic function, although non-monotone functions such as the Gaussian are also used; output nodes, by contrast, should use activation functions suited to the distribution of the output variables.

In statistics, activation functions are closely related to the link functions of generalized linear models and have been extensively studied in that context.

~ Jai Krishna Ponnappan
