From the course: Training Neural Networks in Python


Activation functions


- [Instructor] We are almost there, but our neuron is still missing something, so let me tell you what's wrong with weighted sums. There are two inconveniences I'd like to mention. First, the values aren't constrained, so a sum may sometimes result in a very large or a very small value. Second, a weighted sum is a linear function, so the threshold to fire is not well defined; that is, the transition between true and false is not very noticeable. And most importantly, a linear function is not easily trained. It turns out that the functions that make learning easier are non-linear, and this is the real reason to add an element to our neuron. So what's wrong with having a very large or a very small value? Consider this example, where we have a two-input neuron and we are feeding 1000 to X0 and 2 to X1. For now, let's leave the bias weight at zero, so the bias is not shown, to keep the diagram simple. If we run the neuron, we'll have a result of…
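A minimal Python sketch may make both points concrete. The transcript only gives the inputs (1000 and 2) and a zero bias; the weights below are hypothetical, chosen just to show a raw weighted sum producing a large, unconstrained value, and a non-linear activation (here the logistic sigmoid, one common choice) squashing it into a bounded range:

```python
import math

def weighted_sum(inputs, weights, bias=0.0):
    """Plain weighted sum: linear and unconstrained."""
    return sum(x * w for x, w in zip(inputs, weights)) + bias

def sigmoid(z):
    """Logistic activation: maps any real value into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

inputs = [1000.0, 2.0]   # X0 and X1 from the example
weights = [0.5, 0.5]     # hypothetical weights, not from the transcript

z = weighted_sum(inputs, weights)  # bias left at zero, as in the example
print(z)           # 501.0 -- a very large, unconstrained value
print(sigmoid(z))  # ~1.0  -- bounded output after the activation
```

The sigmoid also addresses the second complaint: it is non-linear and smoothly differentiable, which is what makes gradient-based training practical.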
