Loss Functions in Neural Networks

 The loss function helps us understand how the weights and biases should get adjusted. What we do is look at the network's output, compare it to the actual (expected) output, and adjust the weights accordingly.


How do we adjust that?

We use what's known as a loss function. A loss function is essentially a way of calculating an error.

Now there are a ton of different loss functions; one common example is mean squared error (MSE).
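To make that concrete, here is a minimal sketch of mean squared error in plain Python (the function name is just for illustration; libraries like PyTorch and Keras ship their own versions):

```python
def mean_squared_error(predictions, targets):
    """Average of the squared differences between predictions and targets."""
    return sum((p - t) ** 2 for p, t in zip(predictions, targets)) / len(predictions)

# A single prediction of 0.79 against a target of 1 gives 0.21^2 = 0.0441.
print(mean_squared_error([0.79], [1.0]))
```

Squaring the differences keeps every term positive and, as we will see below, punishes big mistakes much harder than small ones.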



Ultimately, what the loss function does is tell us how wrong our answer is.


Maybe our answer is 0.79 and the actual answer is 1. That is pretty close to 1, but a plain difference only tells us we are off by 0.21. Now, what if we get 0.85 instead? That is significantly better than 0.79, yet the plain difference just says we are off by 0.15. So we would still make a significant adjustment to the weights and biases, even though we are nearly right.

So, what we need to do is apply a loss function that gives us a better sense of the degree to which we are wrong or right.

And again, these loss functions are non-linear functions. That means we are adding a higher degree of complexity to our model, which allows us to create a far more complex model that can solve harder problems.
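As one more example of a non-linear loss (not mentioned in the text above, where only MSE was named), here is a sketch of binary cross-entropy, which grows very steeply as the prediction moves away from a 0/1 target:

```python
import math

def binary_cross_entropy(prediction, target):
    """Non-linear loss for a single 0-or-1 target."""
    eps = 1e-12  # clamp to avoid log(0)
    prediction = min(max(prediction, eps), 1 - eps)
    return -(target * math.log(prediction) + (1 - target) * math.log(1 - prediction))

# The loss shrinks faster and faster as we approach the correct answer of 1.
for p in (0.5, 0.79, 0.85, 0.99):
    print(f"{p}: {binary_cross_entropy(p, 1.0):.4f}")
```

Because the curve is so steep near a confidently wrong answer, this loss produces much larger corrections when the network is badly mistaken than MSE would.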


