Neural Networks: The Backpropagation algorithm in a picture

Here I present the backpropagation algorithm for a continuous target variable and no activation function in the hidden layer: although simpler than the version used with the logistic cost function, it is a fruitful field for math lovers.
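The setup described above (continuous target, identity activation in the hidden layer, squared-error cost) can be sketched in code. This is a minimal illustrative implementation, not the exact derivation from the article's picture; the toy data, layer sizes, and learning rate are all assumptions made for the example.

```python
import numpy as np

# Network: x -> hidden h = x @ W1 + b1 (identity activation) -> y_hat = h @ W2 + b2
# Cost: mean squared error, as for a continuous target variable.
rng = np.random.default_rng(0)
X = rng.normal(size=(32, 3))                        # toy inputs (assumed)
y = X @ np.array([[1.0], [-2.0], [0.5]]) + 0.3      # toy continuous target (assumed)

W1 = rng.normal(scale=0.1, size=(3, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=0.1, size=(4, 1)); b2 = np.zeros(1)
lr = 0.05

for _ in range(500):
    # forward pass: with no activation function, h is just the affine map
    h = X @ W1 + b1
    y_hat = h @ W2 + b2

    # backward pass: derivative of the MSE cost with respect to y_hat
    n = len(X)
    d_yhat = 2.0 * (y_hat - y) / n
    dW2 = h.T @ d_yhat; db2 = d_yhat.sum(axis=0)
    d_h = d_yhat @ W2.T                  # chain rule back through the output layer
    dW1 = X.T @ d_h;   db1 = d_h.sum(axis=0)

    # gradient-descent update of the weights
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

mse = float(np.mean(((X @ W1 + b1) @ W2 + b2 - y) ** 2))
```

Because both layers are linear, the composed network is itself linear in `X`, which is what makes this case so much simpler to differentiate than the logistic version.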


Tags: backpropagation, cost, deep, derivatives, function, learning, networks, neural, picture, rubens, weights, zimbres



Comment by Eduardo Vinas on April 25, 2018 at 4:29am

In the "Before" calculation, in the last steps, g'(x) = g'(f(3))·10 — why f(3)? We said that x = 2, so could it be the following instead?
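The question above concerns where the composed derivative is evaluated. By the chain rule, d/dx g(f(x)) = g'(f(x)) · f'(x): the outer derivative g' is evaluated at f(x), not at x itself. A small numerical check with hypothetical f and g (the article's own functions are in the picture and not reproduced here):

```python
# Hypothetical inner and outer functions, chosen only to illustrate the chain rule.
def f(x):
    return 3 * x          # inner function, f'(x) = 3

def g(u):
    return u ** 2         # outer function, g'(u) = 2 * u

def dg_df(x):
    # chain rule: g'(f(x)) * f'(x) -- note g' takes f(x) as its argument
    return 2 * f(x) * 3

x = 2.0
h = 1e-6
# central-difference estimate of d/dx g(f(x)) at x = 2
numeric = (g(f(x + h)) - g(f(x - h))) / (2 * h)
analytic = dg_df(x)       # 2 * f(2) * 3 = 2 * 6 * 3 = 36
```

So with x = 2 the outer derivative is evaluated at f(2), which is why an f(·) term (rather than the raw x) appears inside g' in the article's calculation.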


Comment by Emanuel Woiski on December 30, 2017 at 12:41am

Thanks for sharing. Cheers.

Comment by David Barnes Peatling on October 10, 2017 at 8:26pm

I feel like this lends a more understandable description to some of my old analysis shop's best practices (and I'm excited to translate this for mental health purposes). Thanks for this generally mind-calming share.
