This post is the outcome of my studies of neural networks and a sketch of how the backpropagation algorithm can be applied. It is a binary classification task with N = 4 cases in a neural network with a single hidden layer. Sigmoid activation functions follow both the hidden layer and the output layer. The matrices are colored to match the corresponding parts of the network structure (bias, input, hidden, output), to make the correspondence easier to follow.
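As a companion to the graphic, here is a minimal NumPy sketch of the setup described above: N = 4 cases pushed through one hidden layer, sigmoid after both the hidden and output layers, and one backpropagation loop. The feature count, hidden-layer size, targets, and squared-error loss are assumptions for illustration; the post's actual values live in its graphic.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# N = 4 cases; 2 input features and 3 hidden units are assumptions
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])  # binary targets (XOR as a stand-in)

W1 = rng.normal(scale=0.5, size=(2, 3)); b1 = np.zeros(3)
W2 = rng.normal(scale=0.5, size=(3, 1)); b2 = np.zeros(1)

lr = 0.5
for step in range(5000):
    # forward pass: sigmoid after the hidden layer and after the output layer
    h = sigmoid(X @ W1 + b1)          # shape (4, 3)
    out = sigmoid(h @ W2 + b2)        # shape (4, 1)
    loss = np.mean((out - y) ** 2)    # squared error (the post doesn't name its loss)
    if step == 0:
        loss0 = loss

    # backward pass: chain rule through sigmoid'(z) = s * (1 - s)
    d_out = (out - y) * out * (1 - out)     # (4, 1)
    d_h = (d_out @ W2.T) * h * (1 - h)      # (4, 3)

    # gradient-descent updates for weights and biases
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(f"loss: {loss0:.3f} -> {loss:.3f}")
```

Each matrix product in the loop corresponds to one colored multiplication in the graphic; the shapes in the comments are the ones to check against it.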


Tags: learning, machine, networks, neural, rubens, zimbres



Comment by Mark Chesney on February 6, 2018 at 10:38am

Great graphic! It really helps visualize many of these complex techniques.

One question: Why is the column of the first (left-hand) matrix colored red, along with the row of the second (right-hand) matrix?  It almost begins to suggest that the elements of the red column are multiplied by the red row, which have a length mismatch (4 elements vs 3).  Perhaps I misunderstood the purpose of the color highlighting?
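One possible reading of the coloring in that question (a sketch, not the author's stated intent): a matrix product can be decomposed as a sum of outer products, where the k-th column of the left matrix pairs with the k-th row of the right matrix, and those two need not have the same length. The shapes below are hypothetical, chosen to echo the 4-element column and 3-element row from the comment.

```python
import numpy as np

# Hypothetical shapes echoing the comment: the left matrix has
# 4-element columns, the right matrix has 3-element rows.
A = np.arange(8.0).reshape(4, 2)    # each column has 4 elements
B = np.arange(6.0).reshape(2, 3)    # each row has 3 elements

# A @ B equals the sum of outer products of matching column/row pairs;
# the column length (4) and row length (3) set the (4, 3) result shape.
outer_sum = sum(np.outer(A[:, k], B[k, :]) for k in range(A.shape[1]))

print(np.allclose(A @ B, outer_sum))   # True
```

Under this reading, a red column multiplied (as an outer product) by a red row is well defined despite the 4-vs-3 length difference.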

Comment by Emanuel Woiski on December 30, 2017 at 12:42am

awesome, thanks.

Comment by Lance Norskog on July 5, 2017 at 6:19pm


This is a feedforward (fully connected) NN. Another structure is the recurrent NN. My intuition, though I haven't worked through it, is that the classic "discovering the weather" Hidden Markov Model example could be implemented with a very small recurrent NN.

This is a description of the Hidden Markov Model example:


Comment by Rohan Kotwani on December 25, 2016 at 5:28am

This is actually a pretty good representation of the matrix multiplications! How did you come up with the Weights? I noticed the logit formula multiplied by the weights, but did the weights change as a result?

© 2021 TechTarget, Inc.