This post is the outcome of my studies of neural networks and a sketch of an application of the backpropagation algorithm. It is a binary classification task with N = 4 cases in a neural network with a single hidden layer. Sigmoid activation functions follow both the hidden layer and the output layer. The matrices use the same colors as the network diagram (bias, input, hidden, output) to make the correspondence easier to follow.
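The setup above can be sketched in plain Python. This is a minimal illustration, not the post's exact numbers: the XOR-style dataset, hidden-layer size, learning rate, and random initialization are all assumptions chosen for the sketch. It shows the structure the post describes: a single hidden layer with bias terms, sigmoid activations after both the hidden and output layers, and backpropagation over N = 4 cases.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical dataset: N = 4 cases, 2 inputs, 1 binary label (XOR pattern).
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 1, 1, 0]

random.seed(0)
n_hidden = 3
# Each weight vector includes a bias term at index 0.
w_hidden = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(n_hidden)]
w_out = [random.uniform(-1, 1) for _ in range(n_hidden + 1)]

def predict(x):
    # Forward pass: hidden layer with sigmoid, then output layer with sigmoid.
    h = [sigmoid(w[0] + w[1] * x[0] + w[2] * x[1]) for w in w_hidden]
    return sigmoid(w_out[0] + sum(w_out[j + 1] * h[j] for j in range(n_hidden)))

def loss():
    # Squared error summed over the four cases.
    return sum((predict(x) - t) ** 2 for x, t in zip(X, y))

loss_before = loss()
lr = 0.5
for epoch in range(5000):
    for x, target in zip(X, y):
        # Forward pass.
        h = [sigmoid(w[0] + w[1] * x[0] + w[2] * x[1]) for w in w_hidden]
        o = sigmoid(w_out[0] + sum(w_out[j + 1] * h[j] for j in range(n_hidden)))
        # Backward pass: sigmoid derivative is s * (1 - s).
        delta_o = (o - target) * o * (1 - o)
        delta_h = [delta_o * w_out[j + 1] * h[j] * (1 - h[j]) for j in range(n_hidden)]
        # Gradient-descent updates for output weights (bias first).
        w_out[0] -= lr * delta_o
        for j in range(n_hidden):
            w_out[j + 1] -= lr * delta_o * h[j]
        # Updates for hidden weights.
        for j in range(n_hidden):
            w_hidden[j][0] -= lr * delta_h[j]
            w_hidden[j][1] -= lr * delta_h[j] * x[0]
            w_hidden[j][2] -= lr * delta_h[j] * x[1]

loss_after = loss()
```

Writing the bias as weight index 0 matches the post's convention of carrying the bias inside the weight matrices; equivalently, each forward step is a matrix-vector product followed by an elementwise sigmoid.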
Comments
Great graphic! Really helps visualize lots of the complex techniques.
One question: why is a column of the first (left-hand) matrix colored red, along with a row of the second (right-hand) matrix? It seems to suggest that the elements of the red column are multiplied by the red row, yet they have a length mismatch (4 elements vs. 3). Perhaps I misunderstood the purpose of the color highlighting?
awesome, thanks.
Nice!
This is a feedforward (fully connected) NN. Another structure is the recurrent NN. My intuition, though I haven't worked through it, is that the classic "discovering the weather" Hidden Markov Model example could be implemented with a very small recurrent NN.
This is a description of the Hidden Markov Model example:
http://www.cs.uml.edu/~grinstei/91.510/HMM/Class%20Sildes%20-%20Hid...
This is actually a pretty good representation of the matrix multiplications! How did you come up with the weights? I noticed the logit formula uses the weights, but do the weights change as a result of training?
© 2018 Data Science Central™