
This post is the outcome of my studies in neural networks and a sketch of an application of the backpropagation algorithm. It is a binary classification task with N = 4 cases in a neural network with a single hidden layer; sigmoid activation functions follow both the hidden layer and the output layer. The matrices are drawn in the same colors as the corresponding parts of the network diagram (bias, input, hidden, output) to make them easier to follow.
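The setup described above can be sketched in NumPy as a minimal forward/backward pass. This is an illustration, not the post's original code: the four training cases, the hidden-layer size, and the learning rate are all assumptions chosen for a self-contained example (XOR is used here only because it is a convenient four-case binary problem).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Hypothetical data: N = 4 cases, 2 features, binary targets.
# XOR is an assumption; the original post's data may differ.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

n_hidden = 3                           # assumed hidden-layer size
W1 = rng.normal(0, 1, (2, n_hidden))   # input -> hidden weights
b1 = np.zeros((1, n_hidden))           # hidden bias
W2 = rng.normal(0, 1, (n_hidden, 1))   # hidden -> output weights
b2 = np.zeros((1, 1))                  # output bias
lr = 1.0                               # assumed learning rate

for epoch in range(5000):
    # Forward pass: sigmoid after both the hidden and the output layer.
    h = sigmoid(X @ W1 + b1)
    y_hat = sigmoid(h @ W2 + b2)

    # Backward pass (squared-error loss; sigmoid'(s) = s * (1 - s)).
    delta_out = (y_hat - y) * y_hat * (1 - y_hat)
    delta_hid = (delta_out @ W2.T) * h * (1 - h)

    # Gradient-descent weight updates.
    W2 -= lr * h.T @ delta_out
    b2 -= lr * delta_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ delta_hid
    b1 -= lr * delta_hid.sum(axis=0, keepdims=True)

print(np.round(y_hat.ravel(), 2))
```

The backward pass mirrors the matrix multiplications in the post's diagram: each delta is the error propagated through a layer's weights, scaled by the sigmoid derivative at that layer.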




Comment by Lance Norskog on July 5, 2017 at 6:19pm


This is a feedforward (fully connected) NN. Another structure is the recurrent NN. My intuition, though I haven't worked it through, is that the classic "discovering the weather" Hidden Markov Model example could be implemented with a very small recurrent NN.

This is a description of the Hidden Markov Model example:

Comment by Rohan Kotwani on December 25, 2016 at 5:28am

This is actually a pretty good representation of the matrix multiplications! How did you come up with the weights? I noticed the logit formula multiplied by the weights, but did the weights change as a result?
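For the question the comment raises, a sketch of how the weights change under gradient descent (assuming a squared-error loss and the sigmoid output $\hat{y}_k$ described in the post): each weight is nudged against the gradient of the error,

$$w_{jk} \leftarrow w_{jk} - \eta \, \frac{\partial E}{\partial w_{jk}}, \qquad \frac{\partial E}{\partial w_{jk}} = (\hat{y}_k - y_k)\,\hat{y}_k\,(1 - \hat{y}_k)\,h_j,$$

where $\eta$ is the learning rate, $h_j$ is the activation feeding the weight, and $\hat{y}_k(1-\hat{y}_k)$ is the sigmoid derivative. So yes: after each backward pass, every weight is updated before the next forward pass.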


© 2017 Data Science Central