Here I present the backpropagation algorithm for a continuous target variable and no activation function in the hidden layer: although simpler than the version used with the logistic cost function, it is fruitful ground for math lovers.
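As a minimal sketch of the setup described above (the exact network shapes and learning rate here are my own illustrative choices, not the post's), this is backpropagation with a squared-error loss, an identity (no-activation) hidden layer, and a linear output:

```python
import numpy as np

# Illustrative sketch: continuous target, linear hidden layer (no activation),
# linear output, loss L = 0.5 * mean((y_hat - y)^2). Shapes are arbitrary.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))                 # 100 samples, 2 features
y = X @ np.array([1.5, -2.0]) + 0.5           # continuous target (assumed)

W1 = rng.normal(scale=0.1, size=(2, 3)); b1 = np.zeros(3)
W2 = rng.normal(scale=0.1, size=(3, 1)); b2 = np.zeros(1)
lr = 0.1

def forward(X):
    H = X @ W1 + b1                           # hidden layer, identity activation
    return H, (H @ W2 + b2).ravel()           # linear output

_, y_hat = forward(X)
mse_before = np.mean((y_hat - y) ** 2)

for _ in range(1000):
    H, y_hat = forward(X)
    err = y_hat - y                           # dL/dy_hat
    dW2 = H.T @ err[:, None] / len(X)
    db2 = np.array([err.mean()])
    dH = err[:, None] @ W2.T                  # backprop through identity hidden layer
    dW1 = X.T @ dH / len(X)
    db1 = dH.mean(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

_, y_hat = forward(X)
mse_after = np.mean((y_hat - y) ** 2)
```

Because both layers are linear, the network is equivalent to a linear regression; the point of the exercise is that the backward pass drops the activation-derivative factor entirely.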
This post is the outcome of my studies in Neural Networks and a sketch of an application of the Backpropagation algorithm. It's a binary classification task with N = 4 cases in a Neural Network with a single hidden layer. Both the hidden layer and the output layer are followed by sigmoid activation functions. The matrices are color-coded to match the Neural Network structure (bias, input, hidden, output), to make them easier to follow.…Continue
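A minimal sketch of this configuration (the N = 4 cases here are an assumed XOR-style toy set, and the hidden-layer width and learning rate are illustrative): one sigmoid hidden layer, a sigmoid output, and the logistic (cross-entropy) cost.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# N = 4 cases (assumed XOR-style toy data), single sigmoid hidden layer,
# sigmoid output, logistic cost. Width and learning rate are illustrative.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(1)
W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)
lr = 1.0

def forward(X):
    H = sigmoid(X @ W1 + b1)
    return H, sigmoid(H @ W2 + b2)

H, p = forward(X)
loss_before = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

for _ in range(2000):
    H, p = forward(X)
    d_out = p - y                         # sigmoid + cross-entropy simplifies to this
    dW2 = H.T @ d_out / len(X); db2 = d_out.mean(axis=0)
    dH = (d_out @ W2.T) * H * (1 - H)     # backprop through hidden sigmoid
    dW1 = X.T @ dH / len(X); db1 = dH.mean(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

_, p = forward(X)
loss_after = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))
```

Note how the output delta collapses to `p - y`: the sigmoid derivative cancels against the logistic cost, which is exactly the simplification the post's matrix derivation walks through.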
Last year I started developing a Face Recognition model, beginning with static pictures in Wolfram Mathematica. This year I found out the same job can be done with OpenCV in Python, or by creating specific filters in R and applying Weierstrass and Gaussian transformations.
There are many difficulties in recognizing faces of the same person, such as: position, rotation of the face, age, expression, brightness, gamma, contrast, saturation, and obstacles like hands, hair and so…Continue
Added by Rubens Zimbres on October 15, 2016 at 4:00am — No Comments
Lately I've been doing some experiments with Theano and Deep Learning. One thing that I thought would really help is understanding the workflow of a Theano algorithm by visualizing the connections between tensors. After developing the model, I printed the prediction graph for a deep learning Neural Net with 2 hidden layers, 2 inputs X1 and X2, and a continuous output Y. I used Graphviz and pydot to generate the graphic with this line of…Continue
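The post relies on Theano's own pydot-based printing; as a dependency-free sketch of the same idea, here is how the layer structure described above (2 inputs X1 and X2, 2 hidden layers, output Y; the hidden-layer widths are my assumption) can be emitted as Graphviz DOT text and then rendered with `dot -Tpng`:

```python
# Build a Graphviz DOT description of a small feed-forward network.
# Hidden-layer sizes are illustrative, not taken from the original post.
layers = {
    "input":   ["X1", "X2"],
    "hidden1": ["h1_1", "h1_2", "h1_3"],
    "hidden2": ["h2_1", "h2_2", "h2_3"],
    "output":  ["Y"],
}

def to_dot(layers):
    lines = ["digraph nn {", "  rankdir=LR;"]       # left-to-right layout
    groups = list(layers.values())
    for src, dst in zip(groups, groups[1:]):        # fully connect adjacent layers
        for a in src:
            for b in dst:
                lines.append(f'  "{a}" -> "{b}";')
    lines.append("}")
    return "\n".join(lines)

dot = to_dot(layers)
# Save and render (requires Graphviz installed): dot -Tpng nn.dot -o nn.png
```

The DOT text itself is plain and portable; any Graphviz front end (including pydot) can render it.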