Rubens Zimbres
• Male
• São Paulo
• Brazil

# Rubens Zimbres's Page

## Latest Activity

- Dec 6, 2018: Rubens Zimbres updated their profile
- Dec 2, 2018: Parmod Kumar liked Rubens Zimbres's profile
- Nov 15, 2018: Parmod Kumar liked Rubens Zimbres's blog post A Cheat Sheet on Probability
- Aug 14, 2018: James Arrow liked Rubens Zimbres's blog post A Cheat Sheet on Probability
- Jun 14, 2018: Comment: "Thanks for sharing. Please check if Total Probability is correct, whether your notation is inconsistent or you are missing the P(1st red), P(1st green) etc…"
- May 22, 2018: Comment: "Thanks for sharing, great probability sheet"
- Apr 25, 2018: Comment: "In the 'Before' calculation, in the last steps, g'(x) = g'(f(3))·10: why f(3)? We said that x = 2, so could it be g'(x) = g'(f(2))·10 instead?"
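The chain-rule question quoted above (why g'(f(3)) when x = 2?) can be checked numerically. The functions below are illustrative stand-ins, since the original post's f and g are not shown here; only f'(x) = 10, implied by the "·10" factor in the quoted expression, is carried over.

```python
# For h(x) = g(f(x)), the chain rule gives h'(x) = g'(f(x)) * f'(x),
# so at x = 2 the outer derivative is evaluated at f(2), not f(3).

def f(x):
    return 10 * x          # f'(x) = 10, matching the "·10" in the comment

def g(u):
    return u ** 2          # g'(u) = 2u

def h(x):
    return g(f(x))

x = 2
analytic = 2 * f(x) * 10   # g'(f(2)) * f'(2) = 2 * 20 * 10 = 400

# Central finite-difference check of h'(2)
eps = 1e-6
numeric = (h(x + eps) - h(x - eps)) / (2 * eps)

print(analytic, numeric)
```

The numerical derivative agrees with the analytic one only when the inner value f(2) is used, which supports the commenter's correction.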

## Profile Information

- Short Bio: Senior Data Scientist at Vecto Mobile, Brazil
- Field of Expertise: Data Science, Machine Learning, AI, Business Analytics, Deep Learning, IoT
- Professional Status: Consultant
- Years of Experience: 13
- Company: Vecto Mobile
- Industry: Telecommunications
- Title: Dr.
- Interests: Networking, Finding a new position, New venture
- Favorite Data Mining or Analytical Website: http://github.com/rubenszimbres

## Rubens Zimbres's Blog

### A Cheat Sheet on Probability

Posted on November 21, 2016 at 2:28am
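One of the cheat sheet's topics, the law of total probability, can be sketched with a two-draw urn example. The counts below (3 red, 2 green balls, drawn without replacement) are made up for illustration; the cheat sheet's own example may differ.

```python
from fractions import Fraction

# P(2nd red) = P(2nd red | 1st red) * P(1st red)
#            + P(2nd red | 1st green) * P(1st green)
red, green = 3, 2
total = red + green

p_first_red = Fraction(red, total)        # 3/5
p_first_green = Fraction(green, total)    # 2/5

p_red_given_first_red = Fraction(red - 1, total - 1)    # 2/4
p_red_given_first_green = Fraction(red, total - 1)      # 3/4

p_second_red = (p_red_given_first_red * p_first_red
                + p_red_given_first_green * p_first_green)

print(p_second_red)  # 3/5
```

Note that P(2nd red) equals P(1st red), as exchangeability predicts; forgetting to weight each conditional term by P(1st red) or P(1st green), as the June 14 comment warns, breaks this.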

### Neural Networks: The Backpropagation algorithm in a picture

Posted on November 19, 2016 at 9:30am

Here I present the backpropagation algorithm for a continuous target variable and no activation function in the hidden layer: although simpler than the version used with the logistic cost function, it is a fruitful field for math lovers.
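A minimal sketch of that setup as I read it: one linear hidden layer (no activation), a continuous target, and a squared-error cost C = 0.5·Σ(ŷ − y)². The shapes, data, and learning rate below are illustrative assumptions, not taken from the post itself.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))                           # 8 samples, 3 features
y = (X @ np.array([1.0, -2.0, 0.5])).reshape(-1, 1)   # continuous target

W1 = rng.normal(scale=0.1, size=(3, 4))   # input -> hidden weights
W2 = rng.normal(scale=0.1, size=(4, 1))   # hidden -> output weights
lr = 0.01

cost_init = 0.5 * np.sum((X @ W1 @ W2 - y) ** 2)

for _ in range(2000):
    H = X @ W1                    # hidden layer: linear, no activation
    y_hat = H @ W2                # output layer: linear (continuous target)
    err = y_hat - y               # dC/d(y_hat)

    grad_W2 = H.T @ err           # gradient w.r.t. output weights
    grad_W1 = X.T @ (err @ W2.T)  # gradient backpropagated to hidden weights

    W2 -= lr * grad_W2
    W1 -= lr * grad_W1

cost_final = 0.5 * np.sum((X @ W1 @ W2 - y) ** 2)
print(cost_init, cost_final)      # cost should drop substantially
```

With no activation the backward pass is pure matrix algebra: the error is multiplied by each layer's weight matrix transposed, which is why this case is a good entry point before adding the sigmoid's derivative.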

### Matrix Multiplication in Neural Networks

Posted on November 16, 2016 at 11:00am

This post is the outcome of my studies in neural networks and a sketch of an application of the backpropagation algorithm. It is a binary classification task with N = 4 cases in a neural network with a single hidden layer. Sigmoid activation functions follow both the hidden layer and the output layer. The matrices use the same colors as the neural network structure (bias, input, hidden, output) to make them easier to follow.…
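The forward pass described above can be sketched in a few lines: one matrix multiplication per layer handles all N = 4 cases at once, with a sigmoid after both the hidden and output layers. The weights, hidden-layer size, and XOR-style targets below are illustrative assumptions, not the post's own figures.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

X = np.array([[0, 0],
              [0, 1],
              [1, 0],
              [1, 1]], dtype=float)               # 4 cases, 2 inputs
y = np.array([[0], [1], [1], [0]], dtype=float)   # binary targets

rng = np.random.default_rng(42)
W1 = rng.normal(size=(2, 3)); b1 = np.zeros(3)    # input -> hidden (3 units)
W2 = rng.normal(size=(3, 1)); b2 = np.zeros(1)    # hidden -> output

H = sigmoid(X @ W1 + b1)       # hidden activations, shape (4, 3)
y_hat = sigmoid(H @ W2 + b2)   # output probabilities, shape (4, 1)

print(y_hat.shape)
```

Stacking the four cases as rows of X is what turns the per-case computation into two matrix products, which is the point the post's color-coded matrices illustrate.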


Posted on October 27, 2016 at 7:30am

## Comment Wall (1 comment)


At 9:50am on June 26, 2016, Syed Danish Ali said…

You are welcome, Rubens. Cheers! :)