Not long ago, Random Forest was the go-to algorithm for classification problems in most data science competitions. A correctly formulated problem, smart feature engineering, and minimal tuning of the RF algorithm (ntree, mtry) via grid search could get you past the bulk of the crowd.
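For context, a grid search over ntree and mtry of the kind described above might look like this in R. This is a minimal sketch: the data frame `train` and its factor target column `label` are placeholders (not from the original post), and in practice you would cross-validate rather than rely on out-of-bag error alone.

```r
# Minimal sketch: tuning ntree and mtry by grid search with randomForest.
# `train` and its target column `label` are assumed placeholders.
library(randomForest)

grid <- expand.grid(ntree = c(250, 500, 1000), mtry = c(2, 4, 8))

oob_errors <- apply(grid, 1, function(p) {
  fit <- randomForest(label ~ ., data = train,
                      ntree = p["ntree"], mtry = p["mtry"])
  # final out-of-bag error rate of the fitted forest
  tail(fit$err.rate[, "OOB"], 1)
})

# parameter combination with the lowest OOB error
best <- grid[which.min(oob_errors), ]
print(best)
```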

Then came XGBoost, and it soon became the hot favorite. It isn't hard to see that deep learning is running the show at the moment. GPU-powered deep learning frameworks, however, weren't accessible to everyone; those who could use them were reaping the benefits.

Then arrived H2O, bringing deep learning to R with ease (although darch and deepnet were already available in R, they were not as popular).

1. MNIST digit recognition competition

Here is a demonstration of how deep learning made short work of the classic MNIST dataset, a digit recognition dataset long used in academia and research. A one-liner of R code running a deep learning algorithm with 3 hidden layers of 1024, 1024, and 2048 neurons respectively, using the rectifier activation function with dropout, achieved an error rate of 0.83% on the test data. A world record on the dataset: no distortion, no convolution, no ensemble, no unsupervised learning!
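The "one-liner" described above is essentially a single call to `h2o.deeplearning()`. Below is a hedged sketch of that call: the CSV file names are placeholders, the label column name `C785` assumes the standard 784-pixel-plus-label MNIST layout, and `epochs` is illustrative rather than the setting used for the record run. The `hidden` and `activation` arguments mirror the architecture the post describes.

```r
# Sketch of an H2O deep learning run on MNIST.
# File names and epochs are placeholder assumptions, not the record settings.
library(h2o)
h2o.init(nthreads = -1)

train <- h2o.importFile("mnist_train.csv")  # 784 pixel columns + 1 label column
test  <- h2o.importFile("mnist_test.csv")
train$C785 <- as.factor(train$C785)         # treat the label as a class, not a number
test$C785  <- as.factor(test$C785)

model <- h2o.deeplearning(
  x = 1:784, y = 785,
  training_frame = train,
  activation = "RectifierWithDropout",  # rectifier with dropout, as in the post
  hidden = c(1024, 1024, 2048),         # the three hidden layers described above
  input_dropout_ratio = 0.2,            # assumed regularization setting
  epochs = 100                          # illustrative only
)

h2o.performance(model, newdata = test)  # classification error on the test set
```

Note that older releases of the h2o package used slightly different argument names, so the exact signature depends on the version installed.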

2. Airbnb competition (reward: a potential interview at Airbnb): ongoing, 10 days left

In this recruiting competition, Airbnb challenges you to predict in which country a new user will make his or her first booking. Kagglers who impress with their answer (and an explanation of how they got there) will be considered for an interview for the opportunity to join Airbnb's Data Science and Analytics…