
Last year, shortly before Christmas, I gave a presentation on Deep Learning at the Hochschule München, Fakultät für Informatik und Mathematik (nbviewer, github, pdf).

The talk covered mainly concepts (what's "deep" in Deep Learning, backpropagation, how to optimize, ...) and architectures (Multi-Layer Perceptron, Convolutional Neural Network, Recurrent Neural Network), but also demos and code examples (mainly using TensorFlow).

Source: the pdf linked above.
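To give a flavor of what such a demo looks like, here is a minimal sketch (not taken from the slides; the presentation's own examples may differ): a small Multi-Layer Perceptron trained by backpropagation on MNIST, written against the Keras API that ships with current TensorFlow.

```python
# Illustrative sketch only -- not the code from the slides.
# A small Multi-Layer Perceptron on MNIST, using the Keras API in TensorFlow 2.x.
import tensorflow as tf

# Load MNIST and scale pixel values to [0, 1].
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# Multi-Layer Perceptron: flatten the image, one hidden layer, softmax output.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Backpropagation plus an optimizer (here Adam) minimize the cross-entropy loss.
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test))
```

A few lines like these are enough to demonstrate the whole training loop: forward pass, loss, gradients via backpropagation, and an optimization step.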

It was, and is, a lot of material to cover in 90 minutes; conceptual understanding and developing intuition were the main point. Of course, there is great online material to build on, and you'll see my preferences in the cited sources ;-).

This year, having covered the basics, I hope to develop use cases and practical applications that show Deep Learning is applicable even outside Google-scale (or Facebook-, Baidu-, Apple-scale...) environments.
Stay tuned!



Tags: ConvolutionalNeuralNetwork, DeepLearning, MachineLearning, RecurrentNeuralNetwork, TensorFlow

Comments


Comment by Fernando Barbosa on January 5, 2017 at 9:49am

Hi Sigrid, thank you for sharing!

I am researching Machine Learning, with a particular interest in Deep Learning. Would you have an opinion on what the main challenges of Deep Learning are from a mathematical standpoint? I have noticed that there seems to be a lot of concern with the computational part, as you mentioned, but what about the math/physics of deep learning? I would be very glad if you or any members could take the time to share their thoughts.

Thanks again!
