Last year, shortly before Christmas, I gave a presentation on Deep Learning at Hochschule München, Fakultät für Informatik und Mathematik (nbviewer, github, pdf).
It covered mainly concepts (what is "deep" in Deep Learning, backpropagation, how to optimize ...) and architectures (Multi-Layer Perceptron, Convolutional Neural Network, Recurrent Neural Network), but also demos and code examples (mainly using TensorFlow).
Source: click on the pdf link (above)
It was, and is, a lot of material to cover in 90 minutes, and the main point was conceptual understanding and developing intuition. Of course, there is great online material to draw on, and you'll see my preferences in the cited sources ;-).
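As an illustration of the kind of TensorFlow demo mentioned above (a minimal sketch, not taken from the talk itself; it assumes the Keras API and the MNIST digits dataset), a Multi-Layer Perceptron could look like this:

# Minimal sketch, not from the talk: an MLP for MNIST digit classification
# using TensorFlow's Keras API.
import tensorflow as tf

# Load the MNIST digits and scale pixel values to [0, 1].
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(28, 28)),            # 28x28 grayscale image
    tf.keras.layers.Flatten(),                         # flatten to a 784-vector
    tf.keras.layers.Dense(128, activation="relu"),     # hidden layer
    tf.keras.layers.Dense(10, activation="softmax"),   # one output per digit class
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=5)    # backpropagation and optimization happen here
model.evaluate(x_test, y_test)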
This year, having covered the basics, I hope to develop use cases and practical applications that show the applicability of Deep Learning even outside Google-size (or Facebook-, Baidu-, Apple-size ...) environments.
Stay tuned!
Tags: ConvolutionalNeuralNetwork, DeepLearning, MachineLearning, RecurrentNeuralNetwork, TensorFlow
Hi Sigrid, thank you for sharing!
I am researching Machine Learning, with a particular interest in Deep Learning. Would you have an opinion on what the main challenges of Deep Learning are from a mathematical standpoint? I have noticed that there seems to be a lot of concern with the computational part, as you mentioned, but what about the math/physics of deep learning? I would be very glad if you or any members could take the time to share their thoughts.
Thanks again!
Posted 1 March 2021