
Hi Everyone,

I recently interviewed for a data science opening, and the interviewer asked which optimisation algorithm is used for logistic regression. I answered gradient descent and explained how it works.

He said that these days nobody uses gradient descent. Is that true, and if so, what are the alternatives?

What are the disadvantages of gradient descent that would explain why it is no longer used?
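For reference, the method the question describes can be sketched as plain (batch) gradient descent on the logistic-regression log-loss. This is a minimal illustration on synthetic data invented for the example, not code from the thread:

```python
import numpy as np

# Synthetic binary-classification data (hypothetical, for illustration only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Batch gradient descent: every step uses the full dataset's gradient.
w = np.zeros(2)
b = 0.0
lr = 0.1
for _ in range(500):
    p = sigmoid(X @ w + b)            # predicted probabilities
    grad_w = X.T @ (p - y) / len(y)   # gradient of the mean log-loss w.r.t. weights
    grad_b = np.mean(p - y)
    w -= lr * grad_w
    b -= lr * grad_b

accuracy = np.mean((sigmoid(X @ w + b) > 0.5) == y)
```

The main practical drawback visible here is that each update touches every row of `X`, which is why large-scale practice favours stochastic or minibatch variants.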

Thanks for your help in advance.


Replies to This Discussion

Here is a great alternative: swarm optimization, see https://www.datasciencecentral.com/profiles/blogs/swarm-optimizatio...

(The linked post includes an animated illustration of how it works.)

Genetic Algorithms may also be of interest. In general, the problem domain is that of Global Optimization.
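To make the suggestion concrete, here is a minimal particle swarm optimization sketch on a toy objective. The hyperparameters (inertia `w`, cognitive `c1`, social `c2`) are typical textbook values, not taken from the linked post:

```python
import numpy as np

def f(x):
    return np.sum((x - 3.0) ** 2, axis=-1)  # toy objective, minimum at (3, 3)

rng = np.random.default_rng(42)
n_particles, dim = 30, 2
pos = rng.uniform(-10, 10, size=(n_particles, dim))  # random initial swarm
vel = np.zeros((n_particles, dim))
pbest = pos.copy()                 # each particle's best position so far
pbest_val = f(pos)
gbest = pbest[np.argmin(pbest_val)]  # best position found by the whole swarm

w, c1, c2 = 0.7, 1.5, 1.5  # inertia, cognitive, and social coefficients
for _ in range(200):
    r1 = rng.random((n_particles, dim))
    r2 = rng.random((n_particles, dim))
    # Velocity mixes the particle's own memory with the swarm's best find.
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    vals = f(pos)
    improved = vals < pbest_val
    pbest[improved] = pos[improved]
    pbest_val[improved] = vals[improved]
    gbest = pbest[np.argmin(pbest_val)]

best_value = f(gbest)
```

Note that, unlike gradient descent, this needs only function evaluations, no gradients, which is why it (and genetic algorithms) are pitched at global optimization problems.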

I think the interviewer was looking for some variation of gradient descent, because to my knowledge gradient descent is still used, and used heavily, just with some tweaks. Check this post on momentum, RMSprop, and Adam; it will give you the idea.
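The tweaks mentioned can be sketched in a few lines. Below is the Adam update rule applied to a toy quadratic; the learning rate and beta values are the common defaults, and the objective is invented for the example. The `m` line is the momentum-style part and the `v` line is the RMSprop-style part:

```python
import numpy as np

def grad(x):
    return 2.0 * (x - 5.0)  # gradient of (x - 5)^2, minimized at x = 5

x = 0.0
m, v = 0.0, 0.0                      # first- and second-moment estimates
lr, b1, b2, eps = 0.1, 0.9, 0.999, 1e-8
for t in range(1, 1001):
    g = grad(x)
    m = b1 * m + (1 - b1) * g        # momentum: moving average of gradients
    v = b2 * v + (1 - b2) * g * g    # RMSprop: moving average of squared gradients
    m_hat = m / (1 - b1 ** t)        # bias correction for the zero initialisation
    v_hat = v / (1 - b2 ** t)
    x -= lr * m_hat / (np.sqrt(v_hat) + eps)
```

Momentum alone keeps only the `m` average; RMSprop alone keeps only the `v` scaling; Adam combines both.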

I think the interviewer is confused ... Stochastic gradient descent is heavily used and is the "de facto" method for learning.

Of course there are very good alternatives like swarm optimization, but I would not say that nobody uses SGD nowadays ...
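For comparison with the batch version, the "de facto" method can be sketched as minibatch SGD for logistic regression. The data and hyperparameters here are hypothetical, chosen only to keep the example self-contained:

```python
import numpy as np

# Synthetic, linearly separable data (hypothetical, for illustration only).
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 2))
y = (2 * X[:, 0] - X[:, 1] > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w, b, lr, batch = np.zeros(2), 0.0, 0.1, 32
for epoch in range(20):
    order = rng.permutation(len(y))          # reshuffle each epoch
    for start in range(0, len(y), batch):
        idx = order[start:start + batch]     # one minibatch per update
        p = sigmoid(X[idx] @ w + b)
        w -= lr * X[idx].T @ (p - y[idx]) / len(idx)
        b -= lr * np.mean(p - y[idx])

accuracy = np.mean((sigmoid(X @ w + b) > 0.5) == y)
```

Each update costs only a 32-row gradient instead of a full pass over the data, which is the property that makes SGD scale to large datasets.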

Thanks Vincent, I will try implementing it as a replacement for gradient descent.

Vincent Granville said:

Here is a great alternative: swarm optimization, see https://www.datasciencecentral.com/profiles/blogs/swarm-optimizatio...


I think so too; I will try comparing both stochastic gradient descent and swarm optimization. Thanks.


Eduardo Di Santi said:

I think the interviewer is confused ... Stochastic gradient descent is heavily used and is the "de facto" method for learning.

Of course there are very good alternatives like swarm optimization, but I would not say that nobody uses SGD nowadays ...

I don't fully agree with the interviewer. Adam, AdaGrad, and SGD are some of the popular flavours of the gradient descent method used to solve real-world problems when employing regression techniques of any kind.

Maybe there is a specific problem he had in mind ...

That is a very nice article. Thanks for sharing.

surya prakash Sahu said:

I think the interviewer was looking for some variation of gradient descent, because to my knowledge gradient descent is still used, and used heavily, just with some tweaks. Check this post on momentum, RMSprop, and Adam; it will give you the idea.


© 2018 Data Science Central ®
