Comments on "New Perspectives on Statistical Distributions and Deep Learning" (Data Science Central)

Jim Chappell (2019-03-08, https://www.datasciencecentral.com/profile/JimChappell):
<p>I have obtained very interesting results using quantiles and modelling with the GLD (Generalized Lambda Distribution). I find quantiles useful for simulations; they are worth a long look.</p>
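The GLD lends itself to simulation precisely because it is defined by its quantile function, so inverse-transform sampling is direct: draw uniforms and push them through Q(u). A minimal sketch, assuming the Ramberg–Schmeiser parameterization Q(u) = λ1 + (u^λ3 − (1−u)^λ4)/λ2; the lambda values below are the standard illustrative choice approximating a standard normal, not anything from the comment itself:

```python
import random

def gld_quantile(u, lam1, lam2, lam3, lam4):
    """Ramberg-Schmeiser GLD quantile function:
    Q(u) = lam1 + (u**lam3 - (1 - u)**lam4) / lam2."""
    return lam1 + (u ** lam3 - (1 - u) ** lam4) / lam2

def simulate_gld(n, lam1, lam2, lam3, lam4, rng=random):
    """Inverse-transform sampling: uniforms fed through the quantile function."""
    return [gld_quantile(rng.random(), lam1, lam2, lam3, lam4)
            for _ in range(n)]

random.seed(1)
# Lambdas approximating a standard normal in the RS parameterization.
sample = simulate_gld(10_000, 0.0, 0.1975, 0.1349, 0.1349)
mean = sum(sample) / len(sample)
```

Because the four lambdas control location, scale, and the two tail shapes separately, the same two lines of sampling code cover a wide family of skewed and heavy-tailed targets.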
Rohan Kotwani (2019-03-03, https://www.datasciencecentral.com/profile/RohanKotwani):
<p>Very interesting! It makes sense that a mixture model is less stable but better suited to fitting an arbitrary distribution. I think a weighted sum of distributions is practical when the underlying distribution has many anomalous events. My intuition is that GMMs have a tendency to push clusters apart, since overlapping components would lower the likelihood.</p>
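The intuition about overlapping components can be checked numerically: on bimodal data, a two-component Gaussian mixture with well-separated means scores a higher log-likelihood than one whose means coincide. A minimal sketch under assumed toy data (clusters at ±3, unit variance); the function and data are illustrative, not from the post:

```python
import math
import random

def mixture_loglik(data, means, sigma=1.0, weights=(0.5, 0.5)):
    """Log-likelihood of data under a weighted sum of Gaussian densities."""
    ll = 0.0
    for x in data:
        p = sum(w * math.exp(-(x - m) ** 2 / (2 * sigma ** 2))
                / (sigma * math.sqrt(2 * math.pi))
                for w, m in zip(weights, means))
        ll += math.log(p)
    return ll

random.seed(0)
# Bimodal sample: two well-separated clusters.
data = ([random.gauss(-3, 1) for _ in range(200)]
        + [random.gauss(3, 1) for _ in range(200)])

separated = mixture_loglik(data, means=(-3, 3))
overlapping = mixture_loglik(data, means=(0, 0))
print(separated > overlapping)  # True
```

Since EM climbs exactly this likelihood surface, it will drive the component means apart on such data, matching the comment's intuition.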