Feature Selection For Unsupervised Learning

This is my presentation for the IBM data science day, July 24.


After reviewing popular techniques used in supervised, unsupervised, and semi-supervised machine learning, we focus on feature selection methods in these different contexts, especially the metrics used to assess the value of a feature or set of features, whether the variables are binary, continuous, or categorical.
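As a concrete illustration of the kind of feature-scoring metric discussed here (not the specific one used in the presentation), mutual information measures, in bits, how much a discrete feature tells you about a label; a minimal sketch:

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Empirical mutual information I(X; Y) in bits between two
    discrete variables -- a common supervised feature-scoring metric."""
    n = len(xs)
    px = Counter(xs)              # marginal counts of the feature
    py = Counter(ys)              # marginal counts of the label
    pxy = Counter(zip(xs, ys))    # joint counts
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        # p(x,y) * log2( p(x,y) / (p(x) * p(y)) ), with counts rescaled by n
        mi += p_joint * math.log2(p_joint * n * n / (px[x] * py[y]))
    return mi

# A perfectly predictive binary feature carries exactly 1 bit about a
# balanced binary label:
feature = [0, 0, 1, 1]
label = ["a", "a", "b", "b"]
print(mutual_information(feature, label))  # → 1.0
```

The same formula applies unchanged to binary or categorical variables; continuous ones must first be discretized.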

We go into deeper detail and review modern feature selection techniques for unsupervised learning, typically relying on entropy-like criteria. While these criteria are usually model-dependent or scale-dependent, we introduce a new model-free, data-driven methodology in this context, with an application to an interesting number theory problem (simulated data set) in which each feature has a known theoretical entropy.
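To give a feel for what an entropy-like criterion looks like in the unsupervised setting, the sketch below ranks features by the empirical Shannon entropy of their (binned) values, so near-constant features fall to the bottom. This is a generic textbook criterion, not the model-free methodology introduced in the presentation; the binning step is exactly the kind of scale dependence the talk's method aims to avoid.

```python
import math
from collections import Counter

def column_entropy(values, bins=10):
    """Shannon entropy (bits) of one feature, after equal-width binning
    so continuous values become categorical counts."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0   # guard against a constant column
    binned = [min(int((v - lo) / width), bins - 1) for v in values]
    counts = Counter(binned)
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def rank_features(columns, bins=10):
    """Rank feature columns by entropy, highest first: a near-constant
    feature scores close to 0 bits and lands at the bottom."""
    scores = {name: column_entropy(col, bins) for name, col in columns.items()}
    return sorted(scores.items(), key=lambda kv: -kv[1])

# Toy data: 'x' is spread out, 'y' is almost constant.
data = {"x": [0.1, 2.3, 4.5, 6.7, 8.9], "y": [1.0, 1.0, 1.0, 1.0, 1.01]}
print(rank_features(data, bins=5))
```

Note that the ranking changes if you rescale a column or change the number of bins, which is the scale dependence mentioned above.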

We also briefly discuss high precision computing as it is relevant to this peculiar data set, as well as units of information smaller than the bit.
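The idea that information can come in units smaller than one bit is easy to see with the entropy of a biased coin (a generic illustration, not tied to the number theory data set in the talk): any bias away from 50/50 yields a fractional number of bits per outcome.

```python
import math

def entropy_bits(p):
    """Shannon entropy of a biased coin with P(heads) = p, in bits."""
    if p in (0.0, 1.0):
        return 0.0  # a deterministic outcome carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# A fair coin carries exactly 1 bit; a 90/10 coin carries under half a bit.
print(entropy_bits(0.5))  # → 1.0
print(entropy_bits(0.9))  # ≈ 0.469
```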

To download the presentation, click here (PowerPoint document).
