In April, Gil Press published a list of the top 10 hot big data technologies in Forbes. The technologies featured as hot were:

  • Predictive analytics
  • NoSQL databases
  • Search and knowledge discovery
  • Stream analytics
  • In-memory data fabric
  • Distributed file stores
  • Data virtualization
  • Data integration
  • Data preparation (automation)
  • Data quality

The article came with a cool chart (produced by Forrester Research, and similar to other charts produced by Gartner):

What do you think are the hottest technologies now? Do you agree with Gil's list?

Personally, I think that the hottest technologies (related to data science) are:

  • Adapting modern predictive algorithms to a distributed architecture
  • AI (pattern recognition)
  • Deep learning (the intersection of AI and machine learning)
  • Data science automation
  • Leveraging data from sensors (IoT)
  • Turning unstructured data into structured data, and data standardization
  • Blending multiple predictive models together
  • Intensive data and model simulation (Monte Carlo or Bayesian methods) to study complex systems such as weather, using HPC (high-performance computing)
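To make the Monte Carlo item above concrete, here is a minimal, self-contained sketch in Python (the function name and sample count are my own choices, not from the article): it estimates pi by random sampling, the same repeated-sampling principle that, scaled up on HPC clusters, drives simulations of complex systems like weather.

```python
import random


def monte_carlo_pi(n_samples: int, seed: int = 42) -> float:
    """Estimate pi by drawing uniform points in the unit square and
    counting the fraction that land inside the quarter circle."""
    rng = random.Random(seed)  # seeded for reproducibility
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # Area of quarter circle / area of square = pi/4
    return 4.0 * inside / n_samples


estimate = monte_carlo_pi(100_000)
```

With 100,000 samples the estimate typically lands within a few hundredths of pi; real simulations use vastly more samples (and more interesting models), which is exactly why distributed architectures matter here.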

Related articles: click here to read more than 10 articles (featured in the picture below) about predictions that will impact our profession. It would be interesting to see how many still make sense today. To see older predictions (2015) and which ones turned out to be correct, click here.


Comment by Reuben Kim Hine on May 23, 2016 at 4:37am

Hello Vincent,

In regards to:

  • Adapting modern predictive algorithms to a distributed architecture,

are you including Hadoop? I am waiting on a $269 laptop with an i5 CPU and 8 GB of RAM, on which I plan to build Hadoop. I am certainly not the most learned on this site, but Hadoop seems to be not only a buzzword but also a powerful framework for computing over large data sets quickly and with a lot of sophistication. I plan to see how well it deals with a data set that has 3 million records and 225 columns. In particular, I plan to use R as the calculation tool.
