In the last couple of months, I’ve been noticing a gradual shift in the kind of articles that we receive at Data Science Central. We still get a fair amount of data science content, but increasingly (and admittedly with a bit of encouragement) we’re seeing more articles centered around graphs and semantics. I don’t believe this is coincidental.
Machine learning has dominated coverage over the last couple of years as we’ve learned how to take advantage of ML systems for classification, natural language processing and generation, and reinforcement learning for visual recognition and even trend detection. However, there’s a definite sense, both in the discussions I’ve had as an editor and analyst and in what I’ve been seeing in the trade press, that the current iteration of these technologies has reached its peak, and that before it can evolve, other pieces will need to fall into place. This is a very familiar pattern in IT: innovation comes in waves that eventually peter out because the technology has reached the limit of what it can do without other parts of the stack evolving.
I think that one of those pieces will end up being a resurgence in the use of graphs, especially semantic graphs, across a wide number of areas. This will likely take place in two phases: one underway now and lasting until about 2025, and a second, more sophisticated phase lasting until about 2029 as GPUs come to dominate the cloud. Graph operations, like machine learning, tend to be heavily parallel and compute-intensive, things that GPUs do far better than CPUs.
One thing I’ve noticed recently is the emergence of what I’m calling Subgraph technology, in which resources (especially text-heavy ones) are analyzed at the word level and then reconstructed to create a deeper compositional graph. This is computationally expensive and requires a significant index, but the benefits, especially with narrative content, are impressive. It makes it possible to extract information at a level previously thought impossible and to link that data with positional information: tables, parts diagrams, blueprints, and the like.
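As a rough illustration of the idea (a toy sketch, not any vendor’s implementation), word-level relations extracted from text can be reassembled into a compositional graph, so that facts stated in separate sentences become linked and queryable. The triples and entity names below are invented for the example:

```python
from collections import defaultdict

# Invented subject-predicate-object relations, as might be extracted
# from individual sentences of a maintenance manual.
triples = [
    ("pump", "feeds", "boiler"),
    ("boiler", "heats", "water"),
    ("sensor", "monitors", "boiler"),
]

# Reassemble the word-level relations into one compositional graph:
# each subject maps to the (predicate, object) pairs it appears in.
graph = defaultdict(list)
for subj, pred, obj in triples:
    graph[subj].append((pred, obj))

def reachable(node, seen=None):
    """Everything linked to `node` by following relations transitively."""
    seen = seen if seen is not None else set()
    for _, obj in graph.get(node, []):
        if obj not in seen:
            seen.add(obj)
            reachable(obj, seen)
    return seen

# Composition across sentences: the pump is connected to water
# even though no single sentence said so.
print(sorted(reachable("pump")))
```

The payoff is exactly the cross-sentence linkage described above: no single triple relates the pump to water, but the composed graph does.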
At the same time, projects such as the W3C Solid project are beginning to come online and, while not quite ready for prime time, are beginning to show some real promise. Similar initiatives are taking place with the Spatial Web, which looks to be starting to do what I recommended more than six years ago: take advantage of graphs, not for games, but for IoT, especially industrial IoT (IIoT). As I’ve asserted in previous blog posts on DSC and elsewhere, graphs are virtual representations or abstractions of networks of various types, and IIoT is nothing if not a very complex network.
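To make the graph-as-network-abstraction point concrete, a small hypothetical IIoT topology (the device names are made up for illustration) can be modeled as an adjacency list, after which ordinary graph traversal answers operational questions, such as what stays reachable when a node fails:

```python
from collections import deque

# Invented plant topology: a gateway talks to two PLCs,
# which in turn talk to field devices.
edges = [
    ("gateway", "plc-1"), ("gateway", "plc-2"),
    ("plc-1", "temp-sensor"), ("plc-1", "flow-sensor"),
    ("plc-2", "valve-actuator"),
]

adjacency = {}
for a, b in edges:
    adjacency.setdefault(a, []).append(b)

def reachable_without(failed):
    """Breadth-first traversal from the gateway, skipping a failed node."""
    seen, queue = set(), deque(["gateway"])
    while queue:
        node = queue.popleft()
        if node in seen or node == failed:
            continue
        seen.add(node)
        queue.extend(adjacency.get(node, []))
    return seen

# If plc-1 goes down, both of its sensors drop out of the graph.
print(sorted(reachable_without("plc-1")))
```

The same abstraction scales from this toy example to real industrial networks, which is precisely why graph representations fit IIoT so naturally.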
Meanwhile, there’s a growing backlash against the slick but fairly hollow view of the Metaverse and Decentralized Finance (DeFi) that several social media companies have been promoting, featured this week in XR Fuels Consumerism, With a Major Downside by Stephanie Glen and DeFi platforms: What dumb data and dumb code have in common by Alan Morrison. Part of this has to do with how little thought has been put into either platform beyond the inherent hucksterism of the rush to be first to market in a still-evolving field. There are solutions, again involving graph technologies, that could significantly expand both the range and the trustworthiness of these platforms, but right now they are being crowded out by the noise.
In media res,
To subscribe to the DSC Newsletter, go to Data Science Central and become a member today. It’s free!
Data Science Central Editorial Calendar
DSC is looking for editorial content specifically in these areas for March, with these topics having higher priority than other incoming articles.
- AI-Enabled Hardware
- Knowledge Graphs
- GANs and Simulations
- ML in Weather Forecasting
- UI, UX and AI
- GNNs and LNNs
- Digital Twins
DSC Featured Articles
- Data Observability Goes Far Beyond Data Quality Monitoring and Alerts Sameer Narkhede on 22 Feb 2022
- DeFi platforms: What dumb data and dumb code have in common Alan Morrison on 22 Feb 2022
- An Overview of the Big Data Engineer Aileen Scott on 22 Feb 2022
- A Framework for Understanding Transformation–Part III Howard M. Wiener on 22 Feb 2022
- Abundance Mentality is Key to Exploiting the Economics of Data Bill Schmarzo on 22 Feb 2022
- How to Ensure Data Quality and Integrity? Indhu on 22 Feb 2022
- IoT in Construction — Construction Technology’s Next Frontier Nikita Godse on 22 Feb 2022
- Scene Graphs and Semantics Kurt Cagle on 22 Feb 2022
- AWS Cloud Security: Best Practices Ryan Williamson on 22 Feb 2022
- Top Best Practices to Keep in Mind for Azure Cloud Migration Ryan Williamson on 22 Feb 2022
- DSC Weekly Newsletter 15 Feb 2022: On the Verge of War Kurt Cagle on 22 Feb 2022
- The One Algorithm Change That Could Impact the World ajitjaokar on 22 Feb 2022
- XR Fuels Consumerism, With A Major Downside Stephanie Glen on 21 Feb 2022
- Exploring Intelligent Graph and PathQL Peter Lawrence on 21 Feb 2022
- Dual Confidence Regions: A Simple Introduction Vincent Granville on 21 Feb 2022
- 5 Easy Steps to Choosing a Great Data Visualization Platform for Your Business ImensoSoftware on 17 Feb 2022
- Calling All Data Scientists: Data Observability Needs You Henry Li on 16 Feb 2022
Picture of the Week