The Emerging Technologies Event Horizon
It’s Gartner Hype Cycle Day! This is roughly analogous to the Oscars for emerging technologies: Gartner looks through its telescopes to identify what could become the next big thing over the coming decade. The idea behind the hype cycle is intriguing – emerging technologies seem to come out of nowhere, explode into prominence while still less than fully mature, then seemingly disappear for a while as the promise of the hype gives way to the reality of implementation and real-world problems. Once those problems are worked out, the technology in question gets adopted into the mainstream.
The hype cycle is frequently the source of investor bingo, in which young startups pick the term that most closely reflects their business model, then pitch it to angel investors as evidence that their product is worth backing. It also gets picked up by the marketing departments of established companies promoting their newest offerings to customers. For these reasons, the Gartner Hype Cycle (GHC) should always be taken as a speculative guide, not a guarantee.
Nonetheless, from the standpoint of data science, this year’s graph is quite intriguing. The GHC covers emerging technologies, which generally means that even the most immediate items on the curve are at least two years out, with the rest either five to ten years out or beyond. In the near term, what emerges is that distributed identity – of individuals, of organizations (LEIs), or of things (NFTs) – will become a critical part of any future technologies. This makes sense: these identifiers effectively provide the hooks that connect the physical world to its virtual counterpart, and as such become essential in the production of digital twins, one of the foundations of the metaverse. Similarly, generative AI, which takes data as input and uses it to create relevant new content, has obvious implications for virtual reality (which increasingly falls under the heading of Multiexperience).
A second discernible trend is the shift away from application development as a predominantly human programming activity toward something constructed by a subject matter expert, data analyst, or decision-maker. This is the natural extension of the DevOps movement, which took many of the principles of Agile and concentrated primarily on automating whatever could be automated. The same shift can be seen in the rise of composable applications and networks. Indeed, this simply continues a trend in which the creation of specific algorithms by software developers is being replaced by the development of models by analysts; as more of that work becomes automated, the next logical step is the compartmentalization of such analysis within generated software and configurable pipelines.
The final trend, and one that looks to become a reality around 2035, is the long-term integration of social structures with the emerging technical structures. For instance, decentralized finance is seen as still being some ways out, even with blockchain and similar distributed ledger technologies becoming pervasive today. This is because finance, while technologically augmented at this point, still requires social integration, something that doesn’t even remotely exist at this stage. Similarly, machine-readable (and potentially writeable) legislation falls into the domain of social computing at a very deep level, and requires a degree of trust-building that looks to be at least a decade out, if not more.
All of these revolve around graphs in one way or another, to the extent that we may now be entering the graph era, in which the various social, financial, civic, and even ecological graphs can be more readily articulated. The biggest barrier to implementing the metaverse, ultimately, will be graph integration, and it is likely that this challenge, coupled with the increasing integration of graph and machine learning technologies, will dominate over the next couple of decades.
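To make the idea of graph integration concrete, here is a minimal toy sketch (all names and edges are hypothetical, invented purely for illustration): two separately maintained graphs – a social graph and a financial graph – share identity nodes, the kind of hook that distributed identifiers provide, and merging their edge sets yields a single connected graph spanning both domains.

```python
from collections import defaultdict

# Hypothetical edge sets; each pair is (source, target).
social = {("alice", "bob"), ("bob", "carol")}                  # follows
financial = {("alice", "acme_corp"), ("carol", "acme_corp")}   # invests-in

def integrate(*edge_sets):
    """Union several edge sets into one adjacency map keyed by node."""
    adjacency = defaultdict(set)
    for edges in edge_sets:
        for src, dst in edges:
            adjacency[src].add(dst)
    return dict(adjacency)

merged = integrate(social, financial)
# "alice" now links into both domains:
print(sorted(merged["alice"]))  # ['acme_corp', 'bob']
```

In practice this is what shared identifiers buy you: because "alice" is the same node in both graphs, no entity-resolution step is needed before the union, which is precisely why distributed identity sits so early on the curve.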
One of the left-most (meaning most speculative) technologies listed here is Quantum ML. If the period from 1990 to 2015 could be considered the relational age (defined by SQL), and 2015 to 2040 the graph age, then quantum machine learning will likely end up marking the end of the graph era and the beginning of the quantum data era. This is perhaps significant, as 2040 also marks, by many estimates, the beginning of the singularity – something I will be writing about in more depth soon.
In medias res,
To subscribe to the DSC Newsletter, go to Data Science Central and become a member today. It’s free!
Data Science Central Editorial Calendar
DSC is looking for editorial content specifically in these areas for July, with these topics having higher priority than other incoming articles.
This email, and all related content, is published by Data Science Central, a division of TechTarget, Inc.
275 Grove Street, Newton, Massachusetts, 02466 US
Copyright 2021 TechTarget, Inc. All rights reserved. Designated trademarks, brands, logos and service marks are the property of their respective owners.