
DSC Weekly 26 July 2022: When Meetings Become Searchable

  • Kurt Cagle 

Announcements

  • As IT Service Management drives Digital Transformation, 2022 is seeing investments in the discipline continue to grow. Thanks to innovations in AI and ML, ITSM has helped businesses survive the pandemic and transform their organizations. As these innovations continue to shape the face of ITSM, it’s time to take another look at the effect of automation on service management, DevOps, monitoring and metrics, and more. Join the Driving Innovation & Automation in IT Service Management summit to hear from experts as they uncover advances across the ITSM spectrum and share how your business can benefit.
  • As cybersecurity risks evolve, it’s more important than ever for organizations to be aware of emerging threats and developments. In the four-day Combating Cyber Threats & Breach Prevention 2022 summit, leading security experts will share best-in-class strategies for keeping abreast of vulnerabilities, anticipating coming threats and baking security into a wide range of enterprise operations and applications. Register for free thought leadership from the world’s top speakers, vendors and evangelists in the form of live webinars, panel discussions, keynote presentations and webcam videos.

The world of meetings is about to become transparent to the future.

When Meetings Become Searchable

A century from now, historians will remark on a transformation that seemed subtle at the time but will prove to have had enormous ramifications. Specifically, 2020 will be seen as the year when meetings became transparent.

This may not seem like that big an event, but it’s worth putting into perspective – until 2020, most meetings, if they were captured at all, were transcribed by a stenographer. The stenographer did not, in general, capture every word or even every conversation. Most of the time, they summarized the highlights of what was said, and the record (the steno pad) was stored in a drawer, there to sit for years on end until someone finally threw everything out.

Several things have changed since Covid, but one of the biggest has been the degree to which online video meeting technology has completely rewritten the way we hold meetings. The moment you adopt video meeting platforms such as Microsoft Teams, Google Meet, Zoom, or even newer offerings like the one LinkedIn rolled out, the projector becomes irrelevant except in very large venues. Hybrid meetings are possible but awkward because you’re dealing with two competing communication channels. Online meetings are outcompeting the lectern model, which, in turn, is forcing (perhaps has already forced) the migration to primarily online meeting communication, even when all participants are in the same room.

The second aspect of this change has been the rise of auto-transcription – the ability of a computer not only to track what a person is saying but also to determine, in a room full of speakers, which person said what. Such transcriptions are not necessarily perfect, but they are hitting a bar of 99% accuracy that makes them understandable and actionable. They also keep track not only of what was said but of when it was said.

Just as significantly, the transcripts so produced are machine-readable. By itself, this means that transcripts can be stored and indexed just as any other document can. It also means that semantic analysis can be performed on the transcript to determine intent at any given point in the meeting, so that you can pose questions such as “when was this project approved?”, “who made the decision, and what were the reasons given?”, or “when did this idea first appear?”.
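
To make this concrete, here is a minimal sketch (in Python) of the kind of structure a machine-readable transcript might take: timestamped, speaker-attributed segments over which even a naive keyword search can answer a question like “when did this idea first appear?”. The field names, speakers, and sample utterances below are hypothetical, and a real system would use proper semantic analysis rather than substring matching.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Segment:
    speaker: str           # who diarization attributed this utterance to
    start_seconds: float   # offset from the start of the meeting
    text: str              # the transcribed utterance

def first_mention(transcript: List[Segment], phrase: str) -> Optional[Segment]:
    """Return the earliest segment mentioning a phrase -- a crude stand-in
    for the question 'when did this idea first appear?'."""
    matches = [s for s in transcript if phrase.lower() in s.text.lower()]
    return min(matches, key=lambda s: s.start_seconds) if matches else None

# Hypothetical transcript fragment
transcript = [
    Segment("Alice", 312.4, "Should we fold the graph work into Project Kestrel?"),
    Segment("Bob", 318.9, "I think Project Kestrel is approved as of today."),
]

hit = first_mention(transcript, "Project Kestrel")
if hit:
    print(f"{hit.speaker} at {hit.start_seconds:.0f}s: {hit.text}")
```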

This is especially important because audio and video streams are synchronous – you can scan forward and backward through such a stream, but searching video, even with this capability, is tedious and time-consuming. By indexing (and adding semantic tags either manually or through autoclassification algorithms), meeting video becomes asynchronous: every significant event can be accessed via a URL.
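
Continuing the sketch above, an indexed moment can be exposed as an ordinary URL by attaching its timestamp to the recording’s address. The `?t=` query parameter and the recording URL here are assumptions for illustration only; each meeting platform has its own deep-link format.

```python
from urllib.parse import urlencode

def moment_url(recording_url: str, start_seconds: float) -> str:
    """Build a shareable deep link to one moment in a meeting recording.
    The 't' query parameter is a common convention, but the exact format
    is platform-specific -- treat this as illustrative only."""
    return f"{recording_url}?{urlencode({'t': int(start_seconds)})}"

# e.g. link straight to the decision found by the transcript search above
print(moment_url("https://recordings.example.com/meeting/2022-07-26", 318.9))
# -> https://recordings.example.com/meeting/2022-07-26?t=318
```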

This last point is a powerful one. A meeting is an event. Events have historically been difficult to process because they are, by definition, synchronous. Yet with the ability to make meetings asynchronous, you can in essence time travel, pulling together transcripts from the same period to get a bigger picture of what was happening. Couple this with video image recognition (which is becoming increasingly sophisticated) applied to news feeds, security feeds, drone feeds, and so forth, and deep data emerges. This is my vision of where the metaverse is going – the ability to see the historical world unfolding as a whole fabric.

In Media Res,

Kurt Cagle
Community Editor
Data Science Central


Data Science Central Editorial Calendar: August 2022 

Every month, I’ll be updating this section with a list of topics that I’m especially looking for in the coming month. These are more likely to be featured in our spotlight area. If you are interested in tackling one or more of these topics, we have the budget for dedicated articles. Please contact Kurt Cagle for details.

  • Climate Change and Sustainability
  • An AI Snowstorm?
  • Personal Knowledge Graphs
  • Gato and GPT-3 
  • Labeled Property Graphs 
  • Future of Work 
  • Data Meshes 
  • ML Transformers 
  • Energy AI 
  • RTO vs WFH 

If you are interested in posting something else, that’s fine too, but these are areas that we believe are hot right now. 

