
Towards Artificial General Intelligence in Enterprise – Data Science Driven by Statistics Requires New Qualitative Analytics to Model Disruptive Changes

Who this article is for:

This blog is intended for enterprise data analysts, line of business users, and data practitioners who work with qualitative and quantitative data in decision-making.

How enterprises use data science and business intelligence today

Quantitative analytics based on statistical models predicts outcomes using models built from historical datasets with machine-learning algorithms.

  • Enterprise managers use these predictions as a guide to drive business execution
  • Models are updated from time to time to reflect the latest business dynamics
  • Inputs to these machine-learning algorithms consist of voluminous rows of known independent (predictor) variables and target variables
  • For qualitative data, a domain-specific ontology, dictionary, and mapping are required
  • Models are generally tested for accuracy by randomly splitting the input dataset into two parts, one for training and the other for testing
  • After a model is created by the training algorithm, the test dataset is applied to verify accuracy
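The train/test workflow described above can be sketched in a few lines with scikit-learn (the library and synthetic dataset here are illustrative assumptions, not part of the original article):

```python
# Sketch of the standard train/test split workflow on synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for voluminous rows of historical data.
X, y = make_classification(n_samples=1000, n_features=10, random_state=0)

# Randomly split the input dataset into training and test partitions.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Train on one partition, then verify accuracy on the held-out partition
# before the model is considered for production deployment.
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print(accuracy_score(y_test, model.predict(X_test)))
```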

The problem

Even after a model has been tested and deployed to production, problems arise when the data model fails to reflect the real-world business model. The differences between a statistical model and the real world can stem from several factors. Businesses that rely on yesterday’s predictive models are likely to produce inaccurate predictions and business decisions that expose organizations to risk and loss in the event of a disruptive change. Models need timely updates to reflect new disruptive changes. Without knowing what those disruptive changes are, however, enterprise data science teams are left searching in the dark for solutions.
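One practical way to notice that a deployed model no longer reflects the real world is to compare its live accuracy against the accuracy it achieved at validation time. The following is a minimal illustrative sketch (not from the article); the class name, window size, and tolerance are assumptions:

```python
# Illustrative drift check: flag when a deployed model's rolling live
# accuracy falls below its validated baseline by more than a tolerance.
from collections import deque


class DriftMonitor:
    """Track rolling accuracy of live predictions against actual outcomes."""

    def __init__(self, baseline_accuracy, window=100, tolerance=0.10):
        self.baseline = baseline_accuracy
        self.tolerance = tolerance
        self.outcomes = deque(maxlen=window)  # 1 if correct, 0 if not

    def record(self, predicted, actual):
        self.outcomes.append(predicted == actual)

    def drifted(self):
        if len(self.outcomes) < self.outcomes.maxlen:
            return False  # not enough live observations yet
        live_accuracy = sum(self.outcomes) / len(self.outcomes)
        return live_accuracy < self.baseline - self.tolerance
```

A monitor like this only detects that something changed; identifying *which* disruptive factor caused the change is where the qualitative analysis discussed below comes in.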

The following is a partial list of disruptive factors that can impact business execution and strategy: 

  1. Competitors adopting new or disruptive business models – e.g. online retail
  2. New government regulations – e.g. change in tax law
  3. Change in consumer preferences and behaviors – e.g. social network, generic vs. brand
  4. Geopolitical events – e.g. interest rate changes from global central banks
  5. Disruptive climate events – e.g. an earthquake
  6. Disruptive changes in supply chain – e.g. a hurricane, U.N. sanctions

Statistical solutions use inductive reasoning to predict outcomes from voluminous historical data: logical conclusions are drawn from many observations of similar scenarios. Disruptive changes, by contrast, require deductive and abductive reasoning to find solutions for new facts when historical data on those facts does not exist. This is where quantitative analytics meets qualitative analytics.

The Solution

We propose a comprehensive solution that enables enterprises to use Artificial General Intelligence (AGI) to discover relevant new subjects and thereby augment existing quantitative analysis with new or previously unknown domain variables. Independently, we can create a deductive reasoning model for new business conditions. The role of AGI is to break through the limits of prevailing AI; to do so, it must be able to handle previously unknown facts and concepts. By abstracting all relevant facts from business intelligence, enterprise managers should be able to take every scenario into consideration.

 

Implementation of an AGI solution has been limited by machine-learning models that require large numbers of rows of training data and long processing times, which makes them too slow to deploy against newly discovered facts. The real-time nature of tactical and strategic execution requires new analytic methods for qualitative data. Meta Vision Analysis and Bionic Fusion Analysis enable enterprise organizations to transcend this limitation and dynamically discover relevant domain variables, drawing insights for real-time execution by “connecting the dots”. The following diagram depicts this workflow.

Figure 1

The green elements in Figure 1 depict the processes that participate in the qualitative analytics workflow. By qualitative analytics, I mean analytics capable of discovering both known and previously unknown symbols, terms, words, and acronyms. These analytics create abstractions from new facts and put context into perspective, so that business strategies can be deduced. For strategies that involve new concepts and business terms, we do not expect historical data to exist that could shed light on their outcomes. Through deductive reasoning, however, one can evaluate qualitative outcomes based on business intelligence about others who have deployed similar strategies, taking a calculated risk by injecting these ideas to improve existing business models.
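To make the idea of discovering previously unknown terms concrete, here is a deliberately simple sketch of one such step: surfacing acronym-like tokens that do not appear in an existing vocabulary. The vocabulary, sample sentence, and regex heuristic are all hypothetical illustrations, not the article’s actual method:

```python
# Toy sketch: flag acronym-like tokens absent from a known vocabulary,
# as a stand-in for discovering previously unknown terms in new text.
import re

KNOWN_TERMS = {"AI", "BI", "ROI", "KPI"}


def discover_new_terms(text, known=KNOWN_TERMS):
    """Return acronym-like tokens (2-6 capitals) not in the known vocabulary."""
    candidates = set(re.findall(r"\b[A-Z]{2,6}\b", text))
    return sorted(candidates - known)


sample = "Our BI stack misses AGI signals; the new ESG mandate changed KPI targets."
print(discover_new_terms(sample))  # → ['AGI', 'ESG']
```

A real system would go far beyond pattern matching, but the principle is the same: new symbols that fall outside the current model’s vocabulary are exactly the candidates for new domain variables.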

Conclusion

This solution is made possible by the availability of our CIF technology, a novel technology capable of learning from text documents on the fly to discover new ideas, subjects, names, and acronyms, and to draw relations between contexts regardless of the size and volume of the text. It is domain agnostic and requires no data dictionary, ontology, or prior machine learning on domain-specific themes. The results have been promising: we have documented the deployment of such a solution on the earnings calls of several public companies.

© 2017 Data Science Central