- Don’t put the cart before the horse: efficient data preparation (and thus interoperable standards) and high data quality, especially in the enterprise environment, are a basic requirement for all applications of artificial intelligence.
- The development of competences and experts in the field of artificial intelligence must take place at least in parallel with every technological decision, not only at the end of the implementation of an AI strategy. Outsourcing this competence must not be part of the strategy.
- ‘Don’t boil the ocean’: small, agile, consecutive pilot projects alone are not enough to develop an AI strategy. In parallel with the pilot phase, a more far-reaching strategy should be developed together with management to promote cross-departmental, process-independent and data-driven decision-making and activities.
- Projects based on knowledge graphs are more multidisciplinary than many may think. Teams must therefore be developed that cover expertise in database technologies, IT security, user experience, data visualization, knowledge modeling, data governance and compliance, etc. For this reason, specifying and managing expectations at the beginning of new initiatives is of utmost importance.
- Make the difference clear: graph technologies are not just a slightly better search technology. Knowledge graphs can be used to address a large number of serious data management problems. This should be the focus right from the start!
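To make that last point concrete, here is a minimal sketch of the core idea: a knowledge graph represented as subject-predicate-object triples that links records from two previously siloed datasets, so that a question spanning both silos becomes a simple traversal. All entity and predicate names here are hypothetical, chosen purely for illustration.

```python
# Hypothetical example: a knowledge graph as a set of triples
# that reconciles a "CRM" silo and a "Support" silo on one entity.
triples = {
    # CRM silo
    ("acme_corp", "hasAccountManager", "j_smith"),
    ("acme_corp", "locatedIn", "berlin"),
    # Support silo, linked to the same customer entity
    ("ticket_4711", "reportedBy", "acme_corp"),
    ("ticket_4711", "concerns", "product_x"),
}

def objects(subject, predicate):
    """All objects reachable from subject via predicate."""
    return {o for s, p, o in triples if s == subject and p == predicate}

# Cross-silo question: who manages the account of the customer
# behind ticket_4711?  (Impossible to answer in either silo alone.)
customer = next(iter(objects("ticket_4711", "reportedBy")))
managers = objects(customer, "hasAccountManager")
print(managers)  # {'j_smith'}
```

A real deployment would of course use a graph database and standard vocabularies rather than an in-memory set, but the data-management benefit is the same: once entities are reconciled into one graph, cross-silo questions no longer require bespoke integration code.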
The second wave of AI applications is rolling towards us. It is characterized by:
- (even) more heterogeneous and messy data (including text) involved in the process
- the involvement of people who were disillusioned by the first phase of AI because the results were insufficient, and who now rely on approaches that bring the expert back into the process
- a growing understanding of how semantic knowledge models (e.g. knowledge graphs) can make a decisive difference in realizing intelligent, multi-modal applications at scale
- the desire to use structured and unstructured data simultaneously and linked to each other
- the use of highly scalable graph database technologies, which can also be operated by non-technicians or subject matter experts through integration with semantic middleware components, visualization tools and editors
- the increasingly combined use of machine learning methods with graph technologies (e.g. graph embeddings)
- the increasing importance of ‘explainable AI’, which can be implemented, among other things, with the help of semantic knowledge models.