“Data is the new water” – and it is powering many organizations to turn on the tap for digital transformation. To enable data-driven processes, it is imperative to unlock the treasure trove of data hidden away in existing SAP applications. Extracting data from those applications (the E in ETL) is commonly the biggest pain point of data engineering, where it is tough to balance extraction performance against the load placed on a legacy system. At the same time, customers are demanding more: nightly batch processing and drip-feeding data can no longer keep up with the demand for real-time information streams.
Breuninger, the luxury high-end retailer, faced similar challenges and decided to use change data capture (CDC) technology to build data pipelines that deliver a constant flow of data in real time.
Based on their team’s experience, you will learn how to integrate data from mission-critical SAP applications into a modern cloud-based analytics warehouse with Google BigQuery.
In this latest Data Science Central webinar, you will hear how Breuninger moves an impressive amount of valuable SAP data to the cloud and learn:
How they determined the parameters for change data capture.
Why CDC is a game changer for improving customer experiences.
How they consolidated all SAP data into a single source of truth.
When Google BigQuery can deliver immediate cost savings.
Find out how your organization can keep its data flowing in the cloud and stay ahead of changing demands, while reducing over-reliance on SAP resources and saving costs.
Adam Mayer, Senior Director Product Marketing – Qlik
Matthias Krenzel, Head of Data Platform Services – Breuninger
Jessica Tischbierek, Customer Engineer Technology Practice SAP – Google
Sean Welch, Host and Producer – Data Science Central