Why is it so hard to change, especially from a business perspective? Why is it so hard to understand that new technology capabilities can enable new technology approaches? And nowhere is that more evident than in how companies are trying (and failing) to apply the old-school "Big Bang" approach to Big Data Analytics projects.
The “Big Bang” approach likely comes from how businesses have historically gained competitive advantage – through “economies of scale.” Economies of scale refers to the concept that costs per unit of a product decline as there is an increase in total output of said product (see Figure 1).
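As a rough sketch of the arithmetic (this formulation is mine, not the original article's): if a producer carries a fixed cost F and a constant variable cost c per unit, the average cost of producing q units is

AC(q) = \frac{F}{q} + c

which falls toward c as output q grows, so the largest producers spread their fixed costs across the most units and can undercut smaller rivals on price.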
Historically, large organizations have leveraged economies of scale to lock smaller players and new competitors out of markets (yes, we’re talking economics again…).
The economies of scale concept morphed into an economic concept called the Learning Curve, which was employed by companies such as Texas Instruments and DuPont in the 1970s. Like economies of scale, a learning curve exploits the relationship between cost and output over a defined period of time; that is, the more a person or an organization learns from the repetitive completion of a task, the more effective they become at executing that task.
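One standard way to formalize the learning curve (a hedged addition on my part, not a formula from the original post) is Wright's model: if the first unit costs C_1, then the n-th unit costs

C(n) = C_1 \, n^{-b}

where b is chosen so that each doubling of cumulative output cuts unit cost by a fixed fraction. For a classic 80% curve, b = \log_2(1/0.8) \approx 0.32, and the 100th unit costs roughly 23% of the first.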
The popular book "The Lean Startup" by Eric Ries expands on the Learning Curve concept by introducing incremental learning as a way to become more efficient more quickly. "The Lean Startup" shares a story about stuffing 100 envelopes. From the story, we learn that the optimal way to stuff newsletters into envelopes turns out to be one complete envelope at a time, versus folding all the newsletters first, then stuffing all of them into envelopes, then sealing all the envelopes, and finally stamping all of them. The "one at a time" or incremental learning approach is more effective because problems (a newsletter folded the wrong way, an envelope that won't seal) surface after the first envelope rather than after an entire batch of a single step, so the process can be corrected immediately.
From an economics perspective, we have to ask a very hard question:
Is “incremental learning” (and the immediate application of that incremental learning) more important than “economies of scale”?
“Big Bang” Big Data Analytics projects are poisonous (in my humble opinion) to the success of your Big Data and Digital Transformation journey. Organizations do not have to spend $20M+ on a 3-year Big Data Analytics project and then hope that they get what they thought they bought at the end of the project. And that’s not even considering how much the technology world can change during those 3 years (for example, Google’s open source TensorFlow machine learning framework didn’t even exist 3 years ago).
For a quick refresher on the dangers associated with the “Big Bang” approach, we only need to go back to the heyday of “Big Bang”…the days of ERP implementations.
History is littered with "Big Bang" ERP projects that either didn't deliver as promised or just plain failed.
For example, what did a $400 million upgrade to Nike's supply chain and ERP systems get the world-renowned shoe- and athletic gear-maker? $100 million in lost sales, a 20% stock dip and a collection of class-action lawsuits.
And now, into the world of Big Data and Analytics (or Artificial Intelligence, if you bit on that marketing worm), come the IT implementation strategies of yore.
“By all accounts, the [IBM Watson] electronic brain was never used to treat patients at M.D. Anderson. A University of Texas audit reported the product doesn’t work with Anderson’s new electronic medical records system, and the cancer center is now seeking bids to find a new contractor.”
The era when large organizations held an economies-of-scale advantage over smaller organizations is over. Nowadays, it's not the larger companies that are winning; it's the companies that are more nimble and responsive to rapidly changing customer expectations and market dynamics.
“Business in the century ahead will be driven by economies of un-scale, in which the traditional competitive advantages of size are turned on their head. These developments have eroded the powerful inverse relationship between fixed costs and output that defined economies of scale. Now, small, unscaled companies can pursue niche markets and successfully challenge large companies that are weighed down by decades of investment in scale — in mass production, distribution, and marketing.”
The “Economies of Un-scale” are being driven by the following economic factors:
There is no need for organizations to commit to huge technology investments on the hope that they get what they were promised at the end. The "Economies of Un-scale" will reward those organizations that are more focused and nimbler; that take an incremental implementation approach in order to 1) accelerate time to value (by monetizing incremental learning) while 2) de-risking or mitigating technology investment risk.
The world of technology implementations is changing, and we need to ask the very hard economics question:
Is “incremental learning” more important than “economies of scale”?