The BigObject® - A Computing Engine Designed for Big Data
BigObject® introduces an in-place* computing approach, designed to tame the complexity of big data and deliver computation in real time. The mission of BigObject® is to provide affordable computing power, enabling enterprises of all scales to interpret big data. With the advances in what a commodity machine can perform, what-if analysis over big data becomes practical, facilitating fact-based decision-making.
A 1,000x acceleration is not only about time efficiency; it’s about POSSIBILITY
One of the key success factors in the era of big data is velocity – it determines how people deal with data and how comprehensive their interpretation can be. We are exposed to a variety of data sources and struggle to gain insight by connecting billions of data records. That computing effort is in vain if it takes longer than people can wait: if we cannot make decisions at the most opportune time, the data itself may become obsolete.
“A single experiment now takes us 72 hours every time we change a parameter. We could achieve far more if we cut experiment times, say a hundredfold – then we could run 10 experiments per day! You can imagine how progress in genomics could leap if we accelerated the proof of concept in DNA sequencing.”
A change in velocity can change the way we behave. With a more feasible approach, decision makers can verify more what-if scenarios, leading to better predictions.
“Every Friday, when we all sit around the table to make sales forecasts, promotional strategies, and logistics plans, it is a headache. We have over 200,000 SKUs and nearly 8,000 stores in total. A slight change in the sales volume of a single product in a single shop can make the conclusion totally different. We, the sales team, need to recalculate the overall amount in each region; the product team needs to work out which promotion kits should go out; and the supply chain team needs to re-route replenishment. Just a few sets of predictions take HOURS, driving everyone crazy. This would be far more pleasant if every adjustment took only as long as a sip of coffee.”
This is exactly the pain we aim to relieve. Every data entry and computation becomes more timely, even real-time. People can perform more trial-and-error simulations, aiding their analysis and resulting in more accurate predictions and better profitability. You can then rely on real-time analytic models to support your decisions rather than “the golden gut.”*
In-memory vs. In-place: Driving a Porsche in an alley vs. Driving a Tesla on the freeway!
There are numerous in-memory data discovery tools out there known for rapid calculations. In general, they exploit memory speed by holding the whole dataset in RAM to eliminate slow disk access. As data volume grows, however, users must either scale up with more memory to complete their computations or hit a bottleneck where performance degrades drastically – and concurrent access by multiple users makes this worse.
The principle of in-place technology, on the other hand, is to utilize the 64-bit address space – perceived as virtually infinite – to trade space for time. The major difference is that BigObject® sends the code to the data, avoiding the latency of moving data around, whereas existing in-memory technologies move the data to memory. Furthermore, data are loaded into memory on demand, without swapping or juggling; processing time grows linearly with data size, while traditional in-memory performance may decline exponentially. With an in-place approach, there is no need to invest in additional hardware such as extra memory; a standard PC can become a powerful big data analytic machine as long as it is equipped with a 64-bit processor. For the record, a laptop with 8 GB of memory can compute over hundreds of millions of pieces of data within 5 seconds.
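The contrast above can be illustrated with a minimal sketch (not BigObject®'s actual implementation, and all names here are hypothetical): one common way to "send the code to the data" on a 64-bit machine is to memory-map a file into the address space, so the operating system pages in only the bytes actually touched, instead of reading the whole dataset into RAM up front.

```python
import mmap
import os
import tempfile

RECORD_SIZE = 8  # each record is one 64-bit little-endian integer

def write_sample(path, n_records):
    # Build a small sample file of fixed-size records: record i holds value i.
    with open(path, "wb") as f:
        for i in range(n_records):
            f.write(i.to_bytes(RECORD_SIZE, "little"))

def read_record_in_memory(path, i):
    # "In-memory" style: the entire file is read into RAM before any lookup.
    with open(path, "rb") as f:
        data = f.read()
    return int.from_bytes(data[i * RECORD_SIZE:(i + 1) * RECORD_SIZE], "little")

def read_record_in_place(path, i):
    # "In-place" style: map the file into the 64-bit address space and
    # index into it directly; only the touched pages are faulted in.
    with open(path, "rb") as f:
        with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
            return int.from_bytes(mm[i * RECORD_SIZE:(i + 1) * RECORD_SIZE], "little")

path = os.path.join(tempfile.mkdtemp(), "records.bin")
write_sample(path, 1000)
assert read_record_in_memory(path, 42) == read_record_in_place(path, 42) == 42
```

Both functions return the same answer; the difference is that the memory-mapped version's cost depends on the pages it touches, not on the total file size, which is one way processing time can stay proportional to the work done rather than to the dataset held in RAM.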
Unlike other analytic tools, BigObject® does not rely on indexing to accelerate data discovery; ad-hoc flexibility is preserved, and instant computation is available for both reads and writes – something unseen in other analytic tools.
Looking at the history of civilization, technology has always evolved from the conditions of its moment. The assumption underlying the traditional, retrieval-heavy computing model – that resources are limited – faded away with the introduction of 64-bit architectures. The enlarged addressable space is changing the discipline of how we handle data and code, and in-place computing is the disruptive technology redefining this game. Nowadays we are overwhelmed by data; the challenge is to surface hidden clues quickly enough for timely decision-making. When speed leaps so far that we can find answers immediately, it changes our mindset. We see great potential in exploring new territories of application in business as well as science, and the adoption of in-place computing models may trigger another industrial revolution.
The BigObject® package is free to download now. Please visit www.thebigobject.com for more information.
*In-place Computing: An unconventional computing model introduced by BigObject®, in which computations take place where the data are stored.
*The Golden Gut: A term used by Thomas Davenport to describe decision-making based on instinct and bold guessing rather than data.