Summary: Purpose Built Analytic Modules (PBAMs), such as those for Fraud Detection, represent a fourth way to practice data science, a new model for making good use of Citizen Data Scientists, and a new market for AI-first companies.
It appears that data science has exited its age of exploration and entered into its…
Added by William Vorhies on September 18, 2018 at 9:07am — No Comments
Originally published on the Aster Community. We invite you to register for our upcoming webinar "Bridging the Gap Between Data Scientists and Analysts with Analytic Solutions" with Brandon Purcell of Forrester…
Added by Ryan Garrett on May 26, 2016 at 6:47am — No Comments
Summary: The shortage of data scientists is driving a growing number of developers to fully Automated Predictive Analytic platforms. Some of these offer true One-Click Data-In-Model-Out capability, playing to Citizen Data Scientists with limited or no data science expertise. Who are these players and what does it mean for the profession of data science?
…
Added by William Vorhies on April 13, 2016 at 8:30am — 8 Comments
Above is a distribution of price differentials for the Dow Jones Industrial Average from the 1930s. The image was generated by one of my programs called Storm. I posted a few images from the same application in other blogs. If I recall correctly, the more volatile differentials (closer to the action) are at top; the more stable differentials (further from the…
Added by Don Philip Faithful on May 24, 2014 at 6:51am — No Comments
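The Storm program behind the figure described above is not shown, but the underlying computation is simple. Below is a minimal sketch, on synthetic closing prices rather than actual Dow Jones data, of how a distribution of daily price differentials might be produced; the variable names and plotting choices are assumptions for illustration.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical daily closing prices; in practice these would be loaded
# from a historical Dow Jones Industrial Average series.
rng = np.random.default_rng(0)
close = 150 + np.cumsum(rng.normal(0, 1.5, size=2500))  # ~10 years of trading days

# Price differentials: day-over-day changes in the closing price.
differentials = np.diff(close)

# Their distribution, analogous to the kind of image described in the post.
plt.hist(differentials, bins=80)
plt.xlabel("Daily price differential")
plt.ylabel("Frequency")
plt.title("Distribution of daily price differentials (synthetic data)")
plt.show()
```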
Data science might be one of the hottest buzzwords of 2013. But is it only a marketing gimmick? I don’t think so. In my opinion, data science can be the best way to reveal what’s happening every day in the real world.
Data science incorporates mathematics, statistics, computer science and…
Added by Yuanjen Chen on February 6, 2014 at 6:30pm — No Comments
The BigObject® - A Computing Engine Designed for Big Data
BigObject® presents an in-place* computing approach, designed to tame the complexity of big data and compute in real time. The mission of BigObject® is to deliver affordable computing power, enabling enterprises of all scales to interpret big data. With the advances in what a commodity machine can perform, it…
Added by Yuanjen Chen on November 20, 2013 at 5:29pm — No Comments
We have been using tables in relational databases, mostly for transactional purposes, and that has proven effective. Given today’s data sizes and analytic workloads, however, the data structure may need to be redesigned for better efficiency.
To determine how to decompose the complexity of big data, we have observed the way organisms function. In the physical world, the universe is organized into a hierarchy of…
Added by Yuanjen Chen on November 3, 2013 at 10:29pm — No Comments
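The post above argues for hierarchical rather than purely relational layouts for analytic work. As a rough sketch of that contrast (not BigObject's actual design, which the excerpt does not describe), the same sales rows can be held flat or pre-grouped into a region/country/product hierarchy; the field names here are hypothetical.

```python
from collections import defaultdict

# Flat, relational-style rows: every query scans and re-groups them.
rows = [
    ("Europe", "Germany", "widgets", 120),
    ("Europe", "France",  "widgets",  80),
    ("Asia",   "Japan",   "gadgets", 200),
    ("Asia",   "Japan",   "widgets",  60),
]

# Hierarchical layout: region -> country -> product -> total,
# so a drill-down follows the structure instead of re-scanning rows.
tree = defaultdict(lambda: defaultdict(lambda: defaultdict(int)))
for region, country, product, amount in rows:
    tree[region][country][product] += amount

# Drill-down query: total widget sales in Europe.
europe_widgets = sum(
    products.get("widgets", 0) for products in tree["Europe"].values()
)
print(europe_widgets)  # 200
```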
In general, computer scientists treat code and data in two very different ways. Virtual memory was originally developed to run large programs (code) in limited memory, while data are kept in external storage and must be retrieved into memory before computing. As a result, today’s application developers instinctively think in a programming model based on storage and explicit data retrieval. This model, referred to as storage-based computing, plays an important role and has done a great job…
Added by Yuanjen Chen on October 31, 2013 at 7:24pm — No Comments
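The storage-based model described above can be illustrated with a small sketch: the data live in external storage and must be explicitly retrieved into in-memory structures before any computation runs. The file layout and field names here are assumptions for illustration, not part of the post.

```python
import csv
import tempfile

# Create a small CSV standing in for data kept in external storage.
with tempfile.NamedTemporaryFile("w", suffix=".csv", delete=False) as f:
    f.write("id,value\n1,10\n2,20\n3,30\n")
    path = f.name

# Storage-based computing: explicitly retrieve records from disk into
# in-memory structures before the actual computation can run.
records = []
with open(path, newline="") as f:
    for row in csv.DictReader(f):
        records.append(int(row["value"]))

print(sum(records))  # the computation happens only after the retrieval step
```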
In short, in-memory computing takes advantage of physical memory, which processes data much faster than disk. In-place computing, on the other hand, fully utilizes the address space of the 64-bit architecture. Both are gifts of modern computer science; both are at the heart of the BigObject.
In-place computing became possible only with the introduction of the 64-bit architecture, whose address space is big enough to hold the entire data set in most of the cases we deal with today.…
Added by Yuanjen Chen on October 29, 2013 at 1:00am — No Comments
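As a rough illustration of the in-place idea in the post above, the sketch below memory-maps a data file into the 64-bit address space and reads values directly from the mapping, with no separate load step. This shows the general technique only; the excerpt does not describe how BigObject actually implements it.

```python
import mmap
import struct
import tempfile

# Write a small binary array standing in for a large on-disk data set:
# 1000 native 64-bit signed integers.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(struct.pack("=1000q", *range(1000)))
    path = f.name

# In-place style access: map the file into the process's address space
# and address the values directly, rather than copying them into
# separate in-memory structures first.
with open(path, "rb") as f:
    mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    view = memoryview(mm).cast("q")  # treat the mapping as 64-bit ints
    print(view[42])                  # 42, read straight from the mapping
    print(sum(view))                 # 499500, no explicit retrieval loop
    view.release()
    mm.close()
```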
Hi all,
This is my first post here. I'm glad to introduce this newly launched big data analytic engine, the BigObject. For the past two years we have been working on an optimal approach to handling big data for analytic purposes, challenging existing models whose assumptions are in some cases no longer valid. For example, as data sizes grow so rapidly, is it still practical to stick to relational models while neglecting the time spent on data retrieval? What impact did…
Added by Yuanjen Chen on October 23, 2013 at 11:30pm — 2 Comments
The ‘bell curve’, or ‘Gaussian bell curve’, is one of the fundamental concepts on which most statistical analysis is based. From social sciences to astronomy to financial services, most applications of statistics in the real world rely on the assumption that the data being analysed are distributed in the shape of the bell…
Added by Gaurav Vohra on January 15, 2013 at 12:29am — No Comments
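Since the point of the post above is that much of applied statistics leans on the bell-curve assumption, a short sketch of checking that assumption on synthetic samples may help; it uses SciPy's Shapiro-Wilk test, which is one common choice and not something the post itself prescribes.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Two synthetic samples: one genuinely Gaussian, one heavily skewed.
gaussian_sample = rng.normal(loc=0.0, scale=1.0, size=500)
skewed_sample = rng.exponential(scale=1.0, size=500)

# Shapiro-Wilk tests the null hypothesis that the data are normal;
# a small p-value is evidence against the bell-curve assumption.
for name, sample in [("gaussian", gaussian_sample), ("skewed", skewed_sample)]:
    stat, p = stats.shapiro(sample)
    print(f"{name}: W={stat:.3f}, p={p:.3g}")
```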