Statistics is Dead – Long Live Data Science…

I keep hearing Data Scientists say that ‘Statistics is Dead’, and they even hold big debates about it attended by the great and the good of Data Science. Interestingly, there seem to be very few actual statisticians at these debates.

So why do Data Scientists think that stats is dead? Where does the notion that there is no longer any need for statistical analysis come from? And are they right?

Is statistics dead or is it just pining for the fjords?

I guess we should really start at the beginning by asking the question ‘What Is Statistics?’.

Briefly, what makes statistics a unique and distinct branch of mathematics is that it is the study of the uncertainty in data.
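To make that concrete, here’s a minimal sketch (with made-up numbers, not anyone’s real data) of what ‘quantifying the uncertainty’ means in practice: a 95% confidence interval around a sample mean, computed with nothing but Python’s standard library.

```python
from math import sqrt
from statistics import mean, stdev

# Hypothetical measurements -- the point is the interval, not the data
sample = [4.8, 5.1, 5.0, 4.7, 5.3, 4.9, 5.2, 5.0, 4.6, 5.4]

m = mean(sample)
s = stdev(sample)                           # sample standard deviation
half_width = 1.96 * s / sqrt(len(sample))   # normal-approximation 95% interval

print(f"mean = {m:.2f}, 95% CI = ({m - half_width:.2f}, {m + half_width:.2f})")
# -> mean = 5.00, 95% CI = (4.84, 5.16)
```

The interval, not the point estimate, is the statistical content: it tells you how far the ‘answer’ could plausibly be from the one you computed.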


So let’s look at this logically. If Data Scientists are correct (well, at least some of them) and statistics is dead, then either (1) we don’t need to quantify the uncertainty or (2) we have better tools than statistics to measure it.


Quantifying the Uncertainty in Data

Why would we no longer have any need to measure and control the uncertainty in our data?

Have we discovered some amazing new way of observing, collecting, collating and analysing our data such that we no longer have any uncertainty?

I don’t believe so and, as far as I can tell, with the explosion of data that we’re experiencing – the amount of data that currently exists doubles every 18 months – the level of uncertainty in data is on the increase.


So we must have better tools than statistics to quantify the uncertainty, then?

Well, no. It may be true that most statistical measures were developed decades ago when ‘Big Data’ just didn’t exist, and that the ‘old’ statistical tests often creak at the hinges when faced with enormous volumes of data, but there simply isn’t a better way of measuring uncertainty than with statistics – at least not yet, anyway.


So why is it that many Data Scientists are insistent that there is no place for statistics in the 21st Century?


Well, I guess if it’s not statistics that’s the problem, there must be something wrong with Data Science.


So let’s have a heated debate...


What is Data Science?

Nobody seems to be able to come up with a firm definition of what Data Science is.

Some believe that Data Science is just a sexed-up term for statistics, whilst others suggest that it is an alternative name for ‘Business Intelligence’. Some claim that Data Science is all about creating data products to analyse the incredible amounts of data we’re faced with.


I don’t disagree with any of these, but I suggest that each of these definitions captures a small part of a much bigger beast.

To get a better understanding of Data Science it might be easier to look at what Data Scientists do rather than what they are.


Data Science is all about extracting knowledge from data (I think just about everyone agrees with this very vague description), and it incorporates many diverse skills, such as mathematics, statistics, artificial intelligence, computer programming, visualisation, image analysis, and much more.

It is the last bit, the ‘much more’, that I think defines a Data Scientist more than the others. In my view, if you want to be an expert Data Scientist in Business, Medicine or Engineering, then the biggest skill you’ll need will be in Business, Medicine or Engineering. Ally that with a combination of some or all of the other skills and you’ll be well on your way to being in great demand by the top dogs in your field.


In other words, if you want to call yourself a Data Scientist you really do need to be an expert in your field as well as having some of the other listed skills.


Are Computer Programmers Data Scientists?

On the other hand – as seems to be happening in Universities here in the UK and over the pond in the good old US of A – there are Data Science courses full of computer programmers who are learning how to handle data, use Hadoop and R, program in Python and plug their data into Artificial Neural Networks.

It seems that we’re creating a generation of Computer Programmers who, with the addition of a few extra tools on their CV, claim to be expert Data Scientists.


I think we’re in dangerous territory here.


It’s easy to learn how to use a few tools, but much, much harder to use those tools intelligently to extract valuable, actionable information in a specialised field.

If you have little or no medical knowledge, how do you know which data outcomes are valuable?

If you’re not an expert in business, then how do you know which insights should be acted upon to make sound business decisions, and which should be ignored?


Plug-And-Play Data Analysis

This, to me, is the crux of the problem. Many of the current crop of Data Scientists – talented computer programmers though they may be – see Data Science as an exercise in plug-and-play.

Plug your dataset into tool A and you get some descriptions of your data. Plug it into tool B and you get a visualisation.

Want predictions? Great – just use tool C.


Statistics, though, seems to be lagging behind in the Data Science revolution. There aren’t nearly as many automated statistical tools as there are visualisation tools or predictive tools, so the Data Scientists have to actually do the statistics themselves.

And statistics is hard.

So they ask if it’s really, really necessary.

I mean, we’ve already got the answer, so why do we need to waste our time with stats?



So statistics gets relegated to such an extent that Data Scientists declare it dead.


Talk about the lunatics running the asylum…

What do you think?

Is statistics dead? Is there no place for statistics in data science or is it essential?

Join the debate below and let me know your thoughts...

About the Author

Lee Baker is an award-winning software creator with a passion for turning data into a story.

A proud Yorkshireman, he now lives by the sparkling shores of the East Coast of Scotland. Physicist, statistician and programmer, child of the flower-power psychedelic ‘60s, it’s amazing he turned out so normal!

Turning his back on a promising academic career to do something more satisfying, as the CEO and co-founder of Chi-Squared Innovations he now works double the hours for half the pay and 10 times the stress - but 100 times the fun!

PS - Don't forget to connect with me on Twitter: @eelrekab

This post was previously published in Innovation Enterprise and on LinkedIn Pulse.




Comment by rusul issam mahdi on June 27, 2016 at 11:58am

Actually, both statistics and programming work together to solve big data problems.

Comment by Lee Baker on June 27, 2016 at 5:24am


Thanks for the input.

For several years now I have concentrated more on programming stats rather than doing stats. The most obvious thing I realised is just how blunt a tool stats tests really are. I'm not saying that they're not useful, quite the contrary, but they're more like a machete than a laser scalpel.

You're correct - researchers get really excited when their p-value is 0.049 but terribly disappointed when p = 0.051. The difference is probably just a single sample.

Even RA Fisher himself (head bowed in reverence) often accepted p-values of 0.08 or 0.09 and rejected p-values of 0.04 or 0.03...
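To illustrate how little separates those two p-values (a hypothetical coin-flip example, not real research data): an exact two-sided binomial test against p = 0.5 is ‘not significant’ at 60 successes out of 100, but ‘significant’ at 61 – one extra observation.

```python
from math import comb

def binom_two_sided_p(k, n):
    """Exact two-sided p-value for k successes in n trials under H0: p = 0.5.
    Binomial(n, 0.5) is symmetric, so for k > n/2 this is 2 * P(X >= k)."""
    upper_tail = sum(comb(n, i) for i in range(k, n + 1)) / 2**n
    return 2 * upper_tail

p60 = binom_two_sided_p(60, 100)  # ~0.057: "not significant"
p61 = binom_two_sided_p(61, 100)  # ~0.035: "significant"
print(p60, p61)
```

One observation moves the result from one side of the sacred 0.05 line to the other – which is exactly why treating that line as a cliff edge is a mistake.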

Comment by Andrew A. Kramer on June 27, 2016 at 4:31am

Here's a non-unique experience of mine. A data scientist I know makes a lot of fuss when the Hosmer-Lemeshow chi-square is statistically significant for her model. But she's using 100,000+ patients to create her model. She is also excited when her AUROC is "improved" from .810 to .815, because the p-value is < 0.01.

Without a good grounding in probability and statistics, a data scientist is shooting themselves in the foot.
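A sketch of that large-n effect with invented numbers (not the commenter's actual data): a two-proportion z-test on a trivially small difference – 50.0% vs 51.0% – comes out wildly ‘significant’ at 100,000 subjects per group, yet nowhere near significant at 500.

```python
from math import sqrt, erf

def two_prop_p(p1, p2, n):
    """Two-sided p-value for a two-proportion z-test with n subjects per group."""
    pooled = (p1 + p2) / 2
    se = sqrt(2 * pooled * (1 - pooled) / n)
    z = abs(p1 - p2) / se
    return 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))  # 2 * (1 - Phi(z))

p_big = two_prop_p(0.500, 0.510, 100_000)  # ~8e-6: "highly significant"
p_small = two_prop_p(0.500, 0.510, 500)    # ~0.75: not significant
print(p_big, p_small)
```

The difference didn’t get any more real; the sample just got big enough to detect it. Whether a 1% improvement (or a .005 gain in AUROC) actually matters is a judgement call that no p-value makes for you.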

Comment by Lee Baker on June 27, 2016 at 12:54am


It is said that to be an expert in anything you need around 10,000 hours of experience. Since data science is a multi-discipline, I suggest that to be an expert data scientist one would need around 10,000 hours in each of programming, data handling, statistics, AI, data viz, etc., which is why there are so few expert data scientists right now - there are few people who have put in those hours in more than 2 or 3 different sub-disciplines.

I'm writing a blog post about this issue right now, so keep an eye out for my next post - I hope you'll like it...

Comment by Lee Baker on June 27, 2016 at 12:49am


Thank you for your comments.

I agree that statistics is even more important in the Big Data Age, particularly since databases are being constructed without underlying hypotheses, thereby increasing uncertainty in the data and the results (more confounding and lurking variables, etc.).

In the history of mankind there's never been a more crucial time for statistics than right now!

Comment by Florens de Wit on June 27, 2016 at 12:30am

So that would make me a(n amateur) physics data scientist with a special interest in fire safety and forensics...

I think you make a good case for statistics not being dead; I have little to add there. Data science does run a risk of becoming nothing but a marketing buzzword if it keeps being used for "programming with a particular set of tools" rather than "using data and tools to help guide the way in a particular area of expertise".

Comment by Sione Palu on June 25, 2016 at 5:09pm

There are tons of research journals dedicated to statistics, but I just checked out SIAM's website to see if they themselves had stopped publishing papers in statistics. Voila!!! They still publish "Theory of Probability & Its Applications".


Comment by Sione Palu on June 25, 2016 at 3:48pm

Statistics is alive and has grown bigger than ever in the age of big data. Those who say statistics is dead and long live data science are simply ignorant.
