Will Big Data Make Data Scientists Redundant?

A recent report predicts that 100 million jobs could be at risk of being made redundant by automation. These are expected to include roles from supermarket cashier to accountant, as robotics and machine learning become increasingly able to perform routine tasks more efficiently than we can.

But surely if there is one job title which will be safe from the inevitable rise of the machines, it will be data scientist or big data analyst – after all, haven’t I been championing that career path as one with a bright future for some time now?

And computers will always need a human to tell them what to do at some point, right? Sure, a computer can learn, but won’t it always need a human to teach it how – and tell it what – to learn?

Well, recent developments do indeed seem to indicate that computers may be getting better at some of the analytical functions for which human input has, until recently, been necessary.

The aim of the Automatic Statistician project, backed by Google, is to create “an artificial intelligence for data science”. Specifically, the team is creating software algorithms that can spot patterns in data and report them in simple, easy-to-understand text – for example, “The data shows that Saturdays were consistently warmer than Sundays throughout the year, and this correlates with higher turnout at outdoor events”.
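As a toy illustration of that idea – and emphatically not the actual Automatic Statistician’s method or API – here is a minimal, standard-library-only Python sketch that detects one simple kind of pattern (one group averaging higher than another) and reports it in plain English. The function name and the temperature data are hypothetical.

```python
import statistics

def describe_difference(name_a, values_a, name_b, values_b):
    """Spot a simple pattern (one group averages higher than the other)
    and describe it in readable English, in the spirit of automated
    natural-language reporting."""
    mean_a = statistics.mean(values_a)
    mean_b = statistics.mean(values_b)
    if mean_a >= mean_b:
        higher, lower, gap = name_a, name_b, mean_a - mean_b
    else:
        higher, lower, gap = name_b, name_a, mean_b - mean_a
    return (f"The data shows that {higher}s were on average "
            f"{gap:.1f} degrees warmer than {lower}s.")

# Toy weekend temperatures (degrees C) over four weekends
saturdays = [21.0, 22.5, 20.0, 23.0]
sundays = [19.0, 20.5, 18.5, 21.0]
print(describe_difference("Saturday", saturdays, "Sunday", sundays))
```

Run on the toy data above, this prints a sentence of the same shape as the example in the text: “The data shows that Saturdays were on average 1.9 degrees warmer than Sundays.”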

As well as humans, machines can also interpret these results – and use them as the basis for further analysis, by automatically selecting appropriate models and predictions to test them against.
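A minimal sketch of what “automatically selecting appropriate models” can mean, assuming nothing about the real system’s internals (the candidate models and the scoring function here are illustrative): fit several candidate models to the data and keep whichever one predicts it with the lowest error.

```python
def fit_constant(xs, ys):
    """Candidate model 1: predict the mean of y regardless of x."""
    c = sum(ys) / len(ys)
    return lambda x: c

def fit_linear(xs, ys):
    """Candidate model 2: least-squares straight line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return lambda x: slope * x + intercept

def mean_squared_error(model, xs, ys):
    return sum((model(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def select_model(xs, ys):
    """Automatically pick whichever candidate model fits the data best."""
    candidates = {"constant": fit_constant(xs, ys),
                  "linear": fit_linear(xs, ys)}
    return min(candidates,
               key=lambda name: mean_squared_error(candidates[name], xs, ys))

xs = [1, 2, 3, 4, 5]
ys = [2.1, 4.0, 6.2, 7.9, 10.1]  # roughly y = 2x
print(select_model(xs, ys))  # prints: linear
```

Real systems compare far richer model families (and penalise complexity rather than just minimising error), but the loop is the same: propose, score, select – with no human in the middle.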

The program was developed by a team of scientists at Cambridge University, collaborating with others at MIT, who earlier this month were awarded $750,000 from Google’s Focused Research Awards program to further their research.

I have often said that humans are still better than machines at recognising patterns – well this could mark the point where that starts to change.

“Statisticians, you’re next”, predicted one person on the somewhat pessimistic Google+ discussion group Technological Unemployment.

After all, at the start of the industrial revolution, weavers rioted when their jobs were threatened by newly-developed power looms, which allowed one worker to do the job of many with the help of machinery.

So, am I now ready to stop recommending data analysis as a career path and suggest everyone starts training as something else instead? Well, of course not! No career is future-proof, and data analysis (by humans) will be a fast-growing field for a long time yet.

What this is likely to do (as is often the case with automation) is free up analysts from a lot of the more routine side of the work. With machines potentially writing their own tests and selecting their own relevant data, a lot more data analysis is going to be happening.

Operational data scientists will spend far less time getting their hands dirty with low-level modelling and simulation, and more time overseeing the output of hundreds of experiments being carried out automatically by AI analysts.

And strategic data scientists, who are used to considering the “big picture”, will find that the picture has grown considerably, if they are able to count on the support of reliable automatic analytics algorithms and all the additional data those could yield. The horizon of available top-level strategies will widen considerably.

It is very early days yet for this kind of technology, but as with the whole field of machine learning, it is moving very quickly – spurred by big bucks investments such as Google’s, from companies which understand the vast benefits that being first to use methods like this could bring them.

At the moment, the Automatic Statistician works with data selected and fed to it manually by a human. There’s no reason, though, that one day it won’t be fully online and have all of the data of the internet at its disposal. Teaching it the best places to look for statistics to help with analysis on a particular subject will be an important step at some point along the line.

In reality, it will be a long, long time (if ever) before a computer can completely fill the world’s need for statisticians and analysts. It probably won’t happen unless we reach the point that science fiction from The Matrix to Terminator has warned us of, and machines actually take over the world!

The point of analytics will always be to make things more efficient, easier or effective – for humans. This means that we will always be the ones who select the ultimate questions that we want answered, and make a decision about which of the answers we receive is “right”.

In my opinion this development opens up far more doors to opportunity than it closes for wily and adaptable analysts.

Remember the weavers I mentioned earlier? Well they weren’t all destitute following the invention of the power loom. Many of them saw that as the industry exploded in size following industrialization, there were still plenty of jobs. They were just different jobs than before. Many went on to become hugely successful by operating their own mechanical enterprises, backed up with the knowledge of the traditional trade they had practiced their whole lives. Try to follow the example of these adaptable individuals, rather than those who saw only the slippery slope into redundancy.

I hope you found this post useful. I am always keen to hear your views on the topic and invite you to comment with any thoughts you might have.

About: Bernard Marr is a globally recognized expert in strategic metrics and data. He helps companies manage, measure, analyze and improve performance.

His new book is: Big Data: Using Smart Big Data, Analytics and Metrics To Make Bette...





Comment by Matthew Napleton on February 16, 2015 at 1:08am

Why have we created these data science roles anyway? Surely we should be making technology easier to use and data easier to get hold of. That’s the only way to drive innovation. The rise of Database Administrators (like the Terminator reference) and the power they wield over major businesses has often been a thorn in the side of data innovation – so why are we creating another paradigm in complexity? Let’s make it easier for everyone – it is Big Data after all.

Matthew Napleton, Marketing Director, Zizo


Comment by Sione Palu on February 4, 2015 at 10:30am

It won't happen anytime soon. I have seen academic research from the late 1990s in which computer programs were developed to do programming on their own. I talked with some work colleagues at the time, when we became aware of such research, about how, if such technology matured, we would all be out of a job. Such systems haven't arrived on the market yet, and perhaps won't anytime soon – maybe in the next few decades – so computer programmers will be around for a long, long time before they're made redundant by this kind of technology.

I think that data scientists doing R&D (academically) will still be around. They are the ones who will drive improvements to such automated analytics systems, because there are always new, more sophisticated algorithms to be invented.

Comment by Peter Mancini on February 4, 2015 at 8:13am

I welcome any tools like this. Our future is to embrace man-and-machine cooperation. The freestyle chess movement has shown quite clearly that human-only teams and machine-only teams are regularly beaten by combined man-and-machine teams. We still have miners and mining, and it's a lot safer with machines than it was when John Henry decided to fight the steam drill. No one ever talks about the ordinary man driving the drill. The most ethical value of technology is the enhancement of humanity. I see a tool like this as greatly enhancing the amount of work we can get done. The machine needs us because we are creative and can form scientific thoughts.
