The Case Against a Quick Win Approach to Predictive Analytics Projects

By Greta Roberts

When beginning a new predictive analytics project, the client often mentions the importance of a “quick win.” It makes sense to think about delivering fast results, in a limited area, that excite important stakeholders and gain support and funding for more predictive projects. A great goal.

It’s the implementation of the quick win in a predictive project that can be difficult. There are at least two challenges with using a traditional quick win approach to predictive analytics projects.

Challenge #1: Predicting Something That Doesn’t Get Stakeholders Excited

Almost daily I hear of another predictive project that was limited in scope so people could dip their toe in the predictive water and get a “quick win.” The problem was that the results predicted something stakeholders didn’t care about or couldn’t take action on.

Examples include the following:

  • Predicting which colleges and universities yield the highest performers.

The problem with this quick win is that the results of this prediction can lead to uncomfortable questions: Are these also the most expensive schools? Does only a certain economic class of person attend them?

Using these predictions opens up discussions of economic discrimination, making HR and executives nervous. They often decide to ignore their newfound ability to predict performance; the prediction goes unimplemented, and the project doesn’t advance the case for more predictive projects.

  • Predicting a “Middle Measure” Like Engagement

The problem with this quick win? While HR thought the project was a winner, the results generated no excitement among business stakeholders and didn’t advance the goal of gaining additional support and resources for more predictive projects.

Executives have seen little or no correlation between engagement and actual business results at their own firms. Imagine trying to sell the VP of Sales on predicting the engagement of their sales reps. At the end of the day, their employees aren’t hired to be engaged; they are hired to do their jobs and sell.

Challenge #2: Quick Wins Shouldn’t Mean Tiny Data

In non-analytics projects you’re able to run a pilot with a small number of people and a small amount of data. You can focus on a small piece, a sample, something light, less expensive, less risky and less time consuming before you fully commit.

An example would be piloting a piece of software. You could install it for a small number of people and gather their feedback before making a broader commitment. Pilots work great for small sample sizes and for testing things with just a few people.

When you think about predictive analytics projects, though, even if you want a “quick win,” you still need to find a project with enough data to conduct a great predictive experiment. To be predictive, your models need to find patterns, and patterns require enough data. It doesn’t make sense to do a predictive analytics pilot on a tiny bucket of data.
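The point about tiny data can be illustrated with a quick simulation (a hypothetical sketch; the attrition rate and sample sizes below are made-up numbers, not from the article). Suppose a role truly has a 30% annual attrition rate. Estimates drawn from a tiny pilot sample swing wildly, while a larger sample settles near the real pattern, which is exactly what a model needs in order to learn anything.

```python
import random
import statistics

random.seed(42)
TRUE_ATTRITION = 0.30  # assumed "real" pattern a model would need to find

def estimated_rate(sample_size):
    """Estimate attrition from one random sample of employees."""
    leavers = sum(random.random() < TRUE_ATTRITION for _ in range(sample_size))
    return leavers / sample_size

# Repeat the experiment 200 times at two sample sizes.
tiny_estimates = [estimated_rate(10) for _ in range(200)]     # "tiny bucket of data"
large_estimates = [estimated_rate(1000) for _ in range(200)]  # enough data

print(f"n=10    spread of estimates: {statistics.stdev(tiny_estimates):.3f}")
print(f"n=1000  spread of estimates: {statistics.stdev(large_estimates):.3f}")
```

The spread of the tiny-sample estimates is roughly an order of magnitude larger than that of the large samples; any model trained on the tiny bucket is largely fitting that noise rather than a pattern.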

Rather than reducing the amount of data, we’d rather see you reduce the scope of the prediction.

An example: Instead of doing a predictive analytics pilot project to predict flight risk for all jobs in the Chicago office, maybe it would yield better results to keep the scope small and targeted by predicting flight risk for a single role that has a lot of people in it.

Ask your data scientist for guidance on how to frame your “quick win” project so that the scope stays small while the data scientist still has a reasonable amount of data, to optimize your chance of success.

For your predictive projects, “quick” isn’t enough of a win. 

Instead, you want a quick, implementable and exciting win that people care about.

The only way to get a quick, exciting win is to start with a project that predicts something that either saves or makes money for your company. Find a project that solves an existing business problem. Remember what predicting does for your organization: accurate predictions support better decisions, and better decisions produce better end results. End results are the only thing that will get people excited and get your model implemented.
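To make “saves or makes money” concrete, here is a back-of-the-envelope calculation (every figure is a hypothetical assumption, not a benchmark): if replacing one employee in a high-headcount role is expensive, retaining even a modest share of predicted leavers yields a number you can put in front of a business stakeholder.

```python
# Hypothetical figures for a single high-headcount role -- swap in your own numbers.
reps = 400                 # headcount in the role
baseline_attrition = 0.20  # 80 leavers per year without intervention
replacement_cost = 50_000  # recruiting + ramp-up cost per departure

leavers = reps * baseline_attrition
baseline_cost = leavers * replacement_cost

# Suppose targeted retention, guided by the model, keeps 15% of predicted leavers.
retained = leavers * 0.15
savings = retained * replacement_cost

print(f"Annual turnover cost without a model: ${baseline_cost:,.0f}")
print(f"Savings if 15% of leavers are retained: ${savings:,.0f}")
```

Under these assumptions the turnover bill is $4,000,000 a year and the model-driven savings are $600,000, the kind of end result the article argues stakeholders actually respond to.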

Think of banks that try to “predict” whether or not to give you a mortgage. They want to do a better job of extending credit only to people that can pay their mortgages. They’re not doing this to predict who will be engaged as a customer.

All your predictive projects should be ones where you are saving or making money. Do a project where you can demonstrate that your model worked and saved money on an important measure. Often this is a line of business problem, not an HR problem.

Results are the only kind of win that will get business stakeholders excited and move your efforts forward.

Greta Roberts is the CEO & co-founder of Talent Analytics, Corp., Chair of Predictive Analytics World for Workforce and Faculty member of the International Institute for Analytics. Follow her on Twitter @gretaroberts.




Comment by Mitchell A. Sanders on January 29, 2016 at 6:37am

Good to see someone discussing the very real bastardized cliché of the "Quick-Win". I would argue that the real problem may be the meme itself.

Of course we want fast, eye-popping, money-making insights (another questionable meme) from analytics (with visuals) to drive company business. The challenge is the concept the business often has of what "quick" actually means versus the reality of what it really takes: scoping and capturing the original data, munging and moving it onto other platforms to start exploratory analysis (let alone finding any patterns to model). All of this takes more time than a "Quick-Win" allows.

Managers think in terms of weeks, maybe a month or two, not six months or a year, when talking about "Quick-Wins". Data scientists get pressure all the time and have to fight this hyped-up expectation of what data science is and can do. Execs read these amazing stories (how many times have you heard the Target teen-pregnancy prediction story?), then they hire a qualified data scientist and the first thing they want are some "Quick-Wins". Sometimes mostly so they can politicize a victory up the chain of command.

What's not considered is that a whole team of people is often needed just to GET and transform the data. Access rights, data rights and chaotic warehouse data need to be taken care of. Time and a platform of some sort need to be developed or brought in (how much can you really do with an R install on your laptop?). And very talented data engineers and at least one data scientist need to spend real time on exploratory analysis just to get started.

And then the bad news. Often overlooked is that there is no guarantee that we WILL find the signal in the noise. Maybe the data doesn't yield great money-making insights or valuable predictive variables. OMG, what then?! Execs and managers really aren't told about this possibility.

The hype of Big Data along with Data Scientist has peaked - and it's starting to come back to haunt us all. "Quick-Win" is the first meme that needs a "Quick-Kill" if we are to bring smart value that the business can actually use.

Comment by Richard Ordowich on January 28, 2016 at 9:23am

A predictive analytic has a life span. Changing workflow processes based on a predictive analytic is usually a significant investment in resources. Verifying the benefits and value of a predictive analytic a year or two after it is adopted is seldom done. A negative ROI may result.

In my experience, developing a predictive analytic model takes 1-2 years. That's not a "quick win". It's an investment.
