Big Data Isn’t Analytics

By John Sumser

In the 20th Century, progress was driven by the principles of automation. The more you could streamline and control a process, the cheaper you could make it. Repeatability was the key.

Monitoring, in the form of process-control charts, spreadsheet-generated graphics, and staff-meeting gotcha reports, was how an enlightened executive ran his business. Progress was a matter of making the line move up and to the right; cost cutting meant moving it down and to the right. Statistical process control (SPC) minimized variation.

Here’s a reasonable definition of SPC.

Statistical process control (SPC) is the application of statistical methods to the monitoring and control of a process to ensure that it operates at its full potential to produce conforming product. Under SPC, a process behaves predictably to produce as much conforming product as possible with the least possible waste. While SPC has been applied most frequently to controlling manufacturing lines, it applies equally well to any process with a measurable output.
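The definition above can be made concrete with a minimal sketch of an individuals (X) chart, the simplest SPC control chart. The measurements and helper names here are illustrative assumptions, not taken from the article; the sigma estimate uses the standard moving-range method for individuals charts (average moving range divided by the d2 constant 1.128).

```python
import statistics

D2 = 1.128  # bias-correction constant for moving ranges of size 2

def control_limits(samples, sigma_mult=3):
    """Return (LCL, center line, UCL) for an individuals chart.

    Sigma is estimated from the average moving range between
    consecutive measurements, divided by the d2 constant.
    """
    center = statistics.fmean(samples)
    moving_ranges = [abs(b - a) for a, b in zip(samples, samples[1:])]
    sigma_hat = statistics.fmean(moving_ranges) / D2
    spread = sigma_mult * sigma_hat
    return center - spread, center, center + spread

def out_of_control(samples, sigma_mult=3):
    """Indices of measurements falling outside the control limits."""
    lcl, _, ucl = control_limits(samples, sigma_mult)
    return [i for i, x in enumerate(samples) if x < lcl or x > ucl]

# Hypothetical data: a stable process with one obvious excursion.
measurements = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 14.5, 10.0, 9.9, 10.1]
print(out_of_control(measurements))  # the spike at index 6 is flagged
```

A point outside the 3-sigma limits is the classic signal that the process has drifted from predictable behavior and warrants investigation, which is exactly the "monitoring and control" role the definition describes.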


© 2021 TechTarget, Inc.