By John Sumser
In the 20th Century, progress was driven by the principles of automation. The more you could streamline and control a process, the cheaper you could make it. Repeatability was the key.
Monitoring, in the form of process control charts, spreadsheet-generated graphics, and staff-meeting gotcha reports, was how an enlightened executive ran the business. Progress was a matter of making the line move up and to the right. Cost cutting involved moving it down and to the right. Statistical process control (SPC) minimized variation.
Here’s a reasonable definition of SPC.
Statistical process control (SPC) is the application of statistical methods to the monitoring and control of a process to ensure that it operates at its full potential to produce conforming product. Under SPC, a process behaves predictably to produce as much conforming product as possible with the least possible waste. While SPC has been applied most frequently to controlling manufacturing lines, it applies equally well to any process with a measurable output.
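To make the definition concrete, here is a minimal sketch of the core SPC calculation: an individuals control chart that flags measurements falling outside 3-sigma control limits. The function names and the sample data are hypothetical, and sigma is estimated from the average moving range (the standard individuals-chart convention), not the raw standard deviation.

```python
def control_limits(samples):
    """Return (center line, lower control limit, upper control limit)
    for an individuals chart, estimating sigma from the moving range."""
    center = sum(samples) / len(samples)
    # Moving range: absolute difference between consecutive points.
    moving_ranges = [abs(b - a) for a, b in zip(samples, samples[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    # 1.128 is the d2 bias-correction constant for subgroups of size 2.
    sigma = mr_bar / 1.128
    return center, center - 3 * sigma, center + 3 * sigma

def out_of_control(samples):
    """Indices of points outside the 3-sigma control limits."""
    _, lcl, ucl = control_limits(samples)
    return [i for i, x in enumerate(samples) if x < lcl or x > ucl]

# Hypothetical process measurements with one special-cause spike.
measurements = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1,
                10.0, 9.9, 10.2, 10.1, 15.0, 10.0]
print(out_of_control(measurements))  # flags the 15.0 spike
```

Points inside the limits are treated as common-cause variation and left alone; points outside them signal special causes worth investigating, which is what "behaves predictably" means in practice.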
Read full article at http://www.hrexaminer.com/big-data-isnt-analytics
Posted 1 March 2021
© 2021 TechTarget, Inc.