
For my graduate paper, I studied perceptions of workplace stress through the critical lens of social disablement.  Writing this paper was certainly an intellectual exercise that at the time didn’t seem to have many practical applications.  I am therefore honoured to become better acquainted with the “mechanics” of quantitative alienation through my day-to-day duties.  I respect the fact that I can’t share any substantial details about my actual work processes on a blog.  It will therefore be necessary for me to focus on concepts and to keep the conversation at this conceptual level.  I will admit however that much of this blog is about a case that I studied during my graduate studies - about 30 years of archived management records relating to an organization that is no longer in operation - gained through “freedom of information” provisions.

I have already encountered a few interpretations of the meaning of “prescriptive data,” so it is important for me to define the term in order to prevent misunderstandings with readers.  When processes are “prescribed” - that is to say, not open to personal discretion but rather spelled out in specific detail - I describe those processes as prescriptive.  Within this framework, if one were to deviate from those prescribed processes or norms, it is the role of the data to point to the level of deviation.  Prescriptive data might therefore be generated by operations such as quality control and compliance.  In older companies where processes might be deeply entrenched, many functions including human resources might seem unusually bureaucratic.  I consider bureaucracy indicative of a prescriptive environment.  There might be many reasons why these controls exist in the workplace.  I don’t question the reasons but merely take note of their presence.

When data is prescriptive, it is part of the control apparatus.  Employees might not always consider the endless flow of performance metrics reasonable.  Yet there has to be some standard for measuring people that is grounded in the needs of the organization.  There is an assumption here that the organization is well aware of its needs and of how individual employee behaviours contribute to them.  In the context of “quality control,” the assessment isn’t far from the metrics; and the metrics aren’t far from the requirements.  By this I mean that, for example, it is clearly necessary to ensure that bottles containing 200 pills of a particular drug indeed contain precisely 200 pills of that drug, and that the bottles are sealed, labeled, shipped to the proper storage area, and kept in suitable storage conditions.  The resulting metrics make it possible to confirm that the processes actually deliver control over quality.  Challenges start to emerge when there is an attempt to make use of metrics for “business management.”
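The pill-bottle example can be sketched in code.  This is a minimal illustration, not any real system: the field names, the 200-pill count, and the temperature threshold are all invented for the sketch.  The point is that prescriptive data is simply the recorded deviation from prescribed standards.

```python
from dataclasses import dataclass

@dataclass
class Bottle:
    pill_count: int
    sealed: bool
    labeled: bool
    storage_temp_c: float

def qc_deviations(bottle, required_count=200, max_temp_c=25.0):
    """Return the prescribed standards this bottle violates.
    The data generated here is prescriptive: it records only
    deviation from the prescription, nothing else."""
    deviations = []
    if bottle.pill_count != required_count:
        deviations.append(f"count {bottle.pill_count} != {required_count}")
    if not bottle.sealed:
        deviations.append("not sealed")
    if not bottle.labeled:
        deviations.append("not labeled")
    if bottle.storage_temp_c > max_temp_c:
        deviations.append(f"stored at {bottle.storage_temp_c}C > {max_temp_c}C")
    return deviations

# The resulting control metric: share of the batch deviating from standard.
batch = [Bottle(200, True, True, 21.0), Bottle(198, True, False, 21.0)]
deviation_rate = sum(1 for b in batch if qc_deviations(b)) / len(batch)
```

Notice that nothing in the sketch involves discretion; every check is spelled out in advance, which is exactly what makes the resulting data prescriptive rather than exploratory.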

The systematic use of data for business management strikes me as a tectonic shift.  The shift started long ago with the emergence of scientific management.  But even in my lifetime, I would say that the majority of my managers were “people persons” more than professionals who engaged with data.  It is with considerable disdain that I once described scientific managers as “micromanagers” - managers who tried to manage every aspect of everything that people did.  But then I came to realize that these days, scientific managers don’t actually manage people directly but rather the data that they generate.  Workers are managed indirectly, at times without any direct contact - an arrangement made possible by technology.  A manager can literally be involved in managing an operation while being situated nowhere near it.

Moreover, the manager, for his or her part, is increasingly judged by the ability to respond effectively to the data: to keep the numbers healthy, to meet targets connected to the data, to demonstrate full engagement with the data.  Precisely like the employee, the manager is subject to quantitative alienation; in effect, the aggregate metrics are used to judge the abilities of the manager.  I must admit, recalling the literature I reviewed on workplace stress, that a number of studies focused on the stress of managers.  Even within the archived records, there were programs to help managers deal with stress.  It is easy to appreciate how poor numbers might bring about demands that the manager might not know how to handle - or that might indeed be beyond the manager’s control.  An important skill is being able to explain the extent to which a metric falls outside an individual’s control; this is not a skill that all employees or managers possess, since it is quite technical.
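The skill just described - showing how much of a bad number lies outside one person's control - can be given a toy quantitative form.  A minimal sketch, with entirely hypothetical figures: given a unit's metric over several periods and the same metric for peer units, the shortfall against a target splits into a component shared by every unit (plausibly systemic) and a residual specific to the unit in question.

```python
def split_shortfall(unit_series, peer_series, target):
    """Split each period's shortfall (target - value) into a component
    common to all peer units and a residual specific to this unit."""
    rows = []
    for t, value in enumerate(unit_series):
        peer_mean = sum(s[t] for s in peer_series) / len(peer_series)
        common = target - peer_mean    # everyone falls short by this much
        specific = peer_mean - value   # this unit lags its peers by this much
        rows.append({"common": common, "specific": specific})
    return rows

# Two periods, two peer units; all numbers invented for illustration.
result = split_shortfall([80, 85], [[90, 95], [100, 105]], target=100)
```

When the "common" component dominates, a manager has a defensible, technical argument that the metric reflects something systemic rather than individual performance - which is precisely the kind of argument not everyone knows how to make.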

It is a different kind of regime for a different kind of manager - and employee.  I consider the potential for faulty construction or design extraordinarily high: i.e. metrics might be designed to encourage behaviours that don’t actually serve any useful purpose.  On the contrary, the metrics might create needless animosity and distrust.  By not serving “any useful purpose,” I mean literally that the behaviours could reduce profitability and deepen the organization’s struggles.  It is quite easy to bring about faulty processes:  “The documents must be printed and stored in these boxes.  The boxes must be dated.  The boxes must then be stored with their locations indexed on this sheet.  To access these boxes in the future, a proper requisition must be submitted stating the location.”  An entire metrics regime can be built around the management of printed records, essentially concealing the fact that the underlying behaviour in many cases is kind of foolish.

Unlike some writers, I do not conceptualize “alienation” purely as a matter of personal struggle or individual consequence.  In a literal sense, quantitative alienation can trigger unproductive and potentially harmful employee, management, and organizational behaviours as people and systems align in an attempt to churn out the right numbers.  And if, even as the right numbers are churned out, the organization nonetheless finds itself in the gutter, it would seem that all of that deliberation over design was quasi-intellectual nonsense - from people incapable of even recognizing the destructiveness of their decisions and behaviours.  This is the essence of quantitative alienation from an organizational standpoint: it represents a kind of pathology similar to alcoholism.  The organization becomes dissociated from reality; it begins working not on the basis of reality but on a fabrication or illusion of it.  Perhaps one of the most frightening roles a data scientist can assume is guiding a company out of this darkness back into the light.



Comment by Stanley Forrester on November 11, 2017 at 9:59pm

I have experienced this quantitative alienation, but the question - and yes, I realize the irony of the question - is: how do we measure quantitative alienation?

In physics, there is a saying: "What we are measuring is not really what we are measuring."  By understanding the difference between the instrumental readouts and the actual physical phenomena in which we are interested, we can hope to detect artifacts in the data.

So how is this for a zeroth approximation to a definition?  Data is "alienating" if it measures compliance with procedure rather than the quality of the output.
