Data Interaction in Organizational Systems

I have always had a great interest in how businesses organize in order to get things done.  Here I raise some discussion points intended to stimulate debate.

Principle of Systemic Domains 

Not that long ago, I was completing a graduate degree in “critical” disability studies.  The critical part deserves to be in quotation marks since it is probably subject to interpretation and all sorts of misinterpretation.  I will suggest that critical studies more generally place some emphasis on how seemingly ordinary things contribute to larger impacts and consequences.  Rather than labour over the definition, I will give an example to move matters along:  importing cheap clothing can contribute to the cultural and economic oppression of foreign workers.  So we have something that seems normal - cheap clothing - giving rise to deep social and structural problems.  It is possible to regard many things in critical terms, although this is certainly not the only worthwhile perspective.  Critical studies occupy particular domains of discourse that tend to be excluded from day-to-day business discussions.  Environmental discourse is rather critical in nature.  When I did a degree in environmental studies, my thesis was on the effectiveness of public participation in local planning processes; so I was, and continue to be, concerned about the tangible and intangible impacts of institutional systems.

 

There are different domains.  I sat back one day and placed the schemas that I had developed side-by-side along with different business models, and I found an interesting pattern.  Assuming that the models exist in order to support production, there is a general tendency for the models to focus on particular domains of discourse.  There is no neat array of pigeonholes across the different domains, but consider the following structure as an explanation:  we have domains of perception to the left (inputs); domains of production near the middle (processes); and domains of impacts and consequences to the right (outputs).  In production, there is a flow from left to right.  I have just described a division that is already in the literature, albeit in less developed terms.  The interesting part here is how differently the data is handled across the domains.  By and large, specific current events are most measurable near production.  On the consequential or critical side, there is a historical build-up of data making it possible to form assertions and inferences; broader historical patterns can be extracted.  On the perceptual or psychological side, this is where social learning occurs and where belief systems are sustained; there is a collection, or more precisely a selection, of external market data and internal organizational data.  An organization that produces for its market also listens to it, or at least that is the general idea.  On the other hand, the expression "What gets measured gets managed" applies to the internal side.  One does not manage the market but one's participation in it.

 

Principles of Systemic Insulation and Feedback Migration

 

These principles are easiest to explain using an example.  I am sure that many are familiar with the concept of “quality control” in the workplace.  In a manufacturing environment, quality tends to be an issue of conformance; this is the idea that lack of sameness violates quality.  Therefore, quite close to production, where events occur, there is often some level of quality control to ensure that outputs conform to particular standards.  Ideally, the checking generates a great deal of data.  Of course, quality control is, or at least can be, a much larger concern:  for instance, rather than simply assessing outputs, it is possible to measure the behaviours leading to production.  Let us say for the sake of argument that there is some opposition to the entire metrics paradigm, which I admit is quite a mindset, and certain parts of the operation simply decide to stop collecting data.  What is not collected cannot be reported.  Only what is measured can be managed.  Is this necessarily a bad thing given the savings in operational overhead?  “Systemic insulation” is not necessarily a problem unless something goes wrong further downstream in the consumer fulfillment system.  Once something terrible happens, while it is true that there would be a lack of incriminating data, there would also be a lack of supporting data.  A lack of supporting data, coupled with the fact that something significant went wrong, means that there is more reason to presume culpability, in a manner of speaking.
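To make the conformance idea concrete, here is a minimal sketch in Python of the kind of check that generates data near production.  The part dimension, tolerance, and batch values are hypothetical; a real operation would apply its own standards and sampling plan.

```python
import statistics

# Hypothetical nominal dimension and tolerance for a machined part, in millimetres.
NOMINAL_MM = 25.00
TOLERANCE_MM = 0.05

def check_conformance(measurements):
    """Flag sampled measurements that fall outside the tolerance band.

    Returns the rejects plus simple summary statistics, so the check
    produces data (not just a pass/fail verdict) that can be reported.
    """
    rejects = [m for m in measurements if abs(m - NOMINAL_MM) > TOLERANCE_MM]
    report = {
        "n_sampled": len(measurements),
        "n_out_of_spec": len(rejects),
        "mean_mm": statistics.mean(measurements),
        "stdev_mm": statistics.pstdev(measurements),
    }
    return rejects, report

# One hypothetical batch of sampled parts.
batch = [25.01, 24.98, 25.07, 25.00, 24.93, 25.02]
rejects, report = check_conformance(batch)
print(rejects)   # -> [25.07, 24.93]
print(report)
```

The point of the sketch is that the check itself is a source of data; when it is switched off, both the incriminating and the supporting records disappear together.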

 

However, consider the dynamics using a reasonable scenario.  A car manufacturer decides to stop quality control protocols on its braking systems.  “Everything is just fine,” is the response, followed by several nods and winks.  Sub-standard braking equipment is being installed due to a shortage of supply.  Nothing will necessarily stop such vehicles from making it off the assembly line and onto the sales lots of dealerships.  As cars are purchased, there are reports of accidents.  Then there are lawsuits and negative media coverage, followed by a sharp decline in future purchases.  While the issue of brake quality had been removed from the production domain, where it could have been handled near production, it shifted to the consequential domain, where it eventually forced a correction of organizational perception.  “The further right the issue goes, the further left the system throws.”  (I thought that up this morning.)  One idea behind quality control, phrased in these terms, is to minimize the extent to which problems reach right of production and to lessen the resulting throw to the left of production.

 

The principles are not particularly complicated, and I feel there is quite a lot of supporting evidence.  Before moving on, as a tribute to my father, who was a mechanic most of his life, I just want to mention that quality through sameness and strictness of process is often presumed but not necessarily true.  My father one day decided to modify a production machine in his area.  The change was immediately noticed by his superiors.  Management calculated that the modification, if applied to all of the machines in all the plants globally, would save about $450,000 in raw materials annually.  With an estimate like that, it can be assumed that the company maintained excellent production records and statistics.  So I am not saying that conformity itself should be the objective of quality control; rather, more broadly, systemic insulation makes it difficult to determine whether or not change has been or would be useful.

 

Principle of Environmental Determinism

 

This is the idea that selection pressures in the environment can give some organizations advantages over others.  In a way, I suppose the concept is related to evolution.  Considering the concept in geographic terms, there was a time when humans really struggled to survive in natural environments.  We have endured not because of physiological changes but because of developments in society and technology.  Now, consider determinism from a rather different perspective.  During good times, even the most poorly managed companies survive regardless of their systems, beliefs, and behaviours.  During deterministic times, what may have worked well before might cease to do so, leading to organizational extinction.  This means that when times are good, the data plays a role in sustaining behaviours that might not necessarily be helpful when times become difficult; the data as a bundle of capital might therefore quickly deteriorate in value amid austere or radically changing conditions.  It is for this reason that reconstruction rather than simple realignment is often necessary when businesses decline; the issue is more about changing direction than moving to a different lane.  Within the context of environmental determinism, an organization during good times might be internally optimized to use data that is poorly connected to latent external risks.
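As a rough illustration of how data capital can lose value, the following Python sketch compares a recent window of observations against the historical record.  The demand figures, the drift measure, and the threshold are all hypothetical and intentionally crude; a real analysis would use proper change-point or drift-detection methods.

```python
import statistics

def drift_score(historical, recent):
    """Crude measure of how far a recent window of observations has moved
    away from the historical record, in units of historical spread.

    A large score suggests the accumulated history (the "data capital")
    may no longer describe current conditions.
    """
    hist_mean = statistics.mean(historical)
    hist_sd = statistics.pstdev(historical) or 1e-9  # guard against zero spread
    return abs(statistics.mean(recent) - hist_mean) / hist_sd

# Hypothetical weekly demand figures: a stable history, then a changed environment.
history = [100, 103, 98, 101, 99, 102, 100, 97]
recent = [80, 78, 83, 79]

score = drift_score(history, recent)
if score > 3:  # the threshold is arbitrary and would need tuning
    print(f"Drift score {score:.1f}: the historical data may be losing value")
```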

 

As data scientists, I believe we can be tempted to regard the data as something constant, describing external phenomena that are perpetual.  In reality, however, historical data can be as much a liability as an asset given the shifting determinants of consumer fulfillment.  The prefabricated solution is part of a mindset that should be avoided rather than placed on a pedestal.  Nonetheless, my place here is not so much to question the decisions made by organizations but rather the relevance of their data capital, and therefore of big data, amid failing business normatives.  I am not saying that such normatives were necessarily incorrect or inappropriate at the time.  However, for a systemically insulated organization, failure to change in the face of deterministic conditions can lead to only one evolutionary conclusion.

 

Principle of Organizational Awareness in Data

 

This principle is rather obscure, and I’m not certain if the brevity of a blog post really helps matters.  This is the idea that the nature of the data can itself cause an organization to become systemically insulated, both in an internal sense (e.g. internal data from quality control) and externally (e.g. consumer sentiment).  For instance, sales data actually provides us with little information.  If people stop buying, the company can confirm this, although the exact reasons behind the change might be debatable.  In other words, there was never any intention for the data to hold such information, and so it does not.  I remember that at the last university I attended, there was a shooting incident that seemed almost random in nature according to the media.  “Random” is a pretty bold word.  What the reports really meant is that there was no apparent target or reason for the shooting.  To say that an event is random implies that some situations can be predicted to lead to shootings.  “Well, yes, if this young man has a gun, and there is a fierce argument, the situation can be reasonably predicted to lead to a shooting.”  But really, there was no knowledge of a gun, an argument, or much of anything prior to the shooting; it seems more likely that minimal data had been collected that could suggest the level of risk.  Risk is a complicated thing that is difficult to characterize in data.  Consequently, the data that is collected is “unaware” of the situation.  How can data be configured to become more aware?  I leave this question for readers to mull over.
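One way to think about making data more "aware" is to capture context at the moment an event is recorded.  The Python sketch below contrasts a bare sales record with one that carries contextual fields; the field names and example values are assumptions for illustration, not a prescription.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

# A bare sales record: it can confirm that purchases fell,
# but it carries no information about why.
@dataclass
class SaleRecord:
    sku: str
    units: int
    day: date

# A more "aware" record: hypothetical contextual fields are captured at the
# time of the event, so later questions about causes and risk have something
# to work with.
@dataclass
class AwareSaleRecord(SaleRecord):
    channel: Optional[str] = None            # e.g. "online" or "dealer"
    promotion_active: Optional[bool] = None
    competitor_price_delta: Optional[float] = None
    customer_feedback: List[str] = field(default_factory=list)

record = AwareSaleRecord(
    sku="BRK-001", units=12, day=date(2019, 3, 4),
    channel="dealer", promotion_active=False,
    customer_feedback=["asked about recall coverage"],
)
print(record)
```

The design choice is simply that explanatory questions can only be answered later if the relevant context was deliberately collected when the event occurred.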

 

So these are some general principles that I hope readers consider in their own examinations of data.  I welcome any feedback.
