
Gaining the Context from the Deconstruction of Metrics

I find myself habitually using the term “metrics.” When I first started blogging, I normally used this term only in reference to performance metrics. These are not ordinary “readings” but rather criteria-driven amounts – the criteria being performance. Over the years I have come to recognize that data-gathering is normally premised on some set of criteria. When compiling revenue data, analysts are seeking out data pertaining to revenues. The quest is predefined. The outcomes are predictable. The numbers gathered are not so much a reflection of any underlying phenomena as of the external requirements of analysts. These are the elements of reality that statisticians handle as part of their craft: the reality is decontextualized. It is possible to determine revenues but not the dynamics surrounding the decisions to purchase. This is not a weakness of statistics but rather its intentional design. An overemphasis on metrics is therefore problematic: it might leave much of reality overlooked, and the specific part or parts under examination might barely reflect the overall truths of the situation. This tendency is certainly not exclusive to statisticians.

I blame an emphasis on metrics without context for a prescription that I received a number of months ago. The medication effectively does what it is meant to do. But it also makes it nearly impossible for me to breathe. It drastically reduces my mobility – both my ability to walk and to drive. It triggers a reaction where I am choking on mucus. The coughing was so extreme that I was at times on the floor trying to recover. My ability to speak was greatly impaired. To me, these are indications of a severe medication allergy. In effect, the medication has the ability to transform me into a ward of the state. I have been laughing since getting off this medication – literally laughing more frequently. I am still on other medications, all of which seem fairly beneficial. It is important to recognize how all of the blogging I have done in relation to the “lived experience” has been about adding context to data. This lived experience is a type of data in itself, albeit not one that people have been able to easily exploit. This context seems mostly imperceptible using traditional quantitative methods. If one’s objective were to invest in health care, pouring resources into a system preoccupied with superficial metrics might, on one hand, not improve conditions enough to justify the investment; on the other hand, the approaches taken might actually worsen the situation for people due to the failure to take context into account.

When confronting decontextualized data, those responsible for analyzing the data might attempt to predict future values. If the future can be foretold with no real understanding of context, then these analysts might be successful. If branches of knowledge could be labeled as unimportant and discarded, it would be quite an amazing accomplishment to foretell the future based on what is left – the metrics. To thereafter claim that this is scientific and to be believed, the spell or elixir of enchantment must indeed be potent. Such analysts seem to ignore what I consider important – understanding how events and incidents led up to the data. When the supercomputer in Douglas Adams’s Hitchhiker’s Guide to the Galaxy expressed the answer to the universe as a number, the main characters commenced their journey to find the question to the universe. The story moves forward in time, which I admit is an unavoidable perspective for humans given that our existence unfolds as time moves forward. However, a perfectly reasonable approach is to try to deconstruct the answer in order to piece together the question. The deconstruction of metrics brings about the need to add context to data. In this blog, I will be discussing some initial steps in the deconstruction process.

Below is an application that I created. I call it Elmira. Elmira has a number of interesting capabilities, but the one most pertinent to my discussion here is the connection of multiple metrics to multifarious events. As an example, I will use only a single metric and only one type of event; and they happen to be the same. I ask readers to go along with me at this early stage as I build momentum. I realize that the sentence doesn’t quite make sense; I hope that it will shortly. The metric is my heart rate. Moreover, the type of event is also my heart rate. Elmira scores events by their apparent relationship to metrics using my Crosswave Differential Algorithm. It therefore stands to reason that if the algorithm works – and if the events are the same as the metrics – the correlation between the events and metrics should be exceptionally strong. I have documented the nature of this algorithm in other blog posts; those who would like more detailed information should refer to them. For now, I would like readers to “infer” that the algorithm works from the strong association it has made between the metric and the events pertaining to my heart rate.

[Image: screenshot of the Elmira application]
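For readers who think in code, a minimal sketch of the setup might look like the following. To be clear, this is not the Crosswave Differential Algorithm – the scoring below is a generic before-and-after differential I am substituting purely for illustration – and every name in it (score_event, the window parameter, the sample readings) is hypothetical.

```python
# A toy stand-in for scoring events against a metric. This is not the
# Crosswave Differential Algorithm -- just a generic before/after
# differential so the metric/event pairing is concrete.
from statistics import mean

def score_event(metric, event_time, window=30.0):
    """Score one event by how the metric shifts around it.

    metric: list of (timestamp, value) pairs, e.g. heart-rate samples.
    event_time: when the event occurred (same time units as the metric).
    window: how far to look back and ahead.
    """
    before = [v for t, v in metric if event_time - window <= t < event_time]
    after = [v for t, v in metric if event_time <= t < event_time + window]
    if not before or not after:
        return 0.0  # not enough readings near this event to score it
    return mean(after) - mean(before)

# The diagnostic redundancy described above: the "events" are simply
# the timestamps of the metric readings themselves.
heart_rate = [(0, 62), (10, 64), (20, 70), (30, 85), (40, 90), (50, 78)]
events = [t for t, _ in heart_rate]
scores = {t: score_event(heart_rate, t) for t in events}
```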

Normally, the only time I create this metric-event redundancy is for diagnostic purposes. Below I present the results using the differential scores (along the y-axis) and my heart rates (along the x-axis). The scatter is often more linear; I believe that inadequate data can make it jumpy, as in the case below. Recall that for diagnostic purposes I made the events mirror the metrics. However, there is no other reason to do this. Instead of attaching heart-rate events to heart-rate metrics, I can attach all sorts of events to many different metrics. Events, for example, can include the following: trips to the grocery store; pounds of beef eaten; number of hours watching television; employee performance; sales and revenues. Whose heart rate wouldn’t go up if revenues plummeted? This methodology lays the groundwork to investigate the extent to which different phenomena interact with metrics.

[Image: scatter plot of differential scores (y-axis) against heart rates (x-axis)]
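For those who want to reproduce the spirit of this diagnostic, a plain correlation check is one way to read such a scatter. The sketch below is hypothetical: the paired readings are made up, and a simple Pearson r is my stand-in for judging the strength of the association, not anything Elmira itself computes.

```python
# A hypothetical diagnostic: pair each heart-rate reading (x-axis) with
# its differential score (y-axis) and measure the association. A strong
# Pearson r in the mirror test suggests the scorer tracks the metric.
from math import sqrt

def pearson(xs, ys):
    """Plain Pearson correlation, no external libraries required."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

heart_rates = [62, 64, 70, 85, 90, 78]          # made-up metric readings
differentials = [0.5, 1.1, 2.4, 3.9, 3.1, 1.8]  # made-up event scores

print(f"diagnostic r = {pearson(heart_rates, differentials):.2f}")
```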

I recently decided not to market Elmira, but I am happy to write and share information about this project. The pressure of trying to fit a creative solution into a commercial model of product development doesn’t work for me. In any event, consider a situation where all of the events in my database have been sorted by Elmira from highest to lowest differential. I then add filters and flags to locate and highlight events that are thematic in nature. For example, below I make use of a filter to highlight types of Vitamin D consumption. How many types of Vitamin D consumption can there be? Well, in my case these days, I at least record the different times consumed. Notice how the highlighted dots skew to the right, where the differentials are lower. I am not saying Vitamin D causes heart rates to increase. I am saying that “my” consumption of Vitamin D seems associated with “my” elevated heart rate. I would probably respond by reducing both the amount taken and the frequency.

[Image: the same scatter with Vitamin D consumption events highlighted]
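In code, the sort-and-filter step might look something like the sketch below. The events, tags, and differentials are invented for illustration; they are not output from Elmira.

```python
# Rank all events by differential, highest first, then flag a theme
# (here "vitamin D") to see where its events land in the ranking.
# All labels, tags, and scores below are hypothetical.
events = [
    {"label": "vitamin D (morning)", "tags": {"vitamin D"}, "differential": 3.2},
    {"label": "grocery trip",        "tags": {"errand"},    "differential": 0.4},
    {"label": "vitamin D (evening)", "tags": {"vitamin D"}, "differential": 2.7},
    {"label": "television (2 hrs)",  "tags": {"leisure"},   "differential": -0.3},
]

ranked = sorted(events, key=lambda e: e["differential"], reverse=True)
for rank, e in enumerate(ranked, start=1):
    flag = "*" if "vitamin D" in e["tags"] else " "
    print(f"{rank:2d} {flag} {e['label']:22s} {e['differential']:+.1f}")
```

A theme whose flagged events cluster toward one end of the ranking is worth a second look – as an association, not a cause.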

While a great deal of emphasis might be placed on prediction, I believe that in some cases organizations are more interested in deconstruction: that is to say, explaining how the metrics got to be the way they are. Using Elmira, I would attempt to “throw events” at the metrics in order to embody them – to add body to them. I consider the Crosswave Differential Algorithm semi-statistical, although it probably doesn’t resemble any statistical method that has been previously published. I make no attempt to infer from aggregate results of a single sense; in fact, the more diverse the sense, the more compelling the bias. There is also no need to reduce everything to a common unit such as dollars. Elmira can manage a person’s individual data. I would say that a great deal of reflection and thought is necessary to deconstruct metrics. One needs a strong sense of ontology – a refusal to unquestioningly accept labels and predefinitions. I find that I obtain useful insights more rapidly and continuously this way than by, say, lumbering through life testing hypotheses. I don’t normally test hypotheses – not really. I conceptualize. I throw a barrage of events. I learn. I reconceptualize. I throw another barrage. It is like particle physics: one learns by smashing, sipping coffee, and reflecting on the debris.
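A loose sketch of that loop, under the same caveats as before: the themes and scores are invented, and the averaging is my own crude stand-in for whatever summary one prefers.

```python
# The "barrage" loop in miniature: score whole families of events
# against a metric, rank the families by the size of their average
# differential, reflect, and throw again. All data here is invented.
def rank_themes(theme_scores):
    """Order themes by the magnitude of their average differential."""
    return sorted(
        ((theme, sum(s) / len(s)) for theme, s in theme_scores.items()),
        key=lambda pair: abs(pair[1]),
        reverse=True,
    )

barrage = {
    "vitamin D":  [3.2, 2.7, 2.9],
    "errands":    [0.4, -0.1, 0.6],
    "television": [-0.3, 0.2, -0.5],
}
for theme, avg in rank_themes(barrage):
    print(f"{theme:12s} {avg:+.2f}")  # reflect on the debris, reconceptualize
```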