Technical Deconstruction

The term “technical analysis” usually refers to the study of stock prices.  A technical analyst might use real-time or closing prices of stocks to predict future prices.  This is an interesting concept because of what is normally excluded from the analysis – namely, everything except prices.  Given that the approach doesn’t necessarily consider the health or profitability of the underlying companies, a purely technical approach seems to offer guidance that is disconnected from reality.  Yet a technical perspective is certainly worth considering – especially in the absence of more meaningful data.  (I do not mean to diminish the need to gather more meaningful data.  Sometimes time and resources force the use of fundamentally superficial metrics.)

Some analysts seem to naturally gravitate towards a technical style of analysis.  They might use sales data in an attempt to predict future sales.  They accept the reality that sales data (an outcome) doesn’t actually explain why clients decide to buy (an input).  They proceed with the analysis anyway because details pertaining to the decisions of clients are likely unavailable or difficult to obtain.  They concentrate on the technicals.  In this blog, I will be augmenting the technical approach – using a technique that I call “technical deconstruction.”  This is a process of breaking down technical data for more detailed analysis.

To enable deconstruction, it is necessary to confront what went into the data’s formation – that is to say, how it came into existence.  Deconstruction might therefore be described as an ontological exercise.  Below, I created a chart from fictitious data representing many weeks of production.  Without deconstruction, an analyst might suggest that the pattern, although volatile, indicates a relatively flat trend.  There might be a long-term increase; but this could easily disappear if more data were added.  The company is going nowhere.  The analyst submits his or her findings to management.  The managers are unimpressed because they can’t do anything with the findings.  The analyst could be replaced by a cheap statistics package.  Let us attempt to deconstruct the data.

[Chart: fictitious production data over many weeks]

The data cycles from Monday to Friday.  Consequently, one way to deconstruct the data is by separating it by workday.  How does a person separate data by day?  Assuming the source data contains a column for the date, the supporting function is built into a number of spreadsheet applications.  Conducting this separation for the first time might feel a bit peculiar – as if it were wrong to do so.  But if there are indeed similarities by day, a person who starts analyzing by day might find it difficult to turn back – to revert to treating all the days of the week as one series.  This will certainly be the case here.  The activity on Monday is clearly dissimilar to Friday.  It therefore makes little sense to analyze Friday using data that includes Monday.
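For readers working outside a spreadsheet, a minimal sketch of the same separation is shown below.  It assumes the production log is a CSV file with “date” and “units” columns; the file name and column names are placeholders of my own, not a real format.

```python
import csv
from collections import defaultdict
from datetime import datetime

# Group production totals by weekday; "production.csv", "date", and "units"
# are assumptions about the source data, not a real file format.
by_weekday = defaultdict(list)
with open("production.csv", newline="") as f:
    for row in csv.DictReader(f):
        day = datetime.strptime(row["date"], "%Y-%m-%d").strftime("%A")
        by_weekday[day].append(float(row["units"]))

for day in ("Monday", "Tuesday", "Wednesday", "Thursday", "Friday"):
    values = by_weekday.get(day, [])
    if values:
        print(f"{day}: {len(values)} weeks, mean {sum(values) / len(values):.1f} units")
```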

Below I present the same data broken out by day of the week.  I personally consider the new chart a bit difficult to interpret when all of the days are present – at least for a human interpreter rather than a machine.  (It might be easier to study the days individually.)  Nonetheless, it should be evident that the individual days have somewhat different attributes.  (For example, production on Monday is much greater than on Thursday or Friday.)  Each day seems to occupy a band or stratum.

[Chart: the same production data broken out by day of the week]

In order to better determine the general trend for each day, I added trendlines to the chart while deleting the underlying production patterns.  (I generated the production data using controls.  Of the days, the “most random” is Thursday.  I intentionally designed the patterns for Monday and Tuesday to increase – for Wednesday and Friday to decline.)  With the initial production data deconstructed, it would seem that there is something positive about Monday and Tuesday – and possibly also Thursday.  How might the deconstruction process continue in order to bring to light the contributing factors leading to superior results on these days?

[Chart: per-day trendlines with the underlying production patterns removed]
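The general trend for each day can also be estimated numerically rather than visually.  The sketch below computes a least-squares slope per weekday.  The per-day lists stand in for the grouping built in the earlier sketch, and the numbers are invented purely for illustration.

```python
# Least-squares slope of a weekly series: a positive slope means the day trends up.
def trend_slope(values):
    n = len(values)
    mean_x = (n - 1) / 2
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(values))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

# Stand-in for the by_weekday grouping built earlier; the values are invented.
by_weekday = {
    "Monday": [102, 110, 108, 121, 125],
    "Friday": [96, 91, 89, 85, 80],
}
for day, series in by_weekday.items():
    print(f"{day}: {trend_slope(series):+.1f} units per week")
```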

Business expertise is often mentioned as a desirable asset among data professionals.  I find that there tends to be little elaboration on the exact reasons why.  I suggest that the term “expertise” is routinely used when what many people specifically mean is the ability to recognize important points – and also to separate these from unimportant points.  From the standpoint of handling large amounts of data, this expertise actually relates to ontology.

A person familiar with production might suggest that additional insights could be gained by breaking down the production data by its contributors – the employees.  By the way, this is not a feature built into spreadsheet programs.  The data logging system might keep detailed stats on the work that each employee does; from there it is a simple matter of deconstructing by day.
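As a rough sketch of that second level of deconstruction, and assuming the logging system can export one row per transaction with “date”, “agent”, and “units” fields (all placeholder names), the Monday totals per agent might be assembled like this:

```python
import csv
from collections import defaultdict
from datetime import datetime

# Total Monday production per agent per week.
# "transactions.csv" and its column names are assumed, not taken from a real system.
monday_by_agent = defaultdict(lambda: defaultdict(float))
with open("transactions.csv", newline="") as f:
    for row in csv.DictReader(f):
        when = datetime.strptime(row["date"], "%Y-%m-%d")
        if when.strftime("%A") == "Monday":
            monday_by_agent[row["agent"]][when.date()] += float(row["units"])

for agent, weeks in sorted(monday_by_agent.items()):
    print(f"{agent}: {sum(weeks.values()):.0f} units across {len(weeks)} Mondays")
```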

The chart below contains the data for Monday deconstructed to show the individual contributions of sales people.  The term “production” tends to be associated with factory operations.  However, given that the workers are sales people, production in this case relates to units of product sold.  Saying that a worker is “productive” means that they sold many units.  What does it mean to sell a unit?  Perhaps an order was taken; it was paid for; it was shipped from distribution; it was received by the client; it was received and not returned after 30 days; after 60 days; after 90 days.  Technical deconstruction can be quite precise.  In short, there are many ways to expand and filter the original analysis.

[Chart: Monday production by individual sales agent]
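Because a “sold unit” can be defined at any of those stages, it helps to make the definition explicit and swappable.  The sketch below expresses each definition as a filter over per-order records; every field name is a hypothetical illustration rather than a real schema.

```python
# Alternative definitions of a "sold unit", expressed as filters over order records.
# All field names here are hypothetical illustrations, not a real schema.
DEFINITIONS = {
    "ordered":      lambda r: True,
    "paid":         lambda r: r["paid"],
    "shipped":      lambda r: r["paid"] and r["shipped"],
    "kept_30_days": lambda r: r["shipped"] and not r["returned"] and r["days_since_delivery"] >= 30,
    "kept_90_days": lambda r: r["shipped"] and not r["returned"] and r["days_since_delivery"] >= 90,
}

def count_units(orders, definition):
    """Count units sold under a chosen definition of 'sold'."""
    rule = DEFINITIONS[definition]
    return sum(o["units"] for o in orders if rule(o))

sample = [
    {"units": 3, "paid": True, "shipped": True,  "returned": False, "days_since_delivery": 45},
    {"units": 1, "paid": True, "shipped": False, "returned": False, "days_since_delivery": 0},
]
print(count_units(sample, "paid"))          # 4
print(count_units(sample, "kept_30_days"))  # 3
```

Switching the definition changes the metric, so the same chart can tell a different story depending on which rule produced it.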

Notice how the sales people have different contribution levels.  Before cloning agent #3 (the best performer) and getting rid of the others, keep in mind that not everyone is necessarily dedicated to sales.  For example, agent #2 (the worst performer) might be the manager whose metrics involve responsibilities not covered in the analysis.  This person might deal with only the most difficult clients.  The chart below shows the trendlines with the contribution patterns deleted.  I consider it critical to create a composite profile for each employee in order to understand their overall contribution.

[Chart: per-agent trendlines for Monday with the contribution patterns removed]
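One way to approach such a composite profile, assuming per-transaction records like those sketched earlier, is to roll each agent’s average production up by weekday.  The records below are invented for illustration.

```python
from collections import defaultdict
from statistics import mean

# Composite profile: each agent's average units by weekday.
# The (agent, weekday, units) records here are invented for illustration.
records = [
    ("agent_1", "Monday", 12), ("agent_1", "Tuesday", 9),  ("agent_1", "Friday", 7),
    ("agent_3", "Monday", 21), ("agent_3", "Tuesday", 14), ("agent_3", "Friday", 6),
]

profiles = defaultdict(lambda: defaultdict(list))
for agent, weekday, units in records:
    profiles[agent][weekday].append(units)

for agent, days in sorted(profiles.items()):
    summary = {day: round(mean(values), 1) for day, values in days.items()}
    print(agent, summary)
```

A profile of this kind makes it harder to judge an agent on the strength or weakness of a single day.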

Technical deconstruction is not a process devoid of controversy.  For example, is there really enough of a difference in production to justify a separation by day?  Are the metrics of contribution intrinsic to the employees, or are they related to the workplace or its processes?  Deconstruction creates benchmarks to support further investigation: since agent #3 does well on Monday, the next questions to consider are why and how.  Maybe agent #3’s superior performance is inconsistent – declining on Tuesday and Wednesday.  The need for analysis seems continuous – especially since new data is being created each production day.  Agents change.  Clients change.  It is necessary to question and test the methods by which individuals seem most and least successful.

Deconstruction is limited to some extent by one’s understanding of business – specifically the business of the particular company being studied.  (Not all businesses are alike.)  Usually understated is how analysis can be impaired by one’s unfamiliarity with people and their disabling environments.  For instance, agent #4 possibly never received training to use the applications on her system; or perhaps her computer system is not entirely compatible with the applications.  Portraying her metrics as only “moderate” or “poor” does little to improve production at the company.  But at least managers are peering under the right stones and engaging the numbers – as opposed to broad-brushing the entire staff and making decisions based on perceptions and unqualified opinions.
