
Creating Complex Encoded Objects from Qualitative Data

I have always found the task of converting qualitative data into something quantifiable a bit challenging.  A common route might be as follows:  assemble all of the resources containing qualitative information (e.g. questionnaires containing open-ended questions); seek out apparent themes in the responses; and quantify how frequently these themes are mentioned or raised.  This methodology leaves open the question of when something is or isn't a theme, and whether something must be a theme in order to be relevant.

Faced with several dozen human rights tribunal cases for a paper on disability law, I decided to design a questionnaire that I could complete based on the qualitative details of each case.  Whereas an ordinary questionnaire often contains only a handful of open-ended questions, limiting the amount of information that must be handled, a tribunal case might run 10 to 60 pages.  A thematic comparison seemed pointless to me since the same themes occurred in every case.  (This is because I obtained the cases using a search engine that screened for particular themes of interest.)  So I decided to develop a conceptual framework for the data.  A survey without a guiding concept would have been a bit like open-pit data-mining: gathering facts in the hope of finding something valuable later.  Nonetheless, I remember telling myself that my conceptually-driven approach seemed rather foolish, since I had to know what I was looking for in advance; there was also the danger of going down a completely wrong path.

I decided to use something rather elaborate, derived in part from a number of different behavioural models.  The exact details are not too important at this point, but I called it the ACTOR definition protocol.  For each case, I noted how the adjudicator commented on the following:  Attraction - the perceptions of the businesses involved; Conduct - the extent to which the businesses took others into account; Tenacity - the amount of effort exhibited by the businesses; Organization - the systems and structures in place to achieve certain outcomes; and Responsibility - how the businesses recognized their roles.  I found that adjudicators tended to indicate whether performance was inadequate, adequate, or more than adequate.  This meticulous delineation of discourse took some practice, but I eventually created a conceptual portrayal of each case that could then be quantified.
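
To make the encoding concrete, here is a minimal sketch (my own illustration, not code from the original study; the field names and the example scores are invented) of how a single case could be recorded against the five ACTOR dimensions, anticipating the -1/0/1 scale discussed below:

```python
from dataclasses import dataclass

# Hypothetical encoding of one tribunal case on the five ACTOR dimensions.
# Scores follow the adjudicator's assessment of the business's performance:
# -1 = inadequate, 0 = adequate, 1 = more than adequate.
INADEQUATE, ADEQUATE, MORE_THAN_ADEQUATE = -1, 0, 1

@dataclass
class ActorScores:
    attraction: int      # perceptions of the businesses involved
    conduct: int         # extent to which the businesses took others into account
    tenacity: int        # amount of effort exhibited by the businesses
    organization: int    # systems and structures in place to achieve outcomes
    responsibility: int  # how the businesses recognized their roles

# An illustrative (invented) case coding.
case_001 = ActorScores(
    attraction=ADEQUATE,
    conduct=INADEQUATE,
    tenacity=ADEQUATE,
    organization=MORE_THAN_ADEQUATE,
    responsibility=INADEQUATE,
)

print(case_001)
```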

I want to point out that although conceptualization might reduce the amount of information, the resulting data object is not particularly reductive.  In fact, the ontological exercise was extreme in how it forced me to determine the placement of facts that seemed to have infinite shades of grey.  I came to realize that ACTOR is merely a starting point.  Many subdivisions could exist for each letter.  I was accessing the reality of the case only through this structural context in order to extract numbers, but the reality is much bigger than the quantification.  For instance, consider just the concept of Conduct: taking others into account during the planning of policies, during the training of staff, while determining seating arrangements for patrons, to ensure accessibility, during the delivery of service, to manage competing interests, while listening to complaints, and while handling complaints.  There are so many shades of Conduct.  It seemed almost ruthless to assign a value of -1, 0, or 1 to the qualitative assertions.
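
As a sketch of what those subdivisions might look like (the sub-dimension names simply restate the shades of Conduct listed above; the roll-up rule is an assumption of mine, not the article's method), each shade could be scored separately and then summarized into the single Conduct value:

```python
from statistics import mean

# Hypothetical sub-dimensions of Conduct, each scored -1, 0, or 1 for one case.
conduct_subscores = {
    "planning_of_policies": 0,
    "training_of_staff": -1,
    "seating_arrangements": 0,
    "accessibility": -1,
    "delivery_of_service": 0,
    "competing_interests": 1,
    "listening_to_complaints": -1,
    "handling_complaints": -1,
}

# One possible roll-up rule (an assumption, not the article's method):
# average the sub-scores and round back onto the -1 / 0 / 1 scale.
conduct_overall = round(mean(conduct_subscores.values()))
print(conduct_overall)  # -> 0 for the illustrative sub-scores above
```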

I can add a small twist to this discussion.  Suppose that, rather than tribunal cases, I have a steady stream of client surveys.  Because the stream is perpetual, it is impossible to know in advance the nature of all future responses.  That is a technical hurdle that I think interferes with the continuous integration of qualitative data.  Moreover, I might not know exactly how to make use of the data, given that business needs are likely to change.  So it would certainly be tempting to force the data into its simplest form to respond to my immediate interests today, irrespective of the changing reality behind the data.  I don't know if anybody else has had the experience of posing questions that seemed worthwhile during development but were found to be unhelpful after distribution.  It is necessary to have systems and methodologies that accommodate emerging realities - particularly when an organization finds itself confronting unfamiliar surroundings.  Making effective use of qualitative resources might become increasingly important in helping organizations adapt to change.
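
One way to keep the encoding open to change, sketched here under my own assumptions rather than anything prescribed in the article, is to store each coded observation as a narrow, self-describing record instead of a column in a fixed table; dimensions invented after the survey goes live can then be added without restructuring what was already collected:

```python
import csv
from datetime import date
from io import StringIO

# Hypothetical "long" records: one row per coded observation, so new
# dimensions or sub-dimensions can appear later without schema changes.
observations = [
    {"source": "survey-0001", "dimension": "Conduct",
     "sub_dimension": "handling_complaints", "score": -1,
     "coded_on": date(2019, 3, 1).isoformat()},
    {"source": "survey-0002", "dimension": "Responsibility",
     "sub_dimension": "", "score": 1,
     "coded_on": date(2019, 3, 2).isoformat()},
    # A dimension invented after the survey went live slots in unchanged.
    {"source": "survey-0003", "dimension": "Responsiveness",
     "sub_dimension": "follow_up_time", "score": 0,
     "coded_on": date(2019, 4, 15).isoformat()},
]

# Write the records out as CSV so each batch can be appended to a running store.
buffer = StringIO()
writer = csv.DictWriter(
    buffer, fieldnames=["source", "dimension", "sub_dimension", "score", "coded_on"]
)
writer.writeheader()
writer.writerows(observations)
print(buffer.getvalue())
```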
