When I talk about "the institutional response," I am referring to an increasingly common occurrence: a standardized or large-scale approach is supported, promoted, and applied by a particular institution - sometimes governmental in nature - premised on its apparent suitability or superiority for achieving desirable outcomes. I suspect that in recent years, there has been a push to get citizens to file their income tax returns electronically. I know that in Canada, it has become difficult to find any contact information to speak with a live person. For an elderly individual unfamiliar with the internet, somebody who is visually impaired, or anyone who can't get through the processes developed by system designers, it might be a rough ride. In this blog, I will be focusing on how an institutional response can in principle be data-deprived. I don't mean that such a response is brought about without data - sometimes there can be a great deal - but rather that post-response data collection is not necessarily given any emphasis. A lot can go wrong. Yet, due to the lack of data collection, those responsible for the system might be unaware of faults in the process - except in relation to metrics anticipated in early development. Writing from my own experience, I have noticed that the technical support line for our tax service is busy right up to midnight, at which point the support line closes. It can be quite difficult to advise the tax service of problems under such conditions.
I said that institutional responses are becoming increasingly common because they represent a cheap and simple approach to complex problems. I appreciate the need to save money and keep life simple. In my prior blog, I wrote about systematically scanning emergency response plans - intended to guide a municipality during an emergency - in search of keywords indicating sensitivity to vulnerable groups. I offered a conceptual reason for doing so. Setting aside the merits of my conceptualization, I also point out that there is a recognized need to consider such individuals in emergencies: e.g. A.6.8.2 of NFPA 1600, a standard for emergency management and business continuity. My data scanning is premised on certain measurable commonalities existing in these plans regardless of the filer. Assuming that there are substantial commonalities, a municipality might be tempted to take a boilerplate approach: without necessarily collecting much data or having detailed discussions, it could copy and customize the plans that others have filed. Since the requirement to file is institutional - as is the submission - I cannot say that such an approach would necessarily lead to complaints. Parts of the submission might be confusing, nonsensical, or out of place. For example, not all municipalities are located near a nuclear power facility or hydroelectric station. The critical infrastructure differs from place to place.
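For readers curious what scanning plans for keywords might look like in practice, here is a minimal sketch. The keyword list and file layout are illustrative assumptions on my part, not drawn from NFPA 1600 or from any actual filing requirement:

```python
# Hypothetical sketch: scan emergency-plan text files for keywords
# suggesting sensitivity to vulnerable groups. The keyword list and
# the .txt file layout are illustrative assumptions.
from pathlib import Path

KEYWORDS = ["elderly", "disability", "visually impaired",
            "mobility", "vulnerable", "special needs"]

def keyword_counts(text, keywords=KEYWORDS):
    """Count case-insensitive occurrences of each keyword in one plan."""
    lowered = text.lower()
    return {kw: lowered.count(kw) for kw in keywords}

def scan_plans(plan_dir):
    """Return {filename: keyword counts} for every .txt plan in plan_dir."""
    results = {}
    for path in Path(plan_dir).glob("*.txt"):
        results[path.name] = keyword_counts(path.read_text(errors="ignore"))
    return results
```

A boilerplate plan copied between municipalities would tend to produce near-identical count profiles - which is one crude way such commonalities could be measured.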
In Ontario, there is a requirement for exercises and tests. Perhaps the situation is similar in other jurisdictions. The requirement is also covered in Chapter 8 of NFPA 1600. It would be difficult for these exercises and tests not to generate data. There is likely some recognition that the planning process and the plan itself represent only starting points - a beginning rather than an end. I know that there might be some debate as to how effectively data gets collected during exercises and in what form. Nonetheless, there is a great opportunity to collect data; and doing so can contribute to improvements to the plan. I do not dismiss the institutional response - assuming that it sensitizes rather than insulates structural capital. Structural capital, by the way, is what persists to maintain processes in a company even as its employees change. I notice that others sometimes associate it with policies and procedures. I normally use the term to refer to systems of organizational learning.
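One way exercise data could feed plan improvements is to capture observations in a structured form tied back to sections of the plan. This is a hedged sketch of the idea; the fields and severity scale are my own illustrative assumptions, not something prescribed by NFPA 1600:

```python
# Hypothetical sketch: structured after-exercise observations that can
# be rolled up into plan revisions. Field names and the 1-3 severity
# scale are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Observation:
    exercise: str
    plan_section: str   # which part of the plan the finding concerns
    finding: str
    severity: int       # 1 (minor) .. 3 (plan change recommended)

def revisions_needed(observations, threshold=3):
    """Return plan sections whose findings met the severity threshold."""
    return sorted({o.plan_section for o in observations
                   if o.severity >= threshold})
```

The point is less the code than the loop it represents: exercise, record, roll up, revise.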
I personally can understand how, amid the fast-paced and uncontrolled circumstances of an actual emergency, it can be difficult to collect data. The system in place might not be designed to gather data on the fly - although really, it should be. It is probably much easier to collect data during an exercise or simulation. For me, the whole point of the undertaking is to gather information - to report and share findings - to help bring about improvements to the plan. In the process, the amount of learning that occurs among participants is likely rather high. I have often speculated - going off on a tangent for just a moment - that the animated nature of infants and the unstable nature of adolescents might represent an evolutionary tactic to deal with high vulnerability in the absence of data. There is high vulnerability and change in the face of all sorts of hazards and dangers in the environment; it is necessary to conceptualize and question existing presumptions very quickly. Certainly, for an organization facing change, perhaps one route to failure is the implementation of an institutional response that is inherently insulated from both the external and internal environments (external being the market - internal being operations). The difference is in the data. Collecting data and responding to it might help organizations achieve worthwhile outcomes much better than simply introducing a "response" - regardless of how authoritative or deliberated that initial plan might be.
I wrote my undergraduate thesis on the effectiveness of public participation in local planning. The important word here is "effectiveness." There is a requirement to give residents an opportunity to participate in developments. The process of development is highly institutionalized. Yet there is this idea that all sorts of data must be collected, relating not just to construction but also to the sentiments of those living in the area. Opposition sometimes exists, but of course all sorts of developments still get approved. Participation does not necessarily interfere with development. However, I discovered that it is possible to have participation in a process where it doesn't actually occur in any tangible sense. The meeting could exist purely to fulfill an institutional requirement.
Apart from cheapness and simplicity, perhaps another reason for the emergence of the institutional response relates to the prevalence of professionals in business. How often does real research get conducted - gathering real data - in a business setting? I find that business solutions are becoming increasingly institutional in nature. Consider a response that requires almost no data: in order to deal with rising costs, cut costs. Some might call this a knee-jerk reaction. Never mind the possibility that specific investments might help to maintain access to particular markets or preserve competitive advantage. Growth metrics don't necessarily get along with austerity metrics. The need to be sensitive to the environment might seem less important if an organization is focused on implementing a plan. I have noticed that, in the face of uncertainty, rather than collect data, decisions might be guided by certain professional biases disguised as competencies. I know that among data scientists, there is sometimes a discussion on the relative merits of evidence-based decision-making versus the use of gut instincts. Well, sometimes a gut instinct is really a professional bias. We give the neural network of the digestive system too much credit.
I started off this blog discussing the push for people to file and handle their tax returns online. I should clarify that I prefer and enjoy doing my taxes electronically. Combining my preference with the underlying rationale - probably to cut costs - I don't question the initial decision. But any mass-service system has to have a means to test for operational problems. This is quite difficult to do without a method of collecting data "outside" the expected parameters. If all problems fell within parameters, these wouldn't really be problems but merely challenging aspects of routine operations. Invariably, there must first of all be a data system to catch the errors, faults, malfunctions, complaints, outages, and difficulties; then there have to be metrics or indicators of reasonable diversity to warn of peculiarities beyond those anticipated during design. The more heavily we rely on institutional responses, the greater the need to incorporate safety, security, and service mechanisms. A response is a poor substitute for responsiveness. A plan that has no data to support adaptation and change probably will not adapt or change. There must always be sensitivity in every institutional process; this cannot be achieved without collecting and making use of substantial amounts of data.
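The two-part idea above - first catch events, then notice when an event matches no anticipated category - can be sketched minimally. The metric names, thresholds, and the "unclassified" bucket are all illustrative assumptions, not a description of any real tax-service system:

```python
# Hypothetical sketch: flag values outside the ranges anticipated at
# design time, and keep a separate bucket for events that match no
# predefined category. Metric names and thresholds are illustrative.
EXPECTED_RANGES = {
    "avg_wait_minutes": (0, 20),
    "dropped_calls_pct": (0, 5),
}

def triage(metric, value, unclassified):
    """Return 'ok' or 'alert' for known metrics; record the rest."""
    if metric not in EXPECTED_RANGES:
        # A peculiarity beyond design-time parameters - the kind of
        # signal a purely institutional response tends to miss.
        unclassified.append((metric, value))
        return "unclassified"
    low, high = EXPECTED_RANGES[metric]
    return "ok" if low <= value <= high else "alert"
```

The design point is the unclassified bucket: without somewhere for unanticipated events to land, the system can only ever confirm the assumptions it was built on.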