
Moving Beyond the 9-to-5


As the pandemic wanes (more or less), the debate about going back to the office versus continuing to work from home remains in full swing. Central to this debate is the question of whether it is, in fact, better for companies that people work from an office than that they work remotely. The answers can be wildly divergent, ranging from those who believe that productive work can only be done in an office, where resources can be consolidated and people can meet face to face to collaborate, to those who see work as better done when the workers essentially control their own schedules and workflows.

To that end, one of the fundamental questions in this debate is what, exactly, it means to be productive. Productivity has been an integral part of the work environment for more than a hundred and twenty years, yet it is also something that is both poorly defined and quite frequently massively misused. To understand this, you have to go back to Frederick Taylor, who first defined many of the principles of the modern work environment around the turn of the twentieth century.

How Frederick Taylor Invented Productivity

Frederick Taylor, Genius or Con Man?

Taylor was an odd character to begin with. He was born to a fairly wealthy family and managed to get admitted to Harvard Law School, but due to deteriorating eyesight, he decided to go into mechanical engineering instead, working first as an apprentice and later as a master mechanic at Midvale Steel Works in Pennsylvania. He eventually married the daughter of the president of the company while working his way up from the shop floor to sales and, eventually, to management.

From there, Taylor began putting together his own observations about how inefficient the production lines were and how there needed to be more discipline in measuring productivity, which at the time meant the number of components that a person could produce in a given period of time. In 1911, he wrote a monograph on the subject called The Principles of Scientific Management, which generalized these observations from the steel mill to all companies.

Taylor’s work quickly found favor in companies throughout the United States, where his advocacy of business analytics, precision time-keeping, and performance reviews seemed to resonate especially well in the emerging industrial centers of the country. At the same time, the data that he gathered was often highly suspect – for instance, he would frequently use the output of the fastest or strongest workers as the baseline for all of his measurements, then recommend that owners dock the pay of workers who couldn’t reach those levels. He also mastered the art of business consulting, pioneering many of the techniques that such consultants would use to sell themselves into companies decades later.

Productivity was one of his inventions as well, and it eventually became the touchstone of corporations globally – a worker’s output could be measured by his or her productivity: the number of goods they produced in a given period of time. Even this measure was somewhat deceptive, however. It was at least in part determined by the automation inherent in an assembly line, and it assumed that the production of widgets was the only meaningful measurement in a society that was even then shifting from agricultural to industrial, while other factors – the quality or complexity of the products, the physical or mental state of the workers, even the stability of the production line – were ignored entirely.

Do Not Fold, Spindle or Mutilate

Productivity In The Computer Age

Automation actually made a hash of productivity early on. An early bottling operation for beer usually involved manually filling a bottle, then stoppering it. A skilled worker could get perhaps a dozen such bottles out in a minute and could sustain that for an hour or so before needing to take a break. By the 1950s, automation had improved to the extent that a machine could fill and stopper 10,000 bottles a minute, a nearly thousandfold increase in productivity. The bottler at that point was no longer performing the manual labor, but simply ensuring that the machine didn’t break down, that the empty bottles were positioned in their lattice, and that the filled ones were boxed and ready for shipment. Timing the bottler for filling bottles no longer made any sense, but the metric persisted anyway.

Not surprisingly, corporations quickly adopted Taylorism for their own internal processes. People were measured by how many insurance claims they could process, despite the fact that an insurance claim required a decision, which meant understanding the complexity of a problem. Getting more insurance claims processed may have made the business run faster, but it did so at the cost of making poorer decisions. It would take the rise of computer automation and the dubious benefits of specialized artificial intelligence to get to the point where semi-reasonable decisions could be made far faster, though the jury is still out as to whether the AI is in fact any better at making those decisions than humans.

Similar productivity issues arise with intellectual property. In the Tayloresque world, Ernest Hemingway was terribly unproductive. He wrote only about twenty books over his forty years as a professional writer, or one book every two years. Today, he could probably write a book a year, simply because revising manuscripts is far easier with a word processor than with a typewriter, but the time-consuming part of writing a book – actually figuring out what words go into it – would take just as long.

Even in the world of process engineering, in most cases what computers have done is reduce the number of separate people handling different parts of a process, often down to one. Forty years ago, putting together a slide presentation was a fairly massive undertaking that required graphic artists, designers, photographers, copywriters, typographers, printers, and so forth, and took weeks. Today, a ten-year-old kid can put together a PowerPoint deck that would have been impossible for anyone to produce earlier without a half-million-dollar budget.

We are getting closer to that number being zero: fill in some parameters, select a theme, push a button, and *blam* – your presentation is done. This means, of course, that there are far more presentations out there than anyone would ever be able to consume, and that the bar for creating good, eye-catching, memorable presentations becomes far, far higher. It also means that Tayloresque measurements of productivity very quickly become meaningless when measured in presentations completed per week.

That’s the side usually left out in talking about productivity. Productivity is a measure of efficiency, and efficiency is a form of optimization. Optimizations reach a point of diminishing returns, where more effort results in less meaningful gains. That’s a big part of the reason that productivity took such a nosedive after the turn of the twenty-first century. Even with significantly faster computers and algorithms, the reality was that the processes that could be optimized had already been so tweaked that the biggest factor in performance gains came right back down to the humans, who hadn’t really changed all that much in the last century.

A forum that I follow posed the question of whether it was better for one’s career to work in the office or to work from home. One commenter noted that people who work remotely may get passed over for promotion compared to someone who comes in early and stays late, because managers don’t see how hard the remote worker is working compared to the office worker. This is a valid concern, but it brings back a memory of when I started working a few decades ago and found myself putting in ten- and eleven-hour days at the office for weeks on end trying to hit a critical deadline. Eventually, I was stumbling in exhausted, and the quality of my work diminished dramatically. I was essentially also giving my employer three additional hours a day at no cost, though after a while, they were getting what they paid for.

Knowledge work, which I and a growing number of people do, involves creating intellectual property. Typically, this involves identifying structure, then building, testing, and integrating virtual components. It is easy to tell at a glance how productive I am, both in terms of quantity (look at the software listings or the article page) and quality (see whether it passes a build process, or read the finished piece). This is true of most knowledge work performed today. If there are questions, I can be reached by email or phone or SMS or Slack or Teams or Zoom or any of a dozen other ways. With most DevOps and continuous integration processes, a manager can look at a dashboard and literally see what I have worked on within the last few minutes.
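
To make that concrete, here is a minimal sketch (my own, not any particular vendor’s dashboard) of the kind of check such a dashboard performs: it asks git for everything committed in the last few hours and groups it by author. It assumes the work lands in a local git checkout and that commit history is a reasonable proxy for recent activity.

```python
# Minimal sketch: list recent commits grouped by author, as a proxy for
# "what has been worked on in the last few hours". Assumes git is installed
# and the script runs inside a checked-out repository (an assumption, not a
# description of any specific DevOps product).
import subprocess
from collections import defaultdict

def recent_commits(since: str = "3 hours ago"):
    """Return {author: [commit subjects]} for commits newer than `since`."""
    out = subprocess.run(
        ["git", "log", f"--since={since}", "--pretty=format:%an\t%s"],
        capture_output=True, text=True, check=True,
    ).stdout
    by_author = defaultdict(list)
    for line in out.splitlines():
        author, _, subject = line.partition("\t")
        by_author[author].append(subject)
    return by_author

if __name__ == "__main__":
    for author, subjects in recent_commits().items():
        print(f"{author}: {len(subjects)} recent commit(s)")
        for subject in subjects:
            print(f"  - {subject}")
```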

In other words, regardless of whether you are working remotely or in the office, a manager has ample tools to ascertain whether a worker is on track to accomplish what they have pledged to accomplish. This is an example of goal-oriented management, and, quite frankly, it is exactly how most successful businesses should be operating today.

The Paycheck Was Never Meant To Measure Time

The Fallacy of the Paycheck and the Time Clock

So let’s talk a little bit about things from the perspective of being a manager. If you have never done it before, managing a remote workforce is scary. Most management training historically has focused on people skills – reading body language, setting boundaries, identifying slackers, dealing with personal crises, and, most importantly, keeping the project that you are managing moving forward. Much of it is synthesizing information from others into a clean report, typically by asking people what they are working on, and some of it is delegating tasks and responsibilities. In this kind of world, there is a clear hierarchy, and you generally can account for the fact that your employees are not stealing time or resources from you *because you watch them*.

I’ll address most of this below, but I want to focus on the last, italicized statement first because it gets into what is so wrong about contemporary corporate culture. One place where Tayloresque thinking embedded itself most deeply into the cultural fabric of companies is the notion that you are paying your employees for their time. This assumption is almost never questioned. It should be.

Until well into the Industrial Age, people typically were paid monthly or fortnightly if they were employees of a member of the nobility or gentry, produced and sold their goods if they were craftsmen or farmers, or were budgeted an account if they were senior members of the church. Oftentimes such payment partially took the form of room and board (or food) or similar services in exchange. Timekeeping seldom entered into it – you worked when there was work to be done and rested when the opportunity arose.

Industrialization brought with it more precise clocks and timekeeping, and you were paid for the time that you worked; because of the sheer number of workers involved, this also required better sets of accounting books and more regular disbursement of funds for payment. It was Taylor who quantized this down to the hour, however, with the natural assumption that you were being paid not per day of work but for ten hours of work a day. This was also when the term work ethic seemed to gain currency, the idea being that a good worker worked continuously, never complained, and never asked for too much, while bad workers were lazy and would steal both resources and time from employers if they could get away with it.

In reality, most work is not continuous in nature but can be broken down into individual asynchronous tasks within a queue. It only becomes continuous if the queue is left unattended for too long or if tasks are added to the queue faster than they can be completed. Office work, from the 1930s to the 1970s, usually involved a staff of workers (mainly female) who worked in pools to process applications, invoices, correspondence, or other content – when a pool worker was done, she would be assigned a new project to complete. This queue-and-pool arrangement basically kept everyone busy, further cementing the idea that an employer was actually paying for the employee’s time, especially since there was usually enough work to fill the available hours of the day.
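
As a back-of-the-envelope illustration of that queue behavior (my own toy model, not anything from the period being described), the sketch below feeds tasks into a queue at one rate and completes them at another: as long as completions keep pace with arrivals, the backlog stays near zero, but the moment arrivals outpace completions, the backlog grows without bound and the work becomes effectively continuous.

```python
# Toy queue model: `arrival_rate` tasks arrive per hour and up to `service_rate`
# can be completed per hour. If arrivals outpace completions, the backlog grows
# every hour and the worker never sees an empty queue.
def backlog_over_time(arrival_rate: float, service_rate: float, hours: int):
    backlog = 0.0
    history = []
    for _ in range(hours):
        backlog = max(0.0, backlog + arrival_rate - service_rate)
        history.append(backlog)
    return history

# Stable: 8 tasks arrive per hour, 10 can be completed -> the queue stays empty.
print(backlog_over_time(8, 10, 5))   # [0.0, 0.0, 0.0, 0.0, 0.0]
# Unstable: 12 arrive per hour, 10 can be completed -> backlog grows by 2 each hour.
print(backlog_over_time(12, 10, 5))  # [2.0, 4.0, 6.0, 8.0, 10.0]
```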

That balance shifted in the 1970s and 80s as the impact of automation began to hit corporations hard. The secretarial pool had all but disappeared by 1990 with the advent of computers and networking. While productivity shot up – fewer people were doing much more “work” in the sense that automation enabled far more processing – people began to find themselves with less and less to do, which made it possible for companies to eliminate or consolidate existing jobs. A new generation picked up programming and related skills, and the number of companies exploded in the 1990s as entrepreneurs looked for new niches to automate and as the barrier to entry for new companies dropped dramatically.

By focusing on demonstrable goals rather than “seat-time”, organizations can become more data-oriented.

The WFA Revolution Depends Upon Goals and Metrics

Since 2000, there have been three key events that have dramatically changed the landscape for work. The first was the rise of mobile computing, which has made it possible for people to work anywhere there is a network signal. The second was the consolidation of cloud computing, moving away from the requirement that resources be on premises. Finally, the pandemic stress-tested the idea of work virtualization in a way that nothing else could have, likely forcing the social adoption of remote work about a decade earlier than it would have happened otherwise.

Productivity through automation has now reached a stage where it is possible to

  1. get reliable metrics based upon work completed towards specific goals, regardless of time specifically spent (see the sketch after this list),
  2. automate those tasks which do not in fact require more than minimal human intervention,
  3. get access to resources needed to accomplish specific tasks, regardless of where those tasks are accomplished,
  4. provide a superior environment for meeting virtually across multiple time zones, creating both a video and a transcript artifact of such meetings,
  5. provide tools for collaborating in the same way, either synchronously or asynchronously (addressing the water cooler problem),
  6. ensure that information remains secure, and
  7. provide a set of eyeballs on evolving situations anywhere in the world at any time.
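
As a sketch of what the first point might look like in practice – the record fields and goal names here are hypothetical, not drawn from any particular tool – a goal-oriented metric can be computed entirely from completed work items, with hours worked never entering the calculation:

```python
# Hypothetical goal-tracking records: each work item carries a goal, a "done"
# flag, and a due date. Time spent is deliberately absent from the metric.
from dataclasses import dataclass
from datetime import date

@dataclass
class WorkItem:
    goal: str
    done: bool
    due: date

def goal_progress(items):
    """Return {goal: fraction of items completed} -- a time-free productivity metric."""
    totals, completed = {}, {}
    for item in items:
        totals[item.goal] = totals.get(item.goal, 0) + 1
        completed[item.goal] = completed.get(item.goal, 0) + (1 if item.done else 0)
    return {goal: completed[goal] / totals[goal] for goal in totals}

items = [
    WorkItem("Q3 API release", True, date(2021, 9, 1)),
    WorkItem("Q3 API release", False, date(2021, 9, 15)),
    WorkItem("Docs refresh", True, date(2021, 8, 20)),
]
print(goal_progress(items))  # {'Q3 API release': 0.5, 'Docs refresh': 1.0}
```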

Put another way – remote workforce productivity is not the issue here.

Most people are far more productive than they have ever been, to the extent that it is becoming harder and harder to fill a forty-hour week most of the time. I’d argue that when an employer pays an employee, what they should be doing is spreading a year-long payment into twenty-six chunks, paying not for the time spent but for the availability of the expertise. That the workweek is twenty hours one week and fifty hours the next is irrelevant – you are paying a salary, and the actual number of hours worked is far less important than whether the work is being done consistently and to a sufficiently high standard. This was true before the pandemic, and if anything it is more true today.

In the 1970s, businesses began pushing for labor laws that let companies classify part-time workers as hourly – this meant that, rather than having a minimum guaranteed total annual income, such workers were paid only for their time on-premises. Such workers (who were also usually paid at or even below minimum wage) would typically be the ones to bear the brunt if a business had a slow week, but were also typically responsible for their own healthcare and ineligible for other benefits. In this way, even if on paper they were making $30,000 a year, in reality such workers’ actual income was likely half that, even before taxes. By 1980, labor laws had effectively institutionalized legalized poverty.

After the pandemic, companies discovered, much to their chagrin, that their rapid shedding of jobs in 2020 came back to bite them hard in 2021. Once people have a job, they develop a certain degree of inertia and will often refuse to look for other work simply because switching jobs is always somewhat traumatic. This also tends to depress wage growth within companies, because most companies will only pay a person more (and even then often only minimally) if they also take on more responsibility – in other words, new hires generally make more than existing workers in the same positions.

At the bottom of the pandemic bust, more than 25 million people were thrown out of work, a deeper drop even than during the Great Depression. The rebound was fairly strong, however, and it meant that suddenly every company that had jettisoned workers was trying to rehire all at once. For the first time in a generation, labor had newfound bargaining strength. This also coincided with the long-overdue generational retirement of the Boomers and the subsequent falloff in the number of GenXers, a generation about 35% smaller than the one before it. Demographic trends hint that the labor market is going to favor employees over employers for at least the next decade.

Given all that, it’s time to rethink productivity in the Work From Home era. The first part of this is to understand that work has become asynchronous, and, ironically, it’s healthier that way. There will be periods when employees are idle and others when they are very busy. Most small businesses implicitly understand this – restaurants (and indeed, most service-economy jobs) have slack times and busy times. Perhaps it is time for “hourly” workers to go back to being paid salaries again. This way, if someone is not needed on-site at a particular time, sending them home doesn’t become an economic burden for them. On the flip side, that also puts the onus on the worker to remain reachable, in any of a number of ways, should things get busy again.

Once you move into the knowledge economy, the avenues for workers become more open. The salary model holds here as well, but so too does the notion of being available at certain times. I’ve actually seen an uptick in the number of startup companies that use Slack as a way of managing workflow, even in service-sector work, as well as for indicating when people need to be in the office versus simply working on projects.

I am also seeing the emergence of a 3-2-2 week: three days that are specifically set aside for meetings, either onsite or over telepresence channels such as Zoom or Teams; two days where people may be on call but generally don’t have to meet and can focus on getting the most done without meetings interrupting their concentration; and two days that are considered “the weekend”. When workloads are light (such as during summers or winter holidays), this can translate into “light” vacations, where people are just putting in a couple of hours of work a day during their “Fridays” and are otherwise able to control their schedules. When workloads are heavy (crunch time), work can even bleed into the weekend, so long as it doesn’t go on for an extended period of time.

Asynchronous, goal-oriented, and demonstrable project planning also becomes more critical in the Work From Home era. This, ironically, means that “scrum”-oriented practices should be deprecated in favor of attaching work products (in progress or completed) to workflows – whether that’s updating a Git repository, publishing a blog, updating a reference standard, or designing media or programmatic components. Continuous integration is key here – use DevOps processes to ensure that code and resources reflect the current state of the project and provide a tracking log of what has been done by each member of a given team.
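
One low-effort way to get that per-member tracking log – assuming, as a simplification, that the team’s work lands in a Git repository; design or media work may well live elsewhere – is to summarize commit counts per author over the current iteration:

```python
# Sketch: per-member activity summary from Git history over a recent period.
# Assumes the script is run inside the team's repository; the default period is
# illustrative, not a recommendation.
import subprocess

def commits_per_member(since: str = "2 weeks ago"):
    """Return {author: commit_count} for commits newer than `since`."""
    out = subprocess.run(
        ["git", "shortlog", "-s", "-n", f"--since={since}", "HEAD"],
        capture_output=True, text=True, check=True,
    ).stdout
    summary = {}
    for line in out.splitlines():
        count, _, author = line.strip().partition("\t")
        summary[author] = int(count)
    return summary

if __name__ == "__main__":
    for author, count in commits_per_member().items():
        print(f"{author}: {count} commit(s) in the period")
```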

Micromanagement, abusive behavior, and political games – is it any wonder people are staying away from the office?

Management Needs to Evolve Too, Or Be Left Behind

For production teams, this should be old hat, but it is incumbent upon management to work in the same way, and ironically, this is where the greatest resistance is likely to come from. Traditional management has typically been more face-to-face in its interactions (in part because senior management has also traditionally been more sales-oriented). The more senior the position, the more likely that person will need comprehensive real-time reporting, and the more difficult (and important) it is to summarize the results from multiple divisions and departments.

Not surprisingly, this is perhaps the single biggest benefit of a data-focused organization with strong analytics: it makes it easier for managers to see in the aggregate what is happening within an organization. It also makes it easier to see who is being productive, who needs help, and who, frankly, needs to be left behind – a group that includes more than a few of those same managers.
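
A crude illustration of that aggregate view – the team names and status values below are hypothetical; a real organization would pull these records from its project or ticketing system – is nothing more than a group-by over work-item records:

```python
# Hypothetical roll-up: count work items by (team, status) so a manager can see
# at a glance which teams are shipping and which are blocked.
from collections import Counter

records = [
    {"team": "Platform",  "owner": "A", "status": "done"},
    {"team": "Platform",  "owner": "B", "status": "blocked"},
    {"team": "Analytics", "owner": "C", "status": "done"},
    {"team": "Analytics", "owner": "C", "status": "done"},
]

rollup = Counter((r["team"], r["status"]) for r in records)
for (team, status), count in sorted(rollup.items()):
    print(f"{team:<10} {status:<8} {count}")
```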

https://www.theatlantic.com/ideas/archive/2021/07/work-from-home-be…

You cannot talk about productivity without also talking about non-productivity. This doesn’t come from people who are genuinely trying but are struggling due to a lack of resources, training, or experience. One thing that many of these same tools can do is highlight who those people are without putting them on the spot, so that a good manager can then either assign a mentor or make sure they get the training.

Rather, it comes from those workers who have managed to find a niche within the organization where they don’t actually do much that’s productive but seem to be constantly busy. Work from home may seem ideal for them, but if you assume that it also involves goal-oriented metrics, it actually becomes harder to “skate” when working remotely, because there is a requirement to have a demonstrable product at the end of the day.

Finally, one of the biggest productivity problems with WFH/WFA has to do with micromanagement as compensation for being unable to “watch” people at work. This involves (almost physically) tying people to their keyboards or phones, monitoring everything that is done or said, and then using lack of “compliance” as an excuse to penalize workers.

During the worst of the pandemic, stories emerged of companies doing precisely this. Not surprisingly, those companies found themselves struggling to find workers as the economy started to recover, especially since many of them had a history of underpaying their workers as well. Offices tend to create bubble effects – people are less likely to think about leaving when they are in a corporate cocoon than when they are working from home, and behavior that might be prevalent within offices – gaslighting, sexual harassment, bullying, overt racism, bosses not crediting their workers, and so forth – can be seen far more readily as unacceptable from outside the office than from within the bubble.

There are multiple issues involved with WFH/WFA that do come into play, some of them legitimate. However, making the argument that productivity is the reason companies want workers to come back to the office is at best specious. While it is likely more work for managers, a hybrid arrangement – where the office essentially becomes a place where workers congregate when they do need to gather (and those times certainly exist) – is likely baked into the cake by now, especially as the COVID Delta variant continues to rage in the background. It’s time to move beyond Taylorism and the fallacy of the time clock.

Kurt Cagle is the Community Editor of Data Science Central, a TechTarget property.