How the Computer Emancipated the American Corporation

The Information Age Empowers Workers and Disempowers Managers

Larry Schweikart teaches history at the University of Dayton.

It’s pretty common knowledge that we have entered the “information age” and that information technologies have dramatically changed business in America and in the rest of the world. Currently, there is a heated debate raging about the standard of living in the United States—particularly in the middle class—and the degree to which computers have raised it, if at all. How one views the information revolution tends to shape the response to this issue. In fact, though, the most significant changes associated with the introduction of the computer are often misidentified as deriving from other factors. This, in turn, has obscured the most significant trend in American business and economic history in the last century, namely, that the computer has emancipated the American corporation from a century of statist-oriented, planning-centered managers.

First, it is worthwhile to see where we are in the debate about the “information revolution” in the year 2001. The public discussion about the impact of the computer has tended to focus on blue-collar wages, middle-class living standards, inflation, and unemployment. On one side are those who claim that the middle class is falling behind and that real wages have not risen commensurately with either overall economic growth or with productivity increases. Proponents of this view maintain that the U.S. economy has drifted back toward inflation and that government expenditures have not been controlled, even by six years of Republican congresses. Those on this side of the argument include Pat Buchanan, Richard Gephardt, and Robert Reich.

When the issue is viewed purely in terms of wages—especially blue-collar paychecks—these critics are completely right. Few would doubt that steel or auto workers in the 1970s had far higher real earnings (after adjusting for inflation) than their counterparts do today, especially after factoring in their benefit packages. David Halberstam’s book about the decline of the American auto industry, The Reckoning, noted that in the early 1970s, line auto workers had two cars, a boat, and a vacation house on the lake.

But following the steel and auto shakeout in the 1980s, in which thousands of employees were fired, most never to be rehired, those who remained had to give back benefits and/or settle for wage limitations. Nevertheless, American business shifted steadily into an “information economy,” and even in hard-core manufacturing areas, such as steel, the successful companies used computers and robotics to achieve important productivity gains.

It is also true that many service workers do not earn the wages of those in unionized steel and auto manufacturing, although critics emphasize the counter help and secretarial jobs far too much over the attorneys, accountants, software engineers, production designers, and other “service” occupations that make up the meat of the new economy.

Overall, this view misses the fact that real wealth gains have occurred through savings—through the IRAs and other pension/retirement accounts that were sheltered from outrageous taxation over the last 20 years. With the stock market boom of the 1990s, Americans saw this element of their wealth rise 20 percent, 30 percent, or higher. Without question, many were left out, especially those in jobs without pension funds or those whose paychecks did not leave enough to invest. While this group may have been a sizable minority, it nevertheless was a minority. Statistics show that a majority of Americans now have investments in the stock market. To ignore this is to deny reality.

The other part of the wealth gain for the middle class has come in the form of housing values, which also have risen. While it is difficult to monetize the gains from one’s house—people usually don’t want to move just to make a quick buck—these gains are nonetheless real, and to exclude them distorts the financial picture. Once both these gains are factored in, middle-class Americans have gained ground.

In fact, while economists and policymakers have argued about what the new economy is doing to the middle class, they have been blind to the most significant business change in our generation, namely, the demise of the “visible hand” of managerial hierarchies. The term “visible hand,” derived from Harvard business historian Alfred Chandler’s 1977 prize-winning book of that title, refers to the active role of managers in controlling and (to use Chandler’s favorite word) planning the economy. To appreciate the dynamic and earthshaking transformation of business—and the liberation offered by the computer—it is worthwhile to review Chandler’s hypothesis.

Separating Ownership and Management

Chandler argued that owner-operated businesses proved inadequate to handle the speed, scale, and scope of technological change in the 1850s. This was especially obvious, he claimed, in railroads, where the sheer mileage and difficulty of maintaining schedules made it impossible for one owner to direct the firm’s affairs. Railroads responded by separating ownership from management. By that time, the owners were usually stockholders because the railroads’ capital needs were so great that they had to issue securities. The stockholders then elected a president or chairman who would direct the company’s activities.

This separation of ownership and management had several implications. First, the managerial class (Chandler claimed) began to exert its control over production so as to smooth out the unexpected effects of suppliers outside the firm’s ownership. Chandler called this process the “visible hand” of management, which he said replaced the “invisible hand” of the market. Leaving aside for a moment the truth of that assertion, Chandler correctly observed that the managerial “hierarchies” (top management, middle management, and so on) soon saw control of the product as the most effective means of competition—far better than focusing on driving competitors out of business. The implication of this was enormous: the managerial hierarchies became extremely conservative, preferring a reliable 2 percent annual profit to unforecastable sharp swings between high profits and deep losses.

Lest that seem unreasonable, it is necessary to recognize that the manager mentality centered almost exclusively on efficiency gains and productivity within the corporation. Managers saw their central task as controlling their product and planning its production. Thus they engaged in “backward integration” to obtain sources of raw materials—cattle farms, iron ore pits, and so on—and “forward integration” to purchase retailers. In theory, a sharp manager could control the flow of production from its origins in an ore pit to its sale in a store. Competitors became increasingly less important to managers: like a successful football coach, managers thought that if their “teams” ran the plays “the right way,” they would work every time.

Therefore, corporations became much more conservative and less willing to take risks. Even research-and-development (R&D) departments were locked into this mentality, so that they essentially made only marginal improvements to the firm’s existing products—never a true revolution or genuinely radical product. Worse, perhaps, the “numbers guys” started to dominate corporations over the “production guys.” Accountants, financial divisions, and managers with a facility with statistics had a huge advantage over people who knew the business and understood the necessary touch of Zen required to turn out successful products, but who were at a disadvantage in executive meetings when challenged by the green-eyeshade crowd.

This has seriously affected corporations’ ability to make radical advances or develop truly new products, at least deliberately. Burton Klein, looking at the top 50 technological breakthroughs of the twentieth century in the United States, found that not one came from the leader in the field. Rather, all the breakthroughs came from unknowns, some of them not even in the same field. For example, Henry Ford was not a buggy manufacturer; the Wright brothers were not balloon makers; and, more recently, the personal computer did not come from any of the established companies in the computing field. This makes complete sense if one accepts Chandler’s premise that corporations became defensive under the managers.

Managers, Information, and the Need to Know

There was another, perhaps more important, aspect of the managers that Chandler seems to miss. Managers added value by being facilitators of information—conduits for moving data on a need-to-know basis from the top of the corporation down. In the nineteenth century this made sense. Employees often were uneducated, many of them coming straight from Europe with poor language skills. Moreover, the process of information transfer was painstakingly slow: the telegraph was the fastest form of communication until about 1900, although telephones had started to make inroads in large cities. But one could not rely on phones to transmit scheduling information or sudden production changes to more remote areas until nearly the turn of the century. Banking was still largely handled by mail, with accounts handwritten and entries balanced by hand at the end of the day. In short, the combination of slow communications and an uneducated workforce made it reasonable and efficient for managers to make decisions about which information to sift down to the employees.

While this changed somewhat between 1900 and 1960, the growth of the corporations and the expansion of markets worldwide hid many of the inefficiencies in this system. Indeed, large corporations had started to squeeze greater efficiencies out of their managerial structures through bonuses, perks, and a corporate culture that rewarded loyalty and conformity. The infamous Man in the Gray Flannel Suit and Organization Man, while appropriate in their concerns about the standardization of life, missed the economic logic of such managerial hierarchies. And despite widespread use of telephones, corporations still depended almost exclusively on the transmission of data and instructions by paper. The interoffice memo became an icon of corporate life in the 1950s.

What was no longer true, however, was that the workforce was either uneducated or uninterested in the firm’s activities. Employees now had the ability to process data and analyze information for themselves, but had no way to obtain it except through the top-down managerial structure. By the 1970s, perhaps the opposite of the late-1800s situation had surfaced: large numbers of workers were actually overeducated for the tasks they were assigned. Yet the company still ran according to the nineteenth-century model, with the managerial hierarchies treating even relatively high-level executives as mere receptacles of information, which the managers, in their Zeus-like positions, dispensed from on high. American productivity in steel, autos, electronics, and other sectors began to wilt in the 1970s—for a host of reasons. But among them was a managerial design that was simply obsolete.

The computer pushed this teetering structure over the edge, especially after the advent of the personal computer in the mid-1970s. But already, the copy machine had made it possible for many employees to read and absorb previously “internal” documents. Guerrilla efforts to leak corporate documents (for a variety of motivations) showed how futile it would be for firms to try to keep the lid on the information explosion much longer. By the 1980s, personal computers had started to filter into almost all corporations. Then, by the late 1980s, these computers linked employees together through electronic mail. What one employee had, others could get. Soon, only the most secure information about the company was inaccessible.

With the rise of the Internet, though, virtually all information was available. Even quasi-secure corporate information often was pried out by Web sites and hackers; and even without that specific data, rank-and-file employees could get all but the highest-level information about a company. In short, the managers’ role as conduits of information was sharply compromised, if not eliminated altogether.

Actually, it was worse: the managers, now trying to move information to the divisions that needed it most, found themselves completely overloaded. As they tried to separate “important” from “trivial” data, they applied a standard of judgment that went beyond the abilities of even exceptional men and women. There was simply too much information to sift through, and not enough time. Managers became bottlenecks, not transmitters, of information. In human terms, they became “switches,” unable to direct or route commands fast enough, outflanked by the “wires” of their employees who had the information.

The corporations’ profit sheets told them something was wrong, but, typically, companies could not easily identify the source of the change. They knew the trouble rested in the ranks of management, but could only assess the problem in terms of “lower productivity” or “falling effectiveness”; they did not know why. Hence the great white-collar shakeout of the 1990s, replete with the Newsweek cover “Corporate Killers” and the New York Times series about the “battlefield of business” where there were “casualties.”

Liberation Technology

In fact, what had happened was that the computer had liberated the corporation. Throughout the 1980s, management gurus encouraged American companies to restyle themselves in the image of the “more efficient” Japanese. “Kaizen”-style management was touted as a silver bullet for American industry’s falling productivity. There was an element of truth to this, although the analysis often missed the essential dynamism of the Japanese system: it encouraged employees to give constant feedback about the production processes.

However, most of those calling for Japanese management practices located the reason for American decline in the habits and character of the managers themselves. They tended to accept uncritically Japan’s own propaganda about “Samurai management.” In fact, the Japanese advantage lay at a relatively low level: firms had identified the benefits of rapid information transmission in both directions—from the bottom up and from the top down. Significantly, Japan’s Ministry of International Trade and Industry tended to block further information transmission from corporations (that is, the market) upward. Put another way, the Japanese government managers often did not follow the successful practices adopted by business at the lower levels. Over time, though, considerable discipline was imposed by the securities markets. In a sense, Japan’s fade in the 1990s, and the collapse of its securities base, reflected at a higher plane the same forces that were producing the white-collar layoffs in America: information was being blocked or ignored, and managers interfered with its transmission.

Keep in mind that information is neutral, making itself available to whoever chooses to apply it. American companies, whether they understood the phenomenon fully or not, sensed the productivity implications in the 1990s. This helped fuel the remarkable stock market boom that only recently has receded. More than anything, the Great Bull Market of the ’90s was a tech market—an information market. Though the market was consistently pegged as “overvalued,” the fact is that until industry fully understands how much information it can process, and how much that information will improve productivity, no one knows what the value of the market “should be.” That has led columnist James Glassman to write a book asking whether a “Dow 36,000” was in sight. Tech guru George Gilder continues to argue that the stock market doesn’t even begin to properly value America’s corporate worth, let alone the impact of the new technologies.

The computer has liberated the corporation from the tyranny of the managers, who had imposed a planning-oriented model of low-growth expectations on it. Corporations rightly were criticized as falling under the sway of the “bean counters,” a Robert McNamara-esque generation of “numbers men” (and now, “numbers women” too) whose deity is the balance sheet. But entrepreneurs know that much business success comes from “hunches,” a sense of timing, and intimate knowledge of the customers.

Keeping corporations entrepreneurial is a task well-suited to the computer culture, because it shifts ordering, marketing, and sales to the point of contact in the market itself. Admittedly, there are important costs, and the computer is no panacea. Computerized checkout is often blamed for eliminating the friendly conversations between cashiers and customers who knew each other’s names, but in fact that connection was severed years ago by the optical scanner, which all but eliminated small talk. Since virtually any person could be trained on a scanner quickly, it also eliminated the middle-aged, well-paid cashiers of the type who used to work with me at the family-owned grocery store. Now, tattooed and pierced teens perform those functions at much lower salaries.

But here is where the manager again can emerge to reclaim an important role: the manager now is more than a supervisor of cashiers or checkout people. He is the point of sale to the consumer. Again, this has costs and benefits. The cost is that the manager can no longer rely on a facility with numbers or a sterile balance sheet to justify his employment. Rather, the computer has re-imposed on the company the demand that managers actively represent the firm to the public, again standing in for the owner in the owner’s absence. Harvard Business School types who thought they would never have to deal with people because they were “managers” will get a reality bath. The manager of the future will be all about dealing with people, especially customers.

But employees, as well, are now liberated in a sense. If wages have fallen, access to information has empowered workers. Ambitious workers—but only the ambitious—will find that the access provided by computerization of inventories, records, sales, and so on lays at their feet the guts of the firm’s activities. Certainly not all will have either the determination or the smarts to take advantage of such openness. For those who do, however, the world is their oyster.

At first, it might be argued that the effect of this is to create a “two-tiered system” in which a firm has large numbers of low-wage employees at the bottom and a handful of highly paid executives at the top. Indeed, this structure might exist for a brief period, until the corporation realizes that long-term success involves educating and motivating those low-wage employees to take advantage of the information at their fingertips. This not only could return “control of the workplace” to the “shop floor,” in a cyber sense, but could also revive the owner-operator at the top levels. This is seen (with a vengeance) in the Silicon Valley firms, where the “employees” are really combinations of workers, owners, and managers.

Granted, not all businesses lend themselves to this dynamic; or at least, so it seems today. But given the remarkable changes in business over the last 100 years, who’s to say that it will not become the working model of the future corporation? At any rate, the damage is done (from the perspective of the managerial hierarchies), or the blessings are bestowed (from the point of view of a George Gilder). There is no going back. The question is, now that they are essentially emancipated from the tyranny of the managers, what will the corporations of the 21st century do with their freedom?