The Rule of Three
Surviving and Thriving in Competitive Markets
By Jagdish Sheth and Rajendra Sisodia
About The Book
Name any industry and more likely than not you will find that the three strongest, most efficient companies control 70 to 90 percent of the market. Here are just a few examples:
- McDonald's, Burger King, and Wendy's
- General Mills, Kellogg, and Post
- Nike, Adidas, and Reebok
- Bank of America, Chase Manhattan, and Banc One
- American, United, and Delta
- Merck, Johnson & Johnson, and Bristol-Myers Squibb
Based on extensive studies of market forces, the distinguished business school strategists and corporate advisers Jagdish Sheth and Rajendra Sisodia show that natural competitive forces shape the vast majority of companies under "the rule of three." This stunning new concept has powerful strategic implications for businesses large and small alike.
Drawing on years of research covering hundreds of industries both local and global, The Rule of Three documents the evolution of markets into two complementary sectors -- generalists, which cater to a large, mainstream group of customers; and specialists, which satisfy the needs of customers at both the high and low ends of the market. Any company caught in the middle ("the ditch") is likely to be swallowed up or destroyed. Sheth and Sisodia show how most markets resemble a shopping mall with specialty shops anchored by large stores. Drawing wisdom from these markets, The Rule of Three offers counterintuitive insights, with suggested strategies for the "Big 3" players, as well as for mid-sized companies that may want to mount a challenge and for specialists striving to flourish in the shadow of industry giants. The book explains how to recognize signs of market disruptions that can result in serious reversals and upheavals for companies caught unprepared. Such disruptions include new technologies, regulatory shifts, innovations in distribution and packaging, demographic and cultural shifts, and venture capital as well as other forms of investor funding.
Years in the making and sweeping in scope, The Rule of Three provides authoritative, research-based insights into market dynamics that no business manager should be without.
Excerpt
Chapter 1: Four Mechanisms for Increasing Efficiency
In 1966, the U.S. Supreme Court refused to allow two supermarkets in Los Angeles to merge. The Vons Grocery Company and Shopping Bag Food Stores, had they been allowed to combine, would have controlled a whopping 7.5 percent of the market. Over 3,800 single-store grocers would still have been doing business in the city. In spite of these statistics, the Court ruled against the merger, citing "the threatening trend toward concentration."
Much has changed in the public's perception of merger activity in the decades since the Supreme Court's ruling in the Los Angeles supermarket case. Over time, the view that market efficiencies matter and that consumer welfare is actually enhanced by a measure of industry concentration has slowly gained acceptance, although there still are loud complaints from consumer groups that this or that merger will result in higher prices. In truth, markets remain highly competitive even after such concentration, and industries that have experienced consolidation have seen prices remain stable or actually fall. To be sure, profits are generally higher in concentrated industries, but the prices consumers pay may actually decline. This evidence suggests that efficiency gains are a prime driver of greater profitability and market evolution.
For that evolution to be sustainable, markets need both growth and efficiency. Growth comes primarily from understanding and shaping customer demand, whereas efficiency is a function of operations. Through the cyclical pursuit of these objectives, markets become organized and reorganized over time.
Once its basic viability has been established, a start-up industry enjoys high growth but has low efficiency. No matter what criterion is used to measure efficiency -- revenue per customer, revenue relative to assets deployed, revenue per employee, for instance -- the start-up costs are high. The first shakeout occurs during the industry's initial growth phase to make it more efficient without sacrificing growth. Subsequent attempts to make the industry more efficient come from four key sources or events: the creation of standards, the development of an industry-wide cost structure as well as a shared infrastructure, government intervention, and industry consolidation through shakeouts. These four drivers force the industry as well as the players in it to become more and more efficient in order to stay competitive. As we will see in this chapter, they can occur at any time and in any order, sometimes independently, sometimes closely dependent on each other. Their primary effect, however, is to promote efficiency and fair competition within an industry such that no one company becomes a monopoly.
In subsequent shakeouts, the industry is reorganized for growth, typically through market expansion, including globalization. Driven primarily by investor demands, companies at this stage are concerned with growth of all kinds: revenue growth, cash flow growth, earnings growth, growth in the number of customers and revenue per customer, and growth in market capitalization. To continue to attract investment capital and growth, the industry needs to make productive use of all inputs, including capital, labor, and management talent.
The Creation of Standards
Market inefficiency can hasten the creation of de facto standards. Henry Ford paved the way for one such standard when he devised the highly efficient assembly-line manufacturing process for the Model T. Bill Gates was fortunate indeed when Microsoft received the nod from IBM and others to make the MS-DOS operating system the standard for personal computers. Once that standard was set, even Big Blue, known primarily for its hardware, could not wrest away control with its proprietary OS/2 system.
When standards play a major role and remain largely proprietary, there may not be room for three separate platforms. Typically, at most two platforms can survive in the broad market: VHS and Beta for video recorders, VHS-C and 8mm for camcorders, PAL and NTSC for television broadcasts, CDMA and GSM for wireless telephony, PC and Mac for personal computing. Eventually, one platform becomes dominant, if not universal. Thus, 8mm has a big lead over VHS-C, PCs have triumphed over the Mac, and VHS has overwhelmed Beta. The other platform, if it survives, is relegated to a niche market.
The simultaneous existence of two or more standards, as in the case of NTSC and PAL, can be attributed in large part to protectionist ideologies and government regulation. Thanks to a double standard in the worldwide electric industry, tourists must contend with shifting between 110 volts and 220 volts, not to mention remembering to pack a variety of prongs and socket styles; in Europe alone there are some 20 different types of electrical plugs currently in use. To the delight of those tourists, these types of essentially meaningless and highly inefficient differences will start to go away as the electric industry adopts universal standards and the world at large becomes more driven by market economies. The cost of converting to a new single standard, however, is estimated to be $125 billion!
Already we can see the power of a fully adopted worldwide standard in the World Wide Web. The extraordinarily rapid diffusion of this technology across the globe has resulted in large measure because of that single standard. Emerging industries today are highly cognizant of this fact, and organizations that set industry standards now occupy an influential place in the world economy. The impact of evolving standards is illustrated by the stories of the evolution of the VCR industry and the development in Europe of the Group Special Mobile (GSM) network.
The VCR Industry
Based in Redwood City, California, the Ampex Corporation invented video tape recorder (VTR) technology in 1956. It sold machines to professionals initially for $75,000, but it was never successful in creating a product for ordinary consumers. However, it was successful in licensing its technology to Sony, which turned it into a competitive advantage.
Sony first introduced videocassette recorders (VCRs) to the mass market in 1971, but even its "U-Matic" machines and cassettes were too big and expensive. Accordingly, Sony made modifications and repositioned the machines for industrial users. Next, Sony approached JVC and Matsushita -- two of its biggest competitors -- about establishing a standard (based on a new Sony technology) that would reduce the size of both machines and cassettes. JVC and Matsushita would accept only the U-Matic format, and JVC refused to cooperate or compromise on technology for smaller machines.
In 1971, JVC established a video home system (VHS) project team and charged it with the mission to develop a viable VCR for consumers -- not just one that was technologically possible, but something consumers would prefer. Sony, meanwhile, experimented with ten different ways of building a home VCR and settled by mid-1974 on the Betamax prototype. It set up a new plant to produce 10,000 units a month, but designed the machine to record for only one hour, reasoning that customers would use it to record television programs for later viewing. Later, when Sony asked Matsushita and JVC to adopt the Beta format, both refused, citing the one-hour recording limit as a major drawback. JVC's VHS format, then in development, would deliver up to three hours. After the Betamax was launched, Hitachi tried unsuccessfully to license Betamax technology from Sony, which basically had decided to go it alone.
Meanwhile, JVC formed an alliance of companies around the VHS standard before it shipped any products. The group included Matsushita, Hitachi, Mitsubishi, Sharp, Sanyo, and Toshiba. The standards war was on, and not even the intervention of Japan's Ministry of International Trade and Industry (MITI) in 1976 could succeed in resolving the dispute.
After JVC's launch in October 1976, Sony recruited Sanyo and Toshiba to join the Beta group. The split between the two formats continued for another ten years. Sony did well at first, in part because of its wide distribution. In 1976 and 1977, its market share was over 50 percent, but the company lost ground quickly. By late 1978, Matsushita, with 35.8 percent of the market, overtook Sony, whose share had slipped to 27.9 percent. By 1988, VHS had close to 95 percent of world sales. In a show of pragmatism, Sony launched its own line of VHS machines and repositioned Betamax as a high-end system for professionals.
Group Special Mobile (GSM) Network
The development of the Group Special Mobile (GSM) network has been an essential element in the success of European wireless companies such as Finland's Nokia and Sweden's Ericsson. Analog cellular telephone systems grew rapidly in Europe in the 1980s, especially in Scandinavia and the United Kingdom. Because each country developed its own sophisticated systems and networks, the industry was characterized by incompatible equipment and operations. Since mobile phones could operate only within national boundaries, the limited market for each company's equipment meant that economies of scale were poor. It was not unusual to see executives toting multiple phones depending on the country in which they happened to be conducting business at the time. The imminent creation of the European Union (EU) made this highly inefficient situation untenable.
In 1982, Nordic Telecom and Netherlands PTT proposed to the Conference of European Posts and Telegraphs (CEPT) that a new digital cellular standard be developed that would improve efficiency and help the industry cope with the explosion of demand across all of Europe. The CEPT established a body known as Group Special Mobile to develop the system. Members of the European Union were instructed to reserve frequencies in the 900 MHz band for GSM to enable easy "roaming" between countries. In 1989, the European Telecommunications Standards Institute (ETSI) offered GSM as an international digital cellular telephony standard.
GSM service commenced in mid-1991. By 1993, there were 36 GSM networks in 22 countries. GSM was successful in gaining acceptance in non-European markets as well, since it was the most mature mobile digital technology. GSM also proved very successful in Asia, with its huge untapped markets that had no analog legacy to overcome. By 1997, over 200 GSM networks were running in 110 countries, with more than 55 million subscribers. As of January 2001, 392 GSM networks were operational in 162 countries, with dozens more planned. GSM had 457 million subscribers (up from 162 million a year and a half earlier) out of 647 million digital subscribers; another 68 million subscribers continued on analog systems.
The biggest holdout has been the United States, where the government has played no role in selecting a standard, and where a major rival to GSM, CDMA, has won many converts. Overall, the U.S. market is split among three standards: CDMA, GSM, and TDMA (a standard similar to GSM, but incompatible with it). By September 2000, CDMA had 71 million subscribers, whereas TDMA claimed 53.5 million. Each network operates independently of the others.
While many technology experts argue that CDMA is a superior technology, the advantage appears to be with Europe at this point. Simply put, GSM phones are much more usable worldwide. This wider usage base has allowed Europe to move ahead in phone functionality. Nokia is leading the charge, pioneering Internet access on cell phones. Through infrared technology, phones can transmit data to each other or to a machine; in Finland, this technology can be used to purchase a Coke from a soda machine. CDMA's acknowledged technological superiority is similar to that enjoyed by Betamax and the Macintosh. As history has taught us, neither was able to prevail.
The battle between CDMA and GSM may well be settled as we move to the next generation of wireless technology: so-called 3G or third-generation wireless systems featuring very high data transmission rates that will allow for two-way video communication. It is expected that most mobile operators will converge on a single worldwide standard for 3G systems.
Industry Cost Structure and Shared Infrastructure
The prevailing cost structure in an industry -- those costs primarily related to production, and to some extent to management and marketing -- has a deep impact on whether and how soon that industry becomes organized. This impact can be measured in terms of the relative significance of the industry's fixed costs versus its variable costs. As an industry emphasizes automation, incorporates new technology, and tries to mitigate the high or growing cost of human capital, it tends to increase fixed costs and lower variable ones.
Participation in an industry always has certain requisite fixed costs. In all aspects of business -- from procurement to operations to marketing -- relative market share determines spending efficiency. Thus, when it comes to national advertising and sales, for example, a company that has a 40 percent share of the market is potentially four or more times more efficient than a company with a 10 percent share. These are examples of fixed costs; that is, a company incurs them regardless of how high or low its sales are. Once a company has made the decision to target a particular market, it has to pay the piper no matter how great or small revenues promise to be. As we have observed, those industries in which such fixed costs tend to dominate are more likely to exhibit a pronounced Rule of Three structure.
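The arithmetic behind the share-efficiency claim can be sketched in a few lines of Python. This is a toy illustration with hypothetical numbers, not figures from the book: a national advertising campaign costs the same regardless of a firm's share, so the cost borne per unit sold falls in direct proportion to market share.

```python
# Toy model of fixed-cost leverage (all numbers hypothetical).
# A national ad campaign is a fixed cost: it is incurred in full
# no matter how many units the firm actually sells.
fixed_ad_budget = 40_000_000   # annual national campaign, dollars
market_units = 10_000_000      # total units sold industry-wide


def ad_cost_per_unit(share: float) -> float:
    """Fixed cost spread over a firm's unit sales (sales ~ market share)."""
    return fixed_ad_budget / (market_units * share)


big = ad_cost_per_unit(0.40)    # firm with a 40 percent share
small = ad_cost_per_unit(0.10)  # firm with a 10 percent share
print(small / big)              # the small firm pays 4x more per unit sold
```

With these assumed figures, the 40-percent-share firm carries $10 of advertising per unit against $40 for the 10-percent-share firm -- the fourfold efficiency gap the text describes.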
If the costs to participate are high, the "minimum efficient scale" needed to attain efficiency in operations is also high. As a result, the shakeout in the industry happens sooner rather than later. In contrast, markets in the so-called agricultural age were characterized by near perfect competition: many small producers and buyers interacted in the marketplace, where prices were set according to the relative balance between supply and demand. The agricultural sector has predominantly variable costs: the costs of seed, fertilizer, and labor can fluctuate depending on conditions, but are always linked to the volume of production. About the only fixed cost is the cost of land, which in many countries is typically inherited. During the agricultural age, shakeouts were kept to a minimum. As farms have become commercialized, however, economies of scale have developed, driving the exit of family-owned farms.
Cost structure also makes its impact felt through the supply function. If the supplier industries enjoy significant economies of scale because of their cost structure, downstream industries also feel the pressure to consolidate, even though their own cost structure may not require or adequately support such a move. The two major suppliers to the personal computer industry -- Microsoft and Intel -- are dominant in their respective spaces, for example. Despite the lower entry and exit barriers associated with PC assembly, this dominance still creates pressures for concentration downstream.
Likewise, a high concentration of customers puts additional pressure on the industry to consolidate. In the industry comprising defense contractors, where the U.S. Department of Defense is by far the overwhelming customer, the number of defense contractors has fallen steeply in recent years. Also, a substitute industry that has a higher fixed-cost component will enjoy a price advantage. This too creates pressures on an industry to consolidate and become more fixed-cost intensive.
Although many people assume that fixed costs are bad for business, this is not necessarily the case. As the primary source of scale economies, fixed costs are an essential element in the competitive strategy for volume-driven players such as full-line generalists.
A Shared Infrastructure
In addition to fixed and variable costs that individual players in a market must consider, the market as a whole can move toward greater organization by developing a shared infrastructure for the purpose of increasing efficiency. Infrastructure costs are generally too high to be loaded on the transactions generated by any one company. Banks, for example, would be unable to survive if they did not share an infrastructure for check clearing, as well as for credit card authorization (through the Visa and MasterCard systems). Similarly, airlines require shared infrastructures for reservations, air traffic control, baggage handling, and ground services. Fundamentally such an infrastructure distributes the heavy cost of implementation, thereby making the system more affordable for all the players, large and small, in the industry.
To be useful in enhancing efficiency, an industry infrastructure must be:
By far the most significant recent example of a shared infrastructure is the Internet. Regarded as the most significant invention of our time, the Internet has become a major new infrastructure for virtually all businesses of any size, whether new or old. From an obscure tool used by researchers and academics at government-funded laboratories and universities, the Internet has exploded into the world of commerce. The starting point was a simple but brilliant innovation: the World Wide Web (see sidebar "Berners-Lee and the World Wide Web"). By general agreement it is comparable in its impact to the invention of movable type by Johann Gutenberg almost 600 years ago.
Berners-Lee and the World Wide Web
Tim Berners-Lee worked as a computer scientist at CERN, the international particle physics lab in Switzerland. It was his innovative idea that became the basis for the World Wide Web. In 1973, Vint Cerf and Bob Kahn had devised the Internet's underlying protocol suite, the Transmission Control Protocol/Internet Protocol (TCP/IP), which has been described as "one of the great technological breakthroughs of the twentieth century."
Berners-Lee came up with two simple innovations that enable people to navigate between previously unrelated sources or Web sites. Building on the Internet technology, he created a global hypertext system by inserting links from one text to another. He named one of his innovations the Hypertext Transfer Protocol, now better known to Web surfers in its abbreviated form, HTTP. In addition, he devised a way of identifying a document using the Uniform Resource Locator, or URL. Today these are common terms used in Internet traffic, although the public may not know their full names and functions.
Created in 1989, the Web is arguably an essential element of the infrastructure, not just for business and commerce, but also for governments, personal communications, community formation, and entertainment. As with the ideal infrastructure, it is not controlled by any one commercial entity, but evolves through the collective efforts of many. Forums of engineers, such as the World Wide Web Consortium, ensure that it functions well and evolves as needed. No company can unilaterally dictate that new features be added; nevertheless, standards are set faster than ever and are completely open. Because of this openness and malleability, the Internet has led to innovations at an incredible pace. MP3 is today's standard for compressing music files. The Java programming language has a place in practically all Web sites. Numerous other examples -- digital subscriber lines (DSL), broadband, electronic mail, teleconferencing, and the like -- indicate how fast this industry is moving in supplying products and services to individuals the world over.
Igniting one of the greatest explosions of wealth in history, the Web has also transformed the business community. The transformation has been both internal and external. Intranets, for example, have streamlined internal operating processes. Through extranets companies have developed closer linkages with their suppliers, alliance partners, and customers. The Web has fueled the growth of categories of commerce such as person-to-business and person-to-person.
Government Intervention
So far, most governments have resisted the temptation to try to control the Internet or regulate its functions. At the urging of their constituents, government officials have preferred to adopt a hands-off approach. Nevertheless, the government can and often does play an important role in determining an industry's structure, including triggering major consolidation. Often the government itself is a major customer -- the Department of Defense exemplifies a customer with deep pockets. The significance of the government's role as a buyer is even more pronounced in Europe than it is in the United States.
A major funder of research and development as well as a major buyer, the federal government has a significant impact on the pace and direction of technological change in many industries. In some cases, the government also facilitates cooperation within an industry, especially at the "pre-competitive" stage. Japan's Ministry of International Trade and Industry (MITI) has been the most prominent example of this kind of facilitator, although governments in Europe and the United States have participated in similar cooperative efforts.
For other industries -- for example, education, health care services, and computers -- the government helps to move the industry toward standardized products and processes. The government may intervene, for instance, if it sees that an important market is failing to achieve efficiency on its own. When too many companies were laying cable in the telephone and communications industry, each hoping to gain monopoly power by establishing itself as the leader with proprietary products, the U.S. government intervened by creating standards or sanctioning "natural monopolies" to generate efficiency. A similar intervention in the U.S. railroad industry established a much-needed standard for operations and had immediate effects on the players' profitability.
The Railroad Industry
In the middle of the nineteenth century, the railroad industry took off in the United States. Long before anyone had an inkling of the automobile industry, people saw a "natural" fit between the railroads and the physical size of the country with its vast stretches of undeveloped land. The railroads, however, developed haphazardly, primarily because the industry was so fragmented with many small, inefficient players and because there were no uniform standards. The most telling omission was that the U.S. railroad industry lacked a uniform gauge (the distance between the tracks). Goods had to be transferred between railroad carriers at points where rail lines of different gauges intersected -- a highly expensive and inefficient procedure.
The U.S. government, understandably, was concerned with the speedy construction of the railroad system. In the 1850s, federal, state, and local governments stimulated the growth of the industry, granting charters (or in some cases actually building the lines), as well as providing money and credit for many private railroads. The federal government conducted surveys at taxpayer expense and reduced the tariff on iron used by the railroads. Before 1860, the government provided almost 25 million acres of land for railroad construction, with two main stipulations: (1) the railroads would transport government property and Union troops for free, and (2) Congress would set rates for mail traffic. The federal land grant program expanded rapidly after the Civil War ended in 1865.
Battles and explosions during the war significantly damaged the railway system, destroying miles of track and rendering equipment unusable. After the war ended, government officials and industry executives wisely undertook a rehabilitation program that at last specified a standard gauge of 4 feet 8½ inches for all tracks. By 1880, 80 percent of the mileage had been converted to this standard. By 1890, virtually the entire network was brought into compliance with the new standard, thereby ensuring that the railroad industry became more efficient and extended its reach to more remote regions. Now that everybody was running on the same track, the railroad companies themselves became much more serious targets for mergers and acquisitions. Accordingly, the industry rapidly became more concentrated.
The railroads increased their hold on power, such that demands for the regulation of the industry grew loud and urgent. In 1887, the federal government passed the Interstate Commerce Act, creating the Interstate Commerce Commission (ICC), which became a major force in the development of a federal regulatory policy.
The rail industry peaked in 1920; after that date, other modes of transportation -- particularly the automobile and the airplane -- reduced the importance of the rail system. In the 1920s, severe competition from outside the industry caused many passenger railroads to cease operations. Passenger rail would be reborn some 40 years later as Amtrak, and only then because of massive tax subsidization.
Industry Consolidation
Over the last several years, we have witnessed a record number of mergers, as well as numerous demergers (the spinning out of noncore businesses). As a result, the landscape of just about every major industry has changed in a significant way. The pace of this consolidation is startling: the number of mergers per year in the United States has more than tripled over the past decade, while the value of those mergers has risen tenfold. Between 1997 and the end of 2000, nearly $5 trillion in mergers took place in the United States alone. The most recent large mergers and acquisitions have occurred in the telecommunications, banking, entertainment, and food industries, as indicated in the accompanying tables.
While the United States has been at the forefront of this trend, M&A activity has been feverish on the global level as well. At the time, few experts believed that 1998's record of $2.52 trillion in global M&A activity would soon be broken; however, total worldwide transactions announced in 1999 reached $3.43 trillion, exceeding the previous record by an astounding 36 percent. In 2000, the total reached $3.5 trillion, growing only slightly over 1999 activity. The uncertain market environment in late 2000 and early 2001 has dampened merger activity worldwide; however, we expect that it will rebound as markets recover. Appendix 1 presents an encapsulated history of merger activity in the United States during the twentieth century.
Europe has been a particularly fertile area for some of these recent megadeals, particularly in telecommunications, utilities, banking, and the retail sector. M&A activity in Europe more than doubled in 1999, totaling $1.2 trillion. This total includes United Kingdom-based Vodafone Airtouch's $203 billion offer for Germany's Mannesmann AG, the largest deal ever. France's two largest retailers and hypermarkets, Carrefour SA and Promodes, merged to form a $52 billion giant, now the world's largest retailer after Wal-Mart. The globalization of retailing, long believed to be an industry unlikely to globalize, appears to be well underway; Carrefour and Promodes are already prominent across Europe as well as in Latin America. Likewise, Arkansas-based Wal-Mart has been expanding south into Latin America as well as east into Europe.
Even Japan, a nation for years thought to be an uncongenial place for mergers, is experiencing a much-accelerated pace of M&A activity. Because Japanese markets and culture did not generally support mergers, most of the country's industries remained highly fragmented. In the past, Japan's extremely low cost of capital and its cozy keiretsu relationships have contributed to keeping an excessive number of full-line generalists afloat. A proliferation of major players is evident in most industries: for example, seven major camera makers (Canon, Nikon, Asahi Pentax, Minolta, Yashica, Fuji, and Konica); seven big car companies (Toyota, Nissan, Honda, Mazda, Mitsubishi, Subaru, and Isuzu); and several consumer electronics companies (Sony, Matsushita, Hitachi, Mitsubishi, and Toshiba). Gradually, however, merger activity has been on the increase. In 1999, M&A volume in Japan tripled over 1998 levels, though still amounting to only $78 billion. Mergers in Japan are starting to focus on industry consolidation and the "unbundling" of conglomerates.
As more industries globalize, a larger percentage of mergers involve firms from different countries. Such cross-border M&A activity has risen fivefold over the past decade. In terms of total value, cross-border mergers reached $720 billion in 1999. As a share of world GDP, they increased from 0.5 percent in 1987 to 2 percent in 1999. Industries that previously could not expand in such a manner for operational reasons are now able to do so. Retailers, for example, can use new technologies to manage cross-border supply chains and centralized purchasing for multiple countries.
Recently NationsBank completed its merger with BankAmerica in a $60 billion stock deal. SBC Communications acquired Ameritech for $62 billion in stock. British and Swedish drug groups Zeneca Group plc and Astra AB announced plans to join forces in what was until then Europe's largest merger, following Hoechst and Rhone Poulenc's merger of their life science units to form Aventis, and an all-French merger between Sanofi and Synthelabo. Ciba Specialty Chemicals and Clariant, two of the largest players in the rapidly growing specialty chemical industry, are merging. Exxon and Mobil combined to form the world's largest oil company, fast on the heels of the merger between BP and Amoco (Royal Dutch/Shell rounds out the major players in that industry). The European banking sector, following economic and monetary union, is rapidly consolidating across national boundaries. French banks Société Générale and Paribas have announced plans to combine to form Europe's second biggest bank, behind the Deutsche Bank/Bankers Trust merger of 1999 and ahead of Switzerland's UBS AG.
Clearly we are witnessing a reorganization of the patterns of corporate ownership, as well as of the choices involved in business participation -- namely, which businesses a company should enter and which it should exit. The current wave of mergers and de-mergers represents a historic rationalization of "who does what and for whom." In general, the result is improved market efficiency, lower prices for customers, and higher returns for investors.
Industries tend to become more efficient as they undergo consolidation. In a highly fragmented market, especially one in which growth has begun to slow, numerous small, inefficient players recognize that it is to their advantage to join together or combine with larger companies that can command greater economies of scale and scope. The drive for efficiency transforms an unorganized market with myriad players into an organized one in which the number of players rapidly drops. By acquiring small companies (as General Motors did in the automobile industry) or by creating a de facto standard (as Ford did in the assembly-line process of building the Model T), one player makes the turn and becomes a broad-based supplier. From this point in the market's evolution, the Rule of Three comes into play. In most cases, two additional players are also able to evolve into full-line generalists.
The Software Industry
The personal computer software industry started up in the early 1980s. At the outset, hundreds of small, mostly anonymous firms vied for position. Essentially a cottage industry, software was primarily a technology business, and scale was not much of a factor. In a fateful decision, IBM selected Microsoft to provide the DOS operating system for its personal computers, thereby giving Gates's company the enormous advantage of owning the dominant standard. Over the next several years, three other companies emerged as significant players, each as a product specialist: Lotus, which had acquired spreadsheet technology from VisiCalc; WordPerfect, which fast became synonymous with word processing; and Novell, which staked out an early position in the networking arena.
Microsoft gradually leveraged its extraordinary advantage in operating system software to establish a commanding position in applications. Although it was initially unable to challenge Novell in networking with LAN Manager, it developed competitive products in word processing (Word) and spreadsheets (Excel). Microsoft was the first to sell software applications in bundled form, inventing the concept of a "suite" of applications that shared some features and allowed information to be readily transferred and accessed across them.
Microsoft thus became the first full-line generalist in the market, setting in motion an inevitable restructuring of the entire industry. Lotus, for example, soon realized that if it wanted to continue to grow, it had to reduce its overwhelming dependence on a single product (Lotus 1-2-3) and broaden its product line. By acquiring the word processor Ami Pro and developing the presentation graphic package Freelance, Lotus became the industry's second full-line generalist. WordPerfect was even more dependent on its namesake word processor than Lotus had been on its spreadsheet; it tried, but failed, to develop a viable full line of products on its own, including PlanPerfect and WordPerfect Presentations. Finally, it was forced to merge with Novell. Even so, the duo had to acquire Borland's Quattro Pro spreadsheet to complete their package of offerings.
Over time, the Big 3 added database, electronic mail, and many other categories of software to their lines. Although the market still included hundreds of specialists, they essentially ceded the large applications -- word processing, spreadsheets, presentation graphics, databases, networking, and electronic mail -- to the Big 3. Gradually, however, Microsoft's dominance in operating systems, superior marketing, and overwhelming financial advantage increased its dominance in the office suite domain to well over 90 percent of the market. It thus left its two main competitors with a share of less than 10 percent of the market to divide between them, in effect forcing both of them into the ditch.
To be sure, the poor execution of its competitors helped Microsoft achieve this high level of success. WordPerfect, for example, could have leveraged a major asset -- its enormous number of devoted customers -- to expand its product offerings in the word processing market. Instead, the company made a classic mistake: it failed to develop a version of its program for the Windows operating system until two years after Microsoft had delivered Word for Windows. By then, it was so far behind that it could never catch up. When WordPerfect later created a suite of its own by coupling its word processor with Borland's spreadsheet, the applications lacked common controls and made little headway against Microsoft's smoothly integrated products.
The U.S. Airline Industry
After World War I, several European aviation companies hired wartime pilots to fly decommissioned warplanes along the first commercial air routes. Aided by heavy subsidies from European governments, the forerunners of well-known commercial airlines such as British Airways, Air France, and KLM began operations during the 1920s.
In the United States, airlines emerged primarily as a result of the U.S. Post Office's attempts in 1919 to establish a nationwide airmail service. In fact, the Post Office played a leading role in setting up the system of airports across the nation. In 1925, Congress passed the Air Mail (Kelly) Act, authorizing the postmaster general to use private contractors to provide airmail service. The creation of a number of private air transport companies was not far behind, some of which began carrying human beings as well as the mail.
In response to this increased activity, Congress passed the Air Commerce Act of 1926 and instructed the secretary of commerce to "foster air commerce, designate and establish airways, operate and maintain aids to air navigation, license pilots and aircraft, and investigate accidents." As a whole, however, the American public was too enamored of the automobile and the Roaring Twenties to take much interest in flying. Then in 1927 Charles Lindbergh captured headlines with his solo transatlantic flight to Paris. Suddenly air travel became the rage, and new companies seemingly sprang up overnight: Pan Am and TWA were both founded in 1928; Delta followed in 1929; American Airlines was formed in 1930 out of the combination of many small mail carriers; and United Airlines was created in 1931 through a merger of several older mail-carrying operations.
Boeing and Lockheed introduced the first planes specifically designed for passenger service. Douglas Aircraft then dominated the skies with its DC-3s, DC-4s, and DC-6s, but in 1957 Boeing beat Douglas into the jet age with the 707, the first American commercial jetliner. For a time the launch of larger aircraft lowered the cost of air travel. The number of passengers grew from merely a few thousand in 1930 to about 2 million in 1939. By the end of the 1940s, the number of air passengers topped 16.7 million.
The Civil Aeronautics Board (CAB) regulated this new industry, with authority to establish routes, fares, and safety standards. In addition, the CAB heard complaints from the traveling public and settled disputes with the airlines. Dissolved in 1984 as part of the government-directed deregulation of the airline industry, the CAB in effect ceded its responsibilities to the Federal Aviation Administration (FAA), created by the Federal Aviation Act of 1958 and entrusted with overseeing the air traffic control system, certifying pilots, and establishing standard safety precautions for the industry.
Deregulation allowed the airline companies to set their own routes and, after 1982, their own fares. When the competitive forces were at last unleashed, the industry experienced rapid change: fare wars, new incentive plans to placate employees, and innovative promotions to attract customers. The deregulated industry spawned many new airlines, the number of carriers rising from 36 in 1978 to 96 in 1983, most of them serving rather localized geographical niches. Between 1980 and 1983, as companies competed on low prices and waged fare wars even as new competitors were entering the market, the industry suffered losses of $1.2 billion.
American Airlines introduced its AAdvantage frequent-flier program in 1981. Lower fares and heightened competitive activity in the 1980s led to rapid industry growth in terms of customers served: an increase from 297 million passengers in 1980 to over 455 million in 1988. A decade later, that number rose to a record 551 million passengers.
The financial problems that many airlines faced led to increased labor strife, bankruptcies, and, for some carriers, the prospect of being acquired. Delta bought Northeast; Pan American took over National; TWA acquired Ozark Airlines in 1986; Northwest gobbled up Republic; US Airways pocketed Pacific Southwest. Texas Air/Continental acquired People Express and Eastern Airlines, the latter of which shut down entirely in January 1991 after operating for two years under Chapter 11 bankruptcy provisions. In 1987, Delta bought Western Airlines. In 1989, US Airways acquired Piedmont. Continental, America West, and Pan American entered Chapter 11 in 1990 and 1991, but only the first two emerged to resume full operations. For some time TWA managed to keep body and soul together, but it was eventually acquired by American Airlines in March 2001.
Consolidation of the airline industry continues both in the United States and in Europe. The current Big 6 in the United States appear close to becoming the Big 3, dividing up nearly 85 percent of the domestic market. This consolidation will happen primarily through mergers of several ditch airlines with one of the current market leaders -- United, American, and Delta. The first salvos have already been fired: in addition to American's acquisition of TWA, UAL Corporation, the parent of United Airlines, announced in May 2000 its intention to buy US Airways Group. Meanwhile, Delta and Continental are in discussions to merge. With the advent of truly "open skies," the global consolidation of this industry is not far off.
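The concentration figures cited throughout this chapter -- a Big 3 dividing up nearly 85 percent of a market, for instance -- are k-firm concentration ratios: the combined share of the k largest players. A minimal sketch of the calculation in Python; the market shares below are hypothetical, chosen only to illustrate the arithmetic, not actual industry data:

```python
# k-firm concentration ratio: combined market share of the k largest players.
# The six shares below are hypothetical, not figures from the text.

def concentration_ratio(shares, k=3):
    """Sum of the k largest market shares (shares given as fractions)."""
    return sum(sorted(shares, reverse=True)[:k])

shares = [0.32, 0.28, 0.25, 0.07, 0.05, 0.03]  # hypothetical six-firm industry
print(f"CR3 = {concentration_ratio(shares):.0%}")  # CR3 = 85%
```

In this hypothetical industry, the three generalists jointly hold 85 percent of the market, leaving the remaining 15 percent to be divided among the specialists.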
The Pharmaceutical Industry
In the $300 billion global pharmaceutical industry, approximately 100 firms struggle for survival. The world leaders in drug discovery, U.S. pharmaceutical companies currently develop about half of all new medicines, accounting for about 40 percent of the market. European giants round out the top ten firms. But major changes now occurring in the industry illustrate the enormous effects of the four mechanisms discussed in this chapter. Consolidation over the past 15 years has whittled the more than two dozen multinationals down to about 15. Companies are exiting non-health-care businesses, increasing spending on research and development, acquiring or partnering with genomic and drug discovery companies, growing their sales forces, and increasing advertising expenditures.
Which firms will be the victors? Which will be driven from the market? Currently, all major pharmaceutical companies are in or near the ditch. The largest, Merck, commands a meager 10.9 percent of the market. Growth in the industry is a direct result of new products (innovation). New drugs, however, do not come without the high risk and price of R&D. An average of twenty cents of every dollar of revenue is reinvested in R&D, but only one out of every 250 drugs that enter preclinical testing makes it through the approval process. The average time-to-market is 12 years, an eternity in any industry. Only a third of approved drugs recover the cost of their research and development. When the cost of failures is amortized over the few successes, the estimated cost of bringing a new drug to market comes to $500 million. Despite such obstacles, the demand for new drugs keeps rising.
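The arithmetic behind that $500 million estimate can be sketched briefly. The 1-in-250 preclinical success rate is the figure cited above; the average spend per candidate is a hypothetical assumption, chosen only to show how amortizing the many failures over the rare successes inflates the cost of each approved drug:

```python
# Amortizing the cost of failed drug candidates over the rare successes.
# The 1-in-250 success rate comes from the text; the $2 million average
# spend per candidate is a purely illustrative assumption.

def amortized_cost_per_approval(cost_per_candidate, success_rate):
    """Expected total R&D spend per approved drug."""
    candidates_per_approval = 1 / success_rate  # expected candidates tried
    return cost_per_candidate * candidates_per_approval

cost = amortized_cost_per_approval(2_000_000, 1 / 250)
print(f"Amortized cost per approved drug: ${cost / 1e6:.0f} million")  # $500 million
```

The point is structural, not the particular numbers: when only one candidate in 250 succeeds, every approved drug must carry the R&D bill of the other 249.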
Three key factors in today's marketplace are creating demand. First, customers -- particularly those over age 65, a group that consumes three times as many drugs as those under 65 -- eagerly await new product releases. It is no surprise that in the past decade over 150 new medications have targeted diseases of the elderly, and currently there are more than 600 drugs in R&D aimed at seniors. Yet according to the World Health Organization, three-quarters of the 2,500 currently recognized medical conditions lack adequate therapies. With the rapid increase in the world's senior population, the demand for pharmaceutical products for society's aging will continue to rise at staggering rates.
Second, enrollment in plans such as health maintenance organizations (HMOs) and preferred provider organizations (PPOs) has swelled in the past twenty years. Managed care, which in the 1980s had approximately a 30 percent share of the pharmaceutical market, now covers 83 percent of private-sector employees, and its share of the market will soon reach an estimated 90 percent. It is now widely accepted that effective self-care is much more cost-efficient than treatments requiring hospitalization or surgery. Given that leading-edge, branded drugs are fundamental to effective self-care, it is understandable that pharmaceutical companies are eager to respond to the increase in demand.
Third, since the Food and Drug Administration (FDA) relaxed restrictions on direct-to-consumer advertising over the past three years, advertising for drug products has surged. For instance, in 1998 Schering-Plough spent $200 million advertising the allergy pill Claritin to consumers. In 1999, pharmaceutical companies spent $1.8 billion on advertising to consumers, with $1.1 billion of that going toward TV ads -- a 40 percent increase over 1998 ad budgets. The result of these campaigns is increased diagnosis and treatment (with drugs) of many unreported diseases and ailments. In fact, heavily advertised products enjoy an average increase in sales of 43 percent, compared to 13 percent for products not heavily advertised.
The Rule of Three identifies four key processes by which growing markets become efficient: creation of standards, shared infrastructure, government intervention, and consolidation. In the pharmaceutical industry, all four of these processes are in play. In the United States, the FDA sets stringent standards for the safety and efficacy of drug products; other countries and unions of countries have similar organizations. Shared infrastructure is provided by government and privately funded research organizations around the world; the National Institutes of Health (NIH), for instance, furnishes basic scientific research to industry in the United States. Government intervention protects discoveries, unique processes, and intellectual property through patent laws: by granting a short-term monopoly on a product, patent protection allows the innovative firm to recoup exorbitant R&D expenses.
Finally, the industry is consolidating as the Rule of Three predicts. In the past decade, there have been 27 consolidations of significant pharmaceutical companies and numerous consolidations of smaller firms. Acquisitions and alliances between big pharmaceutical and biotechnology companies have also taken place.
A leading cause of this industry consolidation is shareholder demand for high return in exchange for high risk. In evaluating a possible merger, firms look for synergies such as those that brought Pharmacia and Upjohn together in 1995. Pharmacia had many drugs in its pipeline but was weak in U.S. marketing, whereas Upjohn was just the opposite. The merger of the two companies produced a single entity with a pipeline full of products, a strong U.S. marketing presence, and $1 billion available for R&D.
Many pharmaceutical companies look to acquire competitors that have core competencies different from their own. Technologies such as drug delivery, drug discovery, and genomics characterize biotechnology companies but are lacking in most big pharmaceutical companies. By allowing for less expensive and more rapid development of novel therapies, these technologies complement the pharmaceutical industry's core competencies. The top 20 pharmaceutical companies combined have alliances with over 1,000 biotechnology companies.
Competitive pricing is another driver of industry consolidation. In most European countries and Japan, governments have strict pricing controls, profit controls, and prescribing controls. Such policies greatly reduce revenue and profitability for pharmaceutical companies. The United States is the only major market where pharmaceuticals are not yet restricted by government policies, but there are other agents of price controls at work. Contract purchasing by HMOs and prescription benefits managers (PBMs) has brought competition based on pricing to a new high. Replacing physicians as the gatekeepers for prescription drug allocation, powerful buying groups now demand lower prices and greater use of generic drugs. They generate preferred drug lists, or formularies, to which patients' benefits are directly linked, and they dictate which drugs can fill clients' prescriptions. The drug companies must acquiesce if they want their products to be included on the preferred drug list, even if they are sold at reduced prices.
Reduced prices, of course, reduce earnings. In response, industry consolidation can provide broader product lines and economies of scale, thus empowering the drug manufacturers in negotiations with buying groups and government agencies.
Price wars are fought on three fronts: between brand name drugs and generics, between brand name drugs in the same therapeutic category, and between comparable generics. In the United States, changes in regulatory policies have increased competition in all three areas. Each of these price wars contributes to further industry consolidation. Market share for generic drugs rose from 18 percent in 1984 to 47 percent in 1999 -- a rise attributed to both the purchasing power of managed care and the 1984 Hatch-Waxman Act, which abbreviates the FDA approval process for generic drugs and allows manufacturers of generic drugs to conduct their testing prior to the expiration of the brand name drug's patent. This provision has reduced barriers to market entry by lowering the cost of clinical testing and accelerating the time-to-market from the previous industry standard of three years to three months. Although patent protection is initially issued for 20 years on new brand name drugs, most new drugs are patented early in the development and approval stages. Thus, when a new drug finally enters the market, only 11 years of patent protection, on average, remain. Once a generic drug is available, sales of the brand name drug drop typically by 60 percent.
Competition between similar brand name drugs has become much more fierce because rival pharmaceutical companies have adopted fast-follower strategies. With the recent advances in information technology and drug discovery technology, the period between the introduction of a breakthrough drug and the fast follower brand name drug can be less than a year. For example, Celebrex, an arthritis medication from Pfizer, was approved on December 31, 1998. It had less than five months of true market exclusivity before Vioxx, a similar arthritis drug from Merck, entered the market on May 21, 1999. This fast competition plays a significant role in keeping prices and earnings down.
In addition, generic drugs compete with each other on price. As numerous generic products become available for a particular drug, prices are driven down. Although there are a host of independent generic drug companies, it is important to note that many of the largest pharmaceutical companies own generic subsidiaries or divisions. For example, Novartis owns Geneva Pharmaceuticals.
Other pharmaceutical companies find success in the marketplace by carving out their own niches. Certain specialties, such as cancer therapies, will most likely provide a major niche in the industry. Because cancer drugs do not require as much sales and marketing effort as other classes of drugs, companies currently specializing in these treatments or other niche market segments are more likely to remain independent.
While coalescing into a handful of large players, the pharmaceutical industry has also been exiting non-core businesses. Novartis's crop-protection and seed businesses, for example, were spun off and merged with AstraZeneca's agrochemicals business to create Syngenta AG. In early 2000, Abbott Laboratories sold its agricultural products business unit to Sumitomo Chemical Company of Japan. In June 2000, American Home Products completed the sale of Cyanamid, its agricultural business, to BASF AG.
Although it is a patent-based industry, a type typically less susceptible to the Rule of Three, the U.S. pharmaceutical industry is no exception to the evolution of competitive markets we see occurring in other industries. Patent protection does not provide an impenetrable shield against competition; growth and efficiency factors will lead to further consolidation. From the current leaders in this industry, a Big 3 will eventually emerge, but not until a big shakeout reduces the number of legitimate players.
The Dynamics of Industry Shakeouts
As we noted above in discussing the agricultural age, industries with "perfect" competition typically do not have to go through a major shakeout. Similarly, personal care and consumer service industries such as beauty shops, plumbing and heating companies, and repair shops have not experienced significant shakeouts. These industries are characterized by highly individualized attention and a high degree of manual labor, and their operations are not scalable. Where consolidation is a major driver of industry organization, however, shakeouts can have devastating as well as beneficial effects. It is important, therefore, to look at their causes and effects in more detail.
In recent years, few industries have escaped the destruction and turmoil resulting from shakeouts. Over the past two decades, victims of shakeouts (or their beneficiaries, depending on one's point of view) include airlines, automotive component producers, banks, biotechnology companies, boat builders, cable TV operators, construction contractors, defense contractors, department stores, health maintenance organizations (HMOs), hotels, minicomputer companies, newspapers, shopping malls, savings and loan companies, steel factories, trucking companies, makers of wine coolers, and wood-stove makers.
An industry is considered to have experienced a shakeout if 25 percent or more of its companies have disappeared within a short period. Like an earthquake, a shakeout brings major upheaval to an industry, changing its complexion, the mix of competitors, and the rules of competitive play. Also like earthquakes, shakeouts vary in their duration, intensity, causes, and effects.
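The working definition above reduces to a simple rule, sketched here in Python; the firm counts in the example are hypothetical, used only to show the threshold in action:

```python
# The chapter's working definition: a shakeout has occurred when 25 percent
# or more of an industry's firms disappear within a short period.

SHAKEOUT_THRESHOLD = 0.25

def is_shakeout(firms_before, firms_after):
    """True if the fraction of firms that exited meets the 25% threshold."""
    exited_fraction = (firms_before - firms_after) / firms_before
    return exited_fraction >= SHAKEOUT_THRESHOLD

# Hypothetical counts: an industry shrinking from 96 firms to 60
print(is_shakeout(96, 60))   # True  (37.5% of firms exited)
print(is_shakeout(100, 80))  # False (only 20% exited)
```

The threshold is of course a stylized cutoff; what matters for the argument is the discontinuity it marks between ordinary attrition and a structural upheaval.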
When a shakeout occurs early in an industry's life cycle, as happened in the PC industry, it can indicate either the emergence of a dominant technological design or the existence of a "majority fallacy." In the first scenario, a major technological design is widely accepted by customers as the standard, but some companies do not or cannot adapt their manufacturing or marketing operations to match that design. Their inflexibility causes those companies to fail to meet customer expectations, leading ultimately to their exiting the industry. The shakeout rids the industry of weak competitors, giving survivors a bigger share of the growing industry.
In the second scenario, large numbers of individual entrepreneurs and established companies enter an industry, attracted primarily by its promise of quick growth and easy profits. They constitute a majority, persuaded that the industry has made its transition from startup to fast growth. Well financed and well managed, these new entrants proceed either to acquire the truly pioneering companies or to replace them. Because even a growing and prosperous industry cannot always fulfill the expectations of all entrants, many fail to achieve market success. As a result, a shakeout occurs, ridding the industry of weak new entrants and inflexible pioneers. The IBM-compatible personal computer market, which went through such a shakeout in the mid-1980s, serves as an example of this fallacy.
A shakeout that occurs later in an industry's life cycle manifests a different combination of powerful forces. It often signals the industry's movement to the maturity stage, where demand plateaus, as happened in the tire industry, for example. This shift forces some companies to exit the industry because they fail to achieve appropriate profits. Moreover, as industry maturity approaches, product substitutes multiply and cause further declines in demand for an individual company's products. Further, mature industries often invite entry by foreign companies, forcing existing companies to scale down their expectations, exit the industry, or attempt to become globalized themselves.
Shakeouts in a mature industry can also be caused by "de-maturing." A technologically adept competitor can open a new competitive front by deploying technologies from outside the base industry. Typically, firms use some form of information technology to transform their business, by infusing intelligence and other attributes into their products or dramatically altering their production and operating processes. For example, in the late 1980s, Yamaha revived a moribund piano industry by developing a digital piano that could play itself (using instructions stored on a floppy disk), teach a novice how to play, or serve as a traditional piano. As a result, Yamaha altered the needed core competencies in the industry, and many competitors that lacked the requisite technological capabilities were forced to exit.
Industries may experience more than one shakeout. Because of the vast number of producers, the U.S. automobile industry experienced its first shakeout in 1920 and 1921. Having barely recovered, the industry experienced a second massive shakeout in the early 1930s, triggered by the Great Depression. A third shakeout occurred in the 1940s as a result of the exorbitant costs of competing in a growing national market. As we will see in chapter 2, this shakeout led to a consolidation in which three firms came to dominate the industry's sales.
The auto industry is not alone in experiencing multiple shakeouts. The financial services industry has undergone similar changes: deregulation early in the 1980s led to a massive shakeout, and near the end of that decade global competitive forces and technological advances led to another.
Clearly, with rapid global and technological changes, executives can no longer accept the folk wisdom that shakeouts result just from industry maturity. Industries may experience multiple shakeouts at different points in their evolution. These shakeouts often require different strategies in order to ensure company survival.
On the one hand, forces of technology and globalization most often cause sudden changes in industry structure. These forces can be linked, as when technology diffusion across countries causes seismic changes in an industry. For example, the emergence of global players such as Airbus in the aerospace industry led to industry realignment and the exit of marginal players, first Lockheed and then McDonnell Douglas, from the commercial aviation business.
On the other hand, market-driven shakeouts (through mergers and acquisitions) and those induced through gradual regulatory relaxation tend to have an evolutionary impact. In the case of a historically heavily regulated industry such as telecommunications, a complex network of forces is coming into play. All four of the drivers of change discussed in this chapter are having significant effects. Changes in regulatory policies in numerous countries are rapidly creating a highly globalized industry. Technological changes are coming at a rapid pace, driven by the convergent power of digital electronics. Customers' needs are escalating. The combination of these forces is leading to a shakeout in the industry on a global basis.
Early Warning Signs for Shakeouts
Forward-looking firms can anticipate an impending shakeout in their industry by observing the leading indicators of major change. Some of these indicators are industry specific. In the personal computer software business, for example, the sales of software development kits for a new operating system provide a strong signal of coming shifts in the industry. For the majority of industries, we can identify a number of "generic" indicators of major change and possibly an industry shakeout:
It does not take a prophet to recognize the signs of coming change, but the signals are many and varied, and they can be misinterpreted. Companies, like individuals, often see and hear what they want to see and hear. They interpret the world in self-serving terms, and they view threats to the existing order with great alarm. Rather than investing real and psychological capital in the status quo, they would be far better served by adopting a "crisis imminent" mind-set, one that prepares them for an industry shakeout at any point. We take up the primary causes of market disruptions later (chapter 8), but first we turn our attention in chapter 2 to a deeper analysis of the triumvirates that dominate, or are in the process of forming in, major industries throughout the free markets of the world economy. What is so special about the notion of three major players in a competitive market? Why are there sometimes more or fewer than three in a given industry? And how does their dominance affect the typically smaller niche players that somehow find the means not just of surviving in a highly competitive market, but of doing quite well?
Copyright © 2002 by Jagdish Sheth and Rajendra Sisodia
In 1966, the U.S. Supreme Court refused to allow two supermarkets in Los Angeles to merge. The Vons Grocery Company and Shopping Bag Food Stores, had they been allowed to combine, would have controlled a whopping 7.5 percent of the market. Over 3,800 single-store grocers would still have been doing business in the city. In spite of these statistics, the Court ruled against the merger, citing "the threatening trend toward concentration."
Much has changed in the public's perception of merger activity in the nearly four decades since the Supreme Court's ruling in the Los Angeles supermarket case. Over time, the view that market efficiencies matter and that consumer welfare is actually enhanced by a measure of industry concentration has slowly gained acceptance, although there still are loud complaints from consumer groups that this or that merger will result in higher prices. In truth, markets remain highly competitive even after such concentration, and industries that have experienced consolidation have seen prices remain stable or actually fall. To be sure, profits are generally higher in concentrated industries, but the prices consumers pay may actually decline. This evidence suggests that efficiency gains are a prime driver of greater profitability and market evolution.
For that evolution to be sustainable, markets need both growth and efficiency. Growth comes primarily from understanding and shaping customer demand, whereas efficiency is a function of operations. Through the cyclical pursuit of these objectives, markets become organized and reorganized over time.
Once its basic viability has been established, a start-up industry enjoys high growth but has low efficiency. No matter what criterion is used to measure efficiency -- revenue per customer, revenue relative to assets deployed, revenue per employee, for instance -- the start-up costs are high. The first shakeout occurs during the industry's initial growth phase to make it more efficient without sacrificing growth. Subsequent attempts to make the industry more efficient come from four key sources or events: the creation of standards, the development of an industry-wide cost structure as well as a shared infrastructure, government intervention, and industry consolidation through shakeouts. These four drivers force the industry as well as the players in it to become more and more efficient in order to stay competitive. As we will see in this chapter, they can occur at any time and in any order, sometimes independently, sometimes closely dependent on each other. Their primary effect, however, is to promote efficiency and fair competition within an industry such that no one company becomes a monopoly.
In subsequent shakeouts, the industry is reorganized for growth, typically through market expansion, including globalization. Driven primarily by investor demands, companies at this stage are concerned with growth of all kinds: revenue growth, cash flow growth, earnings growth, growth in the number of customers and revenue per customer, and growth in market capitalization. To continue to attract investment capital and sustain growth, the industry needs to make productive use of all inputs, including capital, labor, and management talent.
The Creation of Standards
Market inefficiency can hasten the creation of de facto standards. Henry Ford paved the way for one such standard when he devised the highly efficient assembly-line manufacturing process for the Model T. Bill Gates was fortunate indeed when Microsoft received the nod from IBM and others to make the MS-DOS operating system the standard for personal computers. Once that standard was set, even Big Blue, known primarily for its hardware, could not wrest away control with its proprietary OS/2 system.
When standards play a major role and remain largely proprietary, there may not be room for three separate platforms. Typically at most two platforms can survive in the broad market: VHS and Beta for video recorders, VHS-C and 8mm for camcorders, PAL and NTSC for television broadcasts, CDMA and GSM for wireless telephony, PC and Mac for personal computing. Eventually, one platform becomes dominant, if not universal. Thus, 8mm has a big lead over VHS-C, PCs have triumphed over the Mac, and VHS has overwhelmed Beta. The other platform, if it survives, is relegated to a niche market.
The simultaneous existence of two or more standards, as in the case of NTSC and PAL, can be attributed in large part to protectionist ideologies and government regulation. Thanks to the electric industry's dual voltage standards, tourists must contend with shifting between 110 volts and 220 volts, not to mention remembering to pack adapters for a variety of prong and socket styles; in Europe alone there are some 20 different types of electrical plugs currently in use. To the delight of those tourists, such essentially meaningless and highly inefficient differences will start to disappear as the electric industry adopts universal standards and the world at large becomes more driven by market economies. The cost of converting to a single new standard, however, is estimated to be $125 billion!
Already we can see the power of a fully adopted worldwide standard in the World Wide Web. The extraordinarily rapid diffusion of this technology across the globe has resulted in large measure because of that single standard. Emerging industries today are highly cognizant of this fact, and organizations that set industry standards now occupy an influential place in the world economy. The impact of evolving standards is illustrated by the stories of the evolution of the VCR industry and the development in Europe of the Group Special Mobile (GSM) network.
The VCR Industry
Based in Redwood City, California, the Ampex Corporation invented video tape recorder (VTR) technology in 1956. It initially sold machines to professional users for $75,000, but it was never successful in creating a product for ordinary consumers. It was, however, successful in licensing its technology to Sony, which turned it into a competitive advantage.
Sony first introduced videocassette recorders (VCRs) to the mass market in 1971, but even its "U-Matic" machines and cassettes were too big and expensive. Accordingly, Sony made modifications and repositioned the machines for industrial users. Next, Sony approached JVC and Matsushita -- two of its biggest competitors -- about establishing a standard (based on a new Sony technology) that would reduce the size of both machines and cassettes. JVC and Matsushita would accept only the U-Matic format, and JVC refused to cooperate or compromise on technology for smaller machines.
In 1971, JVC established a video home system (VHS) project team and charged it with the mission to develop a viable VCR for consumers, not just one that was technologically possible but one that consumers would prefer. Sony, for its part, experimented with ten different ways of building a home VCR and settled by mid-1974 on the Betamax prototype. It set up a new plant to produce 10,000 units a month but designed the machine to record for only one hour, reasoning that customers would use it to record television programs for later viewing. Later, when Sony asked Matsushita and JVC to adopt the Beta format, both refused, citing the one-hour recording limit as a major drawback. JVC's VHS format, then in development, would deliver up to three hours. After the Betamax was launched, Hitachi tried unsuccessfully to license Betamax technology from Sony, which basically had decided to go it alone.
Meanwhile, JVC formed an alliance of companies around the VHS standard before it shipped any products. The group included Matsushita, Hitachi, Mitsubishi, and Sharp. The standards war was on, and not even the intervention of Japan's Ministry of International Trade and Industry (MITI) in 1976 could succeed in resolving the dispute.
After JVC's launch in October 1976, Sony recruited Sanyo and Toshiba to join the Beta group. The split between the two formats continued for another ten years. Sony did well at first, in part because of its wide distribution. In 1976 and 1977, its market share was over 50 percent, but the company lost ground quickly. By late 1978, Matsushita, with 35.8 percent of the market, had overtaken Sony, whose share had slipped to 27.9 percent. By 1988, VHS had close to 95 percent of world sales. In a show of pragmatism, Sony launched its own line of VHS machines and repositioned Betamax as a high-end system for professionals.
Group Special Mobile (GSM) Network
The development of the Group Special Mobile (GSM) network has been an essential element in the success of European wireless companies such as Finland's Nokia and Sweden's Ericsson. Analog cellular telephone systems grew rapidly in Europe in the 1980s, especially in Scandinavia and the United Kingdom. Because each country developed its own sophisticated systems and networks, the industry was characterized by incompatible equipment and operations. Since mobile phones could operate only within national boundaries, the limited market for each company's equipment meant that economies of scale were poor. It was not unusual to see executives toting multiple phones depending on the country in which they happened to be conducting business at the time. The imminent creation of the European Union (EU) made this highly inefficient situation untenable.
In 1982, Nordic Telecom and the Netherlands PTT proposed to the Conference of European Posts and Telecommunications (CEPT) that a new digital cellular standard be developed that would improve efficiency and help the industry cope with the explosion of demand across all of Europe. The CEPT established a body known as Group Special Mobile to develop the system. Members of the European Union were instructed to reserve frequencies in the 900MHz band for GSM to enable easy "roaming" between countries. In 1989, the European Telecommunications Standards Institute (ETSI) offered GSM as an international digital cellular telephony standard.
GSM service commenced in mid-1991. By 1993, there were 36 GSM networks in 22 countries. GSM was successful in gaining acceptance in non-European markets as well, since it was the most mature mobile digital technology. GSM also proved very successful in Asia, with its huge untapped markets that had no analog legacy to overcome. By 1997, over 200 GSM networks were running in 110 countries, with more than 55 million subscribers. As of January 2001, 392 GSM networks were operational in 162 countries, with dozens more planned. GSM had 457 million subscribers (up from 162 million a year and a half earlier) out of 647 million digital subscribers; another 68 million subscribers continued on analog systems.
The biggest holdout has been the United States, where the government has played no role in selecting a standard, and where a major rival to GSM, CDMA, has won many converts. Overall, the U.S. market is split among three standards: CDMA, GSM, and TDMA (a standard similar to GSM, but incompatible with it). By September 2000, CDMA had 71 million subscribers, whereas TDMA claimed 53.5 million. Each network operates independently of the others.
While many technology experts argue that CDMA is a superior technology, the advantage appears to be with Europe at this point. Simply put, GSM phones are much more usable worldwide. This wider usage base has allowed Europe to move ahead in phone functionality. Nokia is leading the charge, pioneering Internet access on cell phones. Through infrared technology, phones can transmit data to each other or to a machine; in Finland, this technology can be used to purchase a Coke from a soda machine. CDMA's acknowledged technological superiority is similar to that enjoyed by Betamax and the Macintosh. As history has taught us, neither was able to prevail.
The battle between CDMA and GSM may well be settled as we move to the next generation of wireless technology: so-called 3G or third-generation wireless systems featuring very high data transmission rates that will allow for two-way video communication. It is expected that most mobile operators will converge on a single worldwide standard for 3G systems.
Industry Cost Structure and Shared Infrastructure
The prevailing cost structure in an industry -- those costs primarily related to production, and to some extent to management and marketing -- has a deep impact on whether and how soon that industry becomes organized. This impact can be measured in terms of the relative significance of the industry's fixed costs versus its variable costs. As an industry emphasizes automation, incorporates new technology, and tries to mitigate the high or growing cost of human capital, it tends to increase fixed costs and lower variable ones.
Participation in an industry always has certain requisite fixed costs. In all aspects of business -- from procurement to operations to marketing -- relative market share determines spending efficiency. Thus, when it comes to national advertising and sales, for example, a company that has a 40 percent share of the market is potentially four or more times more efficient than a company with a 10 percent share. These are examples of fixed costs; that is, a company incurs them regardless of how high or low its sales are. Once a company has made the decision to target a particular market, it has to pay the piper no matter how great or small revenues promise to be. As we have observed, those industries in which such fixed costs tend to dominate are more likely to exhibit a pronounced Rule of Three structure.
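The arithmetic behind this fixed-cost advantage can be sketched in a few lines of Python. The advertising budget and market size below are hypothetical figures chosen purely for illustration; the point is simply that the same fixed cost, spread over four times the sales volume, works out to one quarter the cost per unit.

```python
# Hypothetical illustration of the fixed-cost advantage of market share.
# A national ad campaign costs the same regardless of how much a company
# sells, so the cost *per unit sold* falls as market share rises.

def fixed_cost_per_unit(fixed_cost, market_size, share):
    """Fixed cost spread over the units a company actually sells."""
    units_sold = market_size * share
    return fixed_cost / units_sold

AD_BUDGET = 50_000_000   # hypothetical national ad campaign, in dollars
MARKET = 10_000_000      # hypothetical total units sold industry-wide

big = fixed_cost_per_unit(AD_BUDGET, MARKET, 0.40)    # 40 percent share
small = fixed_cost_per_unit(AD_BUDGET, MARKET, 0.10)  # 10 percent share

print(f"40% share: ${big:.2f} per unit")    # $12.50 per unit
print(f"10% share: ${small:.2f} per unit")  # $50.00 per unit
print(f"advantage: {small / big:.0f}x")     # 4x
```

The same logic applies to any fixed cost, from a sales force to a factory: the larger player's spending efficiency per unit scales directly with its share of the market.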
If the costs to participate are high, the "minimum efficient scale" needed to attain efficiency in operations is also high. As a result, the shakeout in the industry happens sooner rather than later. In contrast, markets in the so-called agricultural age were characterized by near perfect competition: many small producers and buyers interacted in the marketplace, where prices were set according to the relative balance between supply and demand. The agricultural sector has predominantly variable costs: the costs of seed, fertilizer, and labor fluctuate with conditions but are always linked to the volume of production. About the only fixed cost is the cost of land, which in many countries is typically inherited. During the agricultural age, therefore, shakeouts were kept to a minimum. As farming has become commercialized, however, economies of scale have developed, driving many family-owned farms out of the business.
Cost structure also makes its impact felt through the supply function. If the supplier industries enjoy significant economies of scale because of their cost structure, downstream industries also feel the pressure to consolidate, even though their own cost structure may not require or adequately support such a move. The two major suppliers to the personal computer industry -- Microsoft and Intel -- are dominant in their respective spaces, for example. Despite the lower entry and exit barriers associated with PC assembly, this dominance still creates pressures for concentration downstream.
Likewise, a high concentration of customers puts additional pressure on the industry to consolidate. In the industry comprising defense contractors, where the U.S. Department of Defense is by far the overwhelming customer, the number of defense contractors has fallen steeply in recent years. Also, a substitute industry that has a higher fixed-cost component will enjoy a price advantage. This too creates pressures on an industry to consolidate and become more fixed-cost intensive.
Although many people assume that fixed costs are bad for business, this is not necessarily the case. As the primary source of scale economies, fixed costs are an essential element in the competitive strategy for volume-driven players such as full-line generalists.
A Shared Infrastructure
In addition to fixed and variable costs that individual players in a market must consider, the market as a whole can move toward greater organization by developing a shared infrastructure for the purpose of increasing efficiency. Infrastructure costs are generally too high to be loaded on the transactions generated by any one company. Banks, for example, would be unable to survive if they did not share an infrastructure for check clearing, as well as for credit card authorization (through the Visa and MasterCard systems). Similarly, airlines require shared infrastructures for reservations, air traffic control, baggage handling, and ground services. Fundamentally such an infrastructure distributes the heavy cost of implementation, thereby making the system more affordable for all the players, large and small, in the industry.
To be useful in enhancing efficiency, an industry infrastructure must be:
- Sharable: it must allow for simultaneous access by many users.
- Ubiquitous: it needs to be where you want it, when you want it.
- Easy to use: it must be intuitive and require little or no training to use effectively.
- Cost effective: it must be accessible and affordable to all.
By far the most significant recent example of a shared infrastructure is the Internet. Regarded by many as the most important invention of our time, the Internet has become a major new infrastructure for virtually all businesses of any size, whether new or old. From an obscure tool used by researchers and academics at government-funded laboratories and universities, the Internet has exploded into the world of commerce. The starting point was a simple but brilliant innovation: the World Wide Web (see sidebar "Berners-Lee and the World Wide Web"). By general agreement it is comparable in its impact to Johann Gutenberg's invention of movable type more than five centuries ago.
Berners-Lee and the World Wide Web
Tim Berners-Lee worked as a computer scientist at CERN, the international particle physics laboratory in Switzerland. It was his innovative idea that became the basis for the World Wide Web. In 1973, Vint Cerf and Bob Kahn had devised the Internet's core protocol suite, the Transmission Control Protocol/Internet Protocol (TCP/IP), which has been described as "one of the great technological breakthroughs of the twentieth century."
Berners-Lee came up with two simple innovations that enable people to navigate between previously unrelated sources or Web sites. Building on the Internet technology, he created a global hypertext system by inserting links from one text to another. He named one of his innovations the Hypertext Transfer Protocol, now better known to Web surfers in its abbreviated form, http. In addition, he devised a way of identifying a document using the Uniform Resource Locator, or URL. Today these are common terms in Internet traffic, although the public may not know their full names and functions.
Created in 1989, the Web is arguably an essential element of the infrastructure, not just for business and commerce, but also for governments, personal communications, community formation, and entertainment. As with the ideal infrastructure, it is not controlled by any one commercial entity, but evolves through the collective efforts of many. Forums of engineers, such as the World Wide Web Consortium, ensure that it functions well and evolves as needed. No company can unilaterally dictate that new features be added; nevertheless, standards are set faster than ever and are completely open. Because of this openness and malleability, the Internet has led to innovations at an incredible pace. MP3 is today's standard for compressing music files. The Java programming language has a place in practically all Web sites. Numerous other examples -- digital subscriber lines (DSL), broadband, electronic mail, teleconferencing, and the like -- indicate how fast this industry is moving in supplying products and services to individuals the world over.
Igniting one of the greatest explosions of wealth in history, the Web has also transformed the business community. The transformation has been both internal and external. Intranets, for example, have streamlined internal operating processes. Through extranets companies have developed closer linkages with their suppliers, alliance partners, and customers. The Web has fueled the growth of categories of commerce such as person-to-business and person-to-person.
Government Intervention
So far, most governments have resisted the temptation to try to control the Internet or regulate its functions. At the urging of their constituents, government officials have preferred to adopt a hands-off approach. Nevertheless, the government can and often does play an important role in determining an industry's structure, including triggering major consolidation. Often the government itself is a major customer -- the Department of Defense exemplifies a customer with deep pockets. The significance of the government's role as a buyer is even more pronounced in Europe than it is in the United States.
A major funder of research and development as well as a major buyer, the federal government has a significant impact on the pace and direction of technological change in many industries. In some cases, the government also facilitates cooperation within an industry, especially at the "pre-competitive" stage. Japan's Ministry of International Trade and Industry (MITI) has been the most prominent example of this kind of facilitator, although governments in Europe and the United States have participated in similar cooperative efforts.
For other industries -- for example, education, health care services, and computers -- the government helps to move the industry toward standardized products and processes. The government may intervene, for instance, if it sees that an important market is failing to achieve efficiency on its own. When too many companies were laying cable in the telephone and communications industry, each hoping to gain monopoly power by establishing itself as the leader with proprietary products, the U.S. government intervened by creating standards or sanctioning "natural monopolies" to generate efficiency. A similar intervention in the U.S. railroad industry established a much-needed standard for operations and had immediate effects on the players' profitability.
The Railroad Industry
In the middle of the nineteenth century, the railroad industry took off in the United States. Long before anyone had an inkling of the automobile industry, people saw a "natural" fit between the railroads and the physical size of the country with its vast stretches of undeveloped land. The railroads, however, developed haphazardly, primarily because the industry was so fragmented with many small, inefficient players and because there were no uniform standards. The most telling omission was that the U.S. railroad industry lacked a uniform gauge (the distance between the tracks). Goods had to be transferred between railroad carriers at points where rail lines of different gauges intersected -- a highly expensive and inefficient procedure.
The U.S. government, understandably, was concerned with the speedy construction of the railroad system. In the 1850s, federal, state, and local governments stimulated the growth of the industry, granting charters (or in some cases actually building the lines), as well as providing money and credit for many private railroads. The federal government conducted surveys at taxpayer expense and reduced the tariff on iron used by the railroads. Before 1860, the government provided almost 25 million acres of land for railroad construction, with two main stipulations: (1) the railroads would transport government property and Union troops for free, and (2) Congress would set rates for mail traffic. The federal land grant program expanded rapidly after the Civil War ended in 1865.
Battles and explosions during the war significantly damaged the railway system, destroying miles of track and rendering equipment unusable. After the war ended, government officials and industry executives wisely undertook a rehabilitation program that at last specified a standard gauge of 4 feet 8 1/2 inches for all tracks. By 1880, 80 percent of the mileage had been converted to this standard. By 1890, virtually the entire network had been brought into compliance, ensuring that the railroad industry became more efficient and extended its reach to more remote regions. Now that everybody was running on the same track, the railroad companies themselves became much more serious targets for mergers and acquisitions. Accordingly, the industry rapidly became more concentrated.
The railroads increased their hold on power, such that demands for the regulation of the industry grew loud and urgent. In 1887, the federal government passed the Interstate Commerce Act, creating the Interstate Commerce Commission (ICC), which became a major force in the development of a federal regulatory policy.
The rail industry peaked in 1920; after that date, other modes of transportation -- particularly the automobile and the airplane -- reduced the importance of the rail system. In the 1920s, severe competition from outside the industry caused many passenger railroads, though seemingly in their prime, to cease operations. Intercity passenger rail would not be reborn until the creation of Amtrak some 40 years later, and only then because of massive tax subsidies.
Industry Consolidation
Over the last several years, we have witnessed a record number of mergers, as well as numerous demergers (the spinning out of noncore businesses). As a result, the landscape of just about every major industry has changed in a significant way. The pace of this consolidation is startling: the number of mergers per year in the United States has more than tripled over the past decade, while the value of those mergers has risen tenfold. Between 1997 and the end of 2000, nearly $5 trillion in mergers took place in the United States alone. The most recent large mergers and acquisitions have occurred in the telecommunications, banking, entertainment, and food industries, as indicated in the accompanying tables.
While the United States has been at the forefront of this trend, M&A activity has been feverish on the global level as well. At the time, few experts believed that 1998's record of $2.52 trillion in global M&A activity would soon be broken; however, total worldwide transactions announced in 1999 reached $3.43 trillion, exceeding the previous record by an astounding 36 percent. In 2000, the total reached $3.5 trillion, growing only slightly over 1999 activity. The uncertain market environment in late 2000 and early 2001 has dampened merger activity worldwide; however, we expect that it will rebound as markets recover. Appendix 1 presents an encapsulated history of merger activity in the United States during the twentieth century.
Europe has been a particularly fertile area for some of these recent megadeals, particularly in telecommunications, utilities, banking, and the retail sector. M&A activity in Europe more than doubled in 1999, totaling $1.2 trillion. This total includes United Kingdom-based Vodafone Airtouch's $203 billion offer for Germany's Mannesmann AG, the largest deal ever. France's two largest retailers, the hypermarket operators Carrefour SA and Promodes, merged to form a $52 billion giant, now the world's second-largest retailer after Wal-Mart. The globalization of retailing, an industry long believed unlikely to globalize, appears to be well underway; Carrefour and Promodes are already prominent across Europe as well as in Latin America. Likewise, Arkansas-based Wal-Mart has been expanding south into Latin America as well as east into Europe.
Even Japan, a nation for years thought to be an uncongenial place for mergers, is experiencing a markedly accelerated pace of M&A activity. Because Japanese markets and culture did not generally support mergers, most of the country's industries remained highly fragmented. In the past, Japan's extremely low cost of capital and its cozy keiretsu relationships contributed to keeping an excessive number of full-line generalists afloat. A proliferation of major players is evident in most industries: for example, seven major camera makers (Canon, Nikon, Asahi Pentax, Minolta, Yashica, Fuji, and Konica); seven big car companies (Toyota, Nissan, Honda, Mazda, Mitsubishi, Subaru, and Isuzu); and several consumer electronics companies (Sony, Matsushita, Hitachi, Mitsubishi, and Toshiba). Gradually, however, merger activity has been on the increase. In 1999, M&A volume in Japan tripled over 1998 levels, though still amounting to only $78 billion. Mergers in Japan are starting to focus on industry consolidation and the "unbundling" of conglomerates.
As more industries globalize, a larger percentage of mergers involve firms from different countries. Such cross-border M&A activity has risen fivefold over the past decade. In terms of total value, cross-border mergers reached $720 billion in 1999. As a share of world GDP, they increased from 0.5 percent in 1987 to 2 percent in 1999. Industries that previously could not expand in such a manner for operational reasons are now able to do so. Retailers, for example, can use new technologies to manage cross-border supply chains and centralized purchasing for multiple countries.
Recently NationsBank completed its merger with BankAmerica in a $60 billion stock deal. SBC Communications acquired Ameritech for $62 billion in stock. British and Swedish drug groups Zeneca Group plc and Astra AB announced plans to join forces in what was until then Europe's largest merger, following Hoechst and Rhone Poulenc's merger of their life science units to form Aventis, and an all-French merger between Sanofi and Synthelabo. Ciba Specialty Chemicals and Clariant, two of the largest players in the rapidly growing specialty chemical industry, are merging. Exxon and Mobil combined to form the world's largest oil company, fast on the heels of the merger between BP and Amoco (Royal Dutch/Shell rounds out the major players in that industry). The European banking sector, following economic and monetary union, is rapidly consolidating across national boundaries. French banks Société Générale and Paribas have announced plans to combine to form Europe's second biggest bank, behind the Deutsche Bank/Bankers Trust merger of 1999 and ahead of Switzerland's UBS AG.
Clearly we are witnessing a reorganization of the patterns of corporate ownership, as well as the risks involved in business participation -- namely, those businesses a company should enter as opposed to the ones it should exit. The current wave of mergers and de-mergers represents a historic rationalization of "who does what and for whom." In general, the result is improved market efficiency, lower prices for customers, and higher returns for investors.
Industries tend to become more efficient as they undergo consolidation. In a highly fragmented market, especially one in which growth has begun to slow, numerous small, inefficient players recognize that it is to their advantage to join together or combine with larger companies that can command greater economies of scale and scope. The drive for efficiency transforms an unorganized market with myriad players into an organized one in which the number of players rapidly drops. By acquiring small companies (as General Motors did in the automobile industry) or by creating a de facto standard (as Ford did in the assembly-line process of building the Model T), one player makes the turn and becomes a broad-based supplier. From this point in the market's evolution, the Rule of Three comes into play. In most cases, two additional players are also able to evolve into full-line generalists.
The Software Industry
The personal computer software industry started up in the early 1980s. At the outset, there were hundreds of small, mostly anonymous firms vying for position. Essentially a cottage industry, software was primarily a technology business, and scale was not much of a factor. In a fateful decision, IBM selected Microsoft to provide the DOS operating system for its personal computers, thereby giving Gates's company the enormous advantage of owning the dominant standard. Over the next several years, three other companies emerged as significant players, each as a product specialist: Lotus, which had acquired spreadsheet technology from VisiCalc; WordPerfect, which fast became synonymous with word processing; and Novell, which staked out an early position in the networking arena.
Microsoft gradually leveraged its extraordinary advantage in operating system software to establish a commanding position in applications. Although it was initially unable to challenge Novell in networking with LAN Manager, it developed competitive products in word processing (Word) and spreadsheets (Excel). Microsoft was the first to sell software applications in bundled form, inventing the concept of a "suite" of applications that shared some features and allowed information to be readily transferred and accessed across them.
Microsoft thus became the first full-line generalist in the market, setting in motion an inevitable restructuring of the entire industry. Lotus, for example, soon realized that if it wanted to continue to grow, it had to reduce its overwhelming dependence on a single product (Lotus 1-2-3) and broaden its product line. By acquiring the word processor Ami Pro and developing the presentation graphics package Freelance, Lotus became the industry's second full-line generalist. WordPerfect was even more dependent on its namesake word processor than Lotus had been on its spreadsheet; it tried, but failed, to develop a viable full line of products on its own, including PlanPerfect and WordPerfect Presentations. Finally, it was forced to merge with Novell. Even so, the combined company had to acquire Borland's Quattro Pro spreadsheet to complete its package of offerings.
Over time, the Big 3 added database, electronic mail, and many other categories of software to their lines. Although the market still included hundreds of specialists, they essentially ceded the large applications -- word processing, spreadsheets, presentation graphics, databases, networking, and electronic mail -- to the Big 3. Gradually, however, Microsoft's dominance in operating systems, superior marketing, and overwhelming financial advantage increased its dominance in the office suite domain to well over 90 percent of the market. It thus left its two main competitors with a share of less than 10 percent of the market to divide between them, in effect forcing both of them into the ditch.
To be sure, the poor execution of its competitors helped Microsoft achieve this high level of success. WordPerfect, for example, could have leveraged a major asset -- its enormous number of devoted customers -- to expand its product offerings in the word processing market. Instead, the company made a classic mistake: it failed to develop a version of its program for the Windows operating system until two years after Microsoft had delivered Word for Windows. By then, it was so far behind that it could never catch up. When WordPerfect later created a suite of its own by coupling its word processor with Borland's spreadsheet, the applications lacked common controls and made little headway against Microsoft's smoothly integrated products.
The U.S. Airline Industry
After World War I, several European aviation companies hired wartime pilots to fly decommissioned warplanes along the first commercial air routes. Aided by heavy subsidies from European governments, KLM and the forerunners of such well-known carriers as British Airways and Air France began operations during the 1920s.
In the United States, airlines emerged primarily as a result of the U.S. Post Office's attempts in 1919 to establish a nationwide airmail service. In fact, the Post Office played a leading role in setting up the system of airports across the nation. In 1925, Congress passed the Air Mail (Kelly) Act, authorizing the postmaster general to use private contractors to provide airmail service. The creation of a number of private air transport companies was not far behind, some of which began carrying human beings as well as the mail.
In response to this increased activity, Congress passed the Air Commerce Act of 1926 and instructed the secretary of commerce to "foster air commerce, designate and establish airways, operate and maintain aids to air navigation, license pilots and aircraft, and investigate accidents." As a whole, however, the American public was too enamored of the automobile and the Roaring Twenties to take much interest in flying. Then, in 1927, Charles Lindbergh captured headlines with his solo transatlantic flight to Paris. Suddenly air travel became the rage, and new companies seemingly sprang up overnight: Pan Am and TWA were both founded in 1928; Delta followed in 1929; American Airlines was formed in 1930 out of the combination of many small mail carriers; and United Airlines was created in 1931 through a merger of several older mail-carrying operations.
Boeing and Lockheed introduced the first planes specifically designed for passenger service. Douglas Aircraft dominated the skies with its DC-3s, DC-4s, and DC-6s, but in 1957 Boeing beat Douglas into the jet age with the 707, the first American-built commercial jetliner. For a time the launch of larger aircraft lowered the cost of air travel. The number of passengers grew from a few thousand in 1930 to about 2 million in 1939. By the end of the 1940s, the number of air passengers topped 16.7 million.
The Civil Aeronautics Board (CAB), created under the Civil Aeronautics Act of 1938, regulated this new industry, with authority to establish routes, fares, and safety standards. In addition, the CAB heard complaints from the traveling public and settled disputes with the airlines. The Federal Aviation Act of 1958 established the Federal Aviation Administration (FAA), which was entrusted with overseeing the air traffic control system, certifying pilots, and establishing standard safety precautions for the industry. When the CAB was dissolved in 1984 as part of the government-directed deregulation of the airline industry, its remaining responsibilities passed in effect to the FAA.
Deregulation allowed the airline companies to set their own routes and, after 1982, their own fares. When competitive forces were at last unleashed, the industry experienced rapid change: fare wars, new incentive plans to placate employees, and innovative promotions to attract customers. Many new airlines were spawned in the deregulated industry (their number grew from 36 in 1978 to 96 in 1983), most of them serving localized geographic niches. Between 1980 and 1983, as companies competed on low prices and waged fare wars even as new competitors entered the market, the industry suffered losses of $1.2 billion.
American Airlines introduced its AAdvantage frequent-flier program in 1981. Lower fares and heightened competitive activity in the 1980s led to rapid industry growth in terms of customers served: an increase from 297 million passengers in 1980 to over 455 million in 1988. A decade later, that number rose to a record 551 million passengers.
The financial problems that many airlines faced led to increased labor strife, bankruptcies, and for some carriers the prospect of being acquired. Delta bought Northeast; Pan American took over National; TWA acquired Ozark Airlines in 1986; Northwest gobbled up Republic; US Airways pocketed Pacific Southwest. Texas Air/Continental acquired People Express and Eastern Airlines, the latter of which shut down entirely in January 1991 after operating for two years under Chapter 11 bankruptcy provisions. In 1987, Delta bought Western Airlines. In 1989, US Airways acquired Piedmont. Continental, America West, and Pan American entered Chapter 11 in 1990 and 1991, but only the first two emerged to resume full operations. For some time TWA managed to keep body and soul together, but it was eventually acquired by American Airlines in March 2001.
Consolidation of the airline industry continues both in the United States and in Europe. The current Big 6 in the United States appear close to becoming the Big 3, dividing up nearly 85 percent of the domestic market. This consolidation will happen primarily through the mergers of several ditch airlines with one of the current market leaders, United, American, and Delta. The first salvos have already been fired: in addition to American's acquisition of TWA, UAL Corporation, the parent of United Airlines, announced in May 2000 its intention to buy US Airways Group. Meanwhile, Delta and Continental are in discussions to merge. With the advent of truly "open skies," the global consolidation of this industry is not far off.
The Pharmaceutical Industry
In the $300 billion global pharmaceutical industry, approximately 100 firms struggle for survival. The world leaders in drug discovery, U.S. pharmaceutical companies currently develop about half of all new medicines, accounting for about 40 percent of the market. European giants round out the top ten firms. But major changes now occurring in the industry illustrate the enormous effects of the four mechanisms discussed in this chapter. Consolidation over the past 15 years has whittled the more than two dozen multinationals down to about 15. Companies are exiting non-health-care businesses, increasing spending on research and development, acquiring or partnering with genomic and drug discovery companies, growing their sales forces, and increasing advertising expenditures.
Which firms will be the victors? Which will be driven from the market? Currently, all major pharmaceutical companies are in or near the ditch. The largest, Merck, commands a meager 10.9 percent of the market. Growth in the industry is a direct result of new products (innovation). New drugs, however, do not come without the high risk and cost of R&D. On average, twenty cents of every dollar of revenue is reinvested in R&D, yet only one out of every 250 drugs that enter preclinical testing makes it through the approval process. The average time-to-market is 12 years, an eternity in any industry. Only a third of approved drugs ever recover the cost of their research and development. When the cost of failures is amortized across these few successes, the estimated cost of bringing a new drug to market reaches $500 million. Despite such obstacles, the demand for new drugs keeps rising.
Three key factors in today's marketplace are creating demand. First, customers -- particularly those over age 65, a group that consumes three times as many drugs as those under 65 -- eagerly await new product releases. It is no surprise that in the past decade over 150 new medications have targeted diseases of the elderly, and currently there are more than 600 drugs in R&D aimed at seniors. Yet according to the World Health Organization, three-quarters of the 2,500 currently recognized medical conditions lack adequate therapies. With the rapid increase in the world's senior population, the demand for pharmaceutical products for society's aging will continue to rise at staggering rates.
Second, enrollment in plans such as health maintenance organizations (HMOs) and preferred provider organizations (PPOs) has swelled over the past twenty years. Managed care, which in the 1980s accounted for approximately 30 percent of the pharmaceutical market, now covers 83 percent of private-sector employees, and its share of the market will soon reach an estimated 90 percent. It is now widely accepted that effective self-care is much more cost-efficient than treatments requiring hospitalization or surgery. Given that leading-edge, branded drugs are fundamental to effective self-care, it is understandable that pharmaceutical companies are eager to respond to the increase in demand.
Third, since the Food and Drug Administration (FDA) has relaxed restrictions on direct-to-consumer advertisement over the past three years, advertising for drug products has surged. For instance, in 1998 Schering-Plough spent $200 million advertising the allergy pill Claritin to consumers. In 1999, pharmaceutical companies spent $1.8 billion on advertising to consumers with $1.1 billion of that going towards TV ads -- a 40 percent increase over 1998 ad budgets. The result of these campaigns is increased diagnosis and treatment (with drugs) of many unreported diseases and ailments. In fact, heavily advertised products enjoy an average increase in sales of 43 percent compared to 13 percent for those products not heavily advertised.
The Rule of Three identifies four key processes by which growing markets become efficient: creation of standards, shared infrastructure, government intervention, and consolidation. In the pharmaceutical industry, all four of these processes are in play. In the United States, the FDA sets stringent standards for the safety and efficacy of drug products. Other countries and unions of countries have similar organizations. Shared infrastructure is provided by government and privately funded research organizations around the world; the National Institutes of Health (NIH), for instance, furnishes basic scientific research to industry in the United States. Government intervention protects discoveries, unique processes, and intellectual property through patent laws. By granting a short-term monopoly on a product, patent law allows the innovating firm to recoup its enormous R&D expenses.
Finally, the industry is consolidating as the Rule of Three predicts. In the past decade, there have been 27 consolidations of significant pharmaceutical companies and numerous consolidations of smaller firms. Acquisitions and alliances between big pharmaceutical and biotechnology companies have also taken place.
A leading cause of this industry consolidation is shareholder demand for high return in exchange for high risk. In evaluating a possible merger, firms look for synergies such as those that brought Pharmacia and Upjohn together in 1995. Pharmacia had many drugs in its pipeline but was weak in U.S. marketing, whereas Upjohn was just the opposite. The merger of the two companies produced a single entity with a pipeline full of products, a strong U.S. marketing presence, and $1 billion available for R&D.
Many pharmaceutical companies look to acquire competitors who have core competencies that differ from their own. Technologies such as drug delivery, drug discovery, and genomics characterize biotechnology companies but are lacking in most big pharmaceutical companies. Allowing for less expensive and more rapid development of novel therapies, these technologies complement the pharmaceutical industry's core competencies. The top 20 pharmaceutical companies combined have alliances with over 1,000 biotechnology companies.
Competitive pricing is another driver of industry consolidation. In most European countries and Japan, governments have strict pricing controls, profit controls, and prescribing controls. Such policies greatly reduce revenue and profitability for pharmaceutical companies. The United States is the only major market where pharmaceuticals are not yet restricted by government policies, but there are other agents of price controls at work. Contract purchasing by HMOs and prescription benefits managers (PBMs) has brought competition based on pricing to a new high. Replacing physicians as the gatekeepers for prescription drug allocation, powerful buying groups now demand lower prices and greater use of generic drugs. They generate preferred drug lists, or formularies, to which patients' benefits are directly linked, and they dictate which drugs can fill clients' prescriptions. The drug companies must acquiesce if they want their products to be included on the preferred drug list, even if they are sold at reduced prices.
Reduced prices, of course, reduce earnings. In response, industry consolidation can provide broader product lines and economies of scale, thus empowering the drug manufacturers in negotiations with buying groups and government agencies.
Price wars are fought on three fronts: between brand name drugs and generics, between brand name drugs in the same therapeutic category, and between comparable generics. In the United States, changes in regulatory policies have increased competition in all three areas, and each of these price wars contributes to further industry consolidation. Market share for generic drugs rose from 18 percent in 1984 to 47 percent in 1999 -- a rise attributed both to the purchasing power of managed care and to the 1984 Hatch-Waxman Act, which abbreviates the FDA approval process for generic drugs and allows manufacturers of generics to conduct their testing prior to the expiration of the brand name drug's patent. This provision has reduced barriers to market entry by lowering the cost of clinical testing and accelerating the time-to-market from the previous industry standard of three years to three months. Although patent protection is initially issued for 20 years on new brand name drugs, most new drugs are patented early in the development and approval stages. Thus, when a new drug finally enters the market, only 11 years of patent protection, on average, remain. Once a generic drug is available, sales of the brand name drug typically drop by 60 percent.
Competition between similar brand name drugs has become far fiercer as rival pharmaceutical companies have adopted fast-follower strategies. With recent advances in information technology and drug discovery technology, the period between the introduction of a breakthrough drug and its fast-follower rival can be less than a year. For example, Celebrex, an arthritis medication from Pfizer, was approved on December 31, 1998. It had less than five months of true market exclusivity before Vioxx, a similar arthritis drug from Merck, entered the market on May 21, 1999. Such rapid competition plays a significant role in keeping prices and earnings down.
In addition, generic drugs compete with each other on price. As numerous generic products become available for a particular drug, prices are driven down. Although there are a host of independent generic drug companies, it is important to note that many of the largest pharmaceutical companies own generic subsidiaries or divisions. For example, Novartis owns Geneva Pharmaceuticals.
Other pharmaceutical companies find success in the marketplace by carving out their own niches. Certain specialties, such as cancer therapies, will most likely provide a major niche in the industry. Because cancer drugs do not require as much sales and marketing effort as other classes of drugs, companies currently specializing in these treatments or other niche market segments are more likely to remain independent.
While coalescing into a handful of large players, the pharmaceutical industry has also been exiting non-core businesses. Novartis's crop-protection and seed businesses, for example, were spun off and merged with AstraZeneca's agrochemicals business to create Syngenta AG. In early 2000, Abbott Laboratories sold its agricultural products business unit to Sumitomo Chemical Company of Japan. In June 2000, American Home Products completed the sale of Cyanamid, its agricultural business, to BASF AG.
Although it is a patent-based industry, a type typically thought not susceptible to the Rule of Three, the U.S. pharmaceutical industry is no exception to the evolution of competitive markets that we see occurring in other industries. Patent protection does not provide an impenetrable shield against competition, and growth and efficiency factors will lead to further consolidation. From the current leaders in this industry, a Big 3 will eventually emerge, but not until a big shakeout reduces the number of legitimate players.
The Dynamics of Industry Shakeouts
As we have noted above in discussions of the agricultural age, industries in which there is "perfect" competition typically do not go through a major shakeout. Similarly, personal care and consumer service industries such as beauty shops, plumbing and heating companies, and repair shops have not experienced significant shakeouts. These industries are characterized by highly individualized attention and a large share of manual labor, and their operations are not scalable. In industries where consolidation is a major organizing force, however, shakeouts can have devastating as well as beneficial effects. It is important, therefore, to look at the causes and effects of industry shakeouts in more detail.
In recent years, few industries have escaped the destruction and turmoil resulting from shakeouts. Over the past two decades, victims of shakeouts (or their beneficiaries, depending on one's point of view) include airlines, automotive component producers, banks, biotechnology companies, boat builders, cable TV operators, construction contractors, defense contractors, department stores, health maintenance organizations (HMOs), hotels, minicomputer companies, newspapers, shopping malls, savings and loan companies, steel factories, trucking companies, makers of wine coolers, and wood-stove makers.
An industry is considered to have experienced a shakeout if 25 percent or more of its companies have disappeared within a short period. Like an earthquake, a shakeout brings major upheaval to an industry, changing its complexion, the mix of competitors, and the rules of competitive play. Also like earthquakes, shakeouts vary in their duration, intensity, causes, and effects.
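The authors' criterion above is quantitative: a shakeout means 25 percent or more of an industry's companies disappearing within a short period. As a purely illustrative aside, that test can be sketched in a few lines of code. The function below is our own sketch, not the authors'; the function name, the 25 percent threshold, and the three-year window are assumptions chosen to match the definition in the text.

```python
def shakeout_years(firm_counts, threshold=0.25, window=3):
    """Flag intervals in which an industry's firm count fell sharply.

    firm_counts: dict mapping year -> number of active firms.
    threshold: minimum fractional decline to count as a shakeout (default 25%).
    window: maximum span in years over which the decline must occur.
    Returns a list of (start_year, end_year, fractional_decline) tuples.
    """
    years = sorted(firm_counts)
    events = []
    for i, start in enumerate(years):
        for end in years[i + 1:]:
            if end - start > window:
                break  # decline took too long to qualify as a shakeout
            decline = (firm_counts[start] - firm_counts[end]) / firm_counts[start]
            if decline >= threshold:
                events.append((start, end, decline))
    return events

# Hypothetical data: 40 firms shrinking to 27 within two years
# is a 32.5% decline, so the 25% criterion flags a shakeout.
counts = {1980: 40, 1981: 38, 1982: 27, 1983: 26}
print(shakeout_years(counts))
```

Applied to real data, such a scan would flag, for example, the two-year loss of half the TV-based home shopping companies described below, while leaving a slowly thinning industry unflagged.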
- Duration: The length of shakeouts and the intensity of the resulting changes vary considerably, as the chart indicates. Some industries, including trucking, soft drinks, and electric housewares, experienced massive shakeouts that lasted a short period of time (one or two years). Other industries, such as mobile home manufacturing, oil drilling, microcomputers, smoke detectors, and telephone equipment manufacturing, have experienced longer, more protracted shakeouts that lasted several years.
- Intensity: If we judge the severity of a shakeout by the number of companies that exit the market, then some shakeouts have been very severe. For instance, the U.S. color TV industry lost over 90 percent of its producers in a single decade. Some shakeouts can be short in duration but severe in their devastation, as happened in the case of the TV-based home shopping industry that lost over 50 percent of its members within two years.
- Contributing causes and effects: If the shakeout is caused by a radical shift in consumer demand -- as happened when consumer preferences shifted away from electric typewriters to personal computers -- the results can be severe. Shakeouts, however, occur for many other reasons, including excess supply of goods and services, the overabundance of producers, radical shifts in societal, technological, and competitive conditions, and the emergence of innovative systems of management. Frequently, several factors combine to cause a shakeout. The more radical the shift, the more swift and devastating the ensuing shakeout, as happened after the deregulation of the airline and trucking industries.
When a shakeout occurs early in an industry's life cycle, as happened in the PC industry, it can indicate either the emergence of a dominant technological design or the existence of a "majority fallacy." In the first scenario, a major technological design is widely accepted by customers as the standard, but some companies do not or cannot adapt their manufacturing or marketing operations to match that design. Their inflexibility causes those companies to fail to meet customer expectations, leading ultimately to their exiting the industry. The shakeout rids the industry of weak competitors, giving survivors a bigger share of the growing industry.
In the second scenario, large numbers of individual entrepreneurs and established companies enter an industry, attracted primarily by its promise of quick growth and easy profits. They constitute a majority, persuaded that the industry has made its transition from startup to fast growth. Well financed and well managed, these later entrants proceed either to acquire the truly pioneering companies or to replace them. Because even a growing and prosperous industry cannot fulfill the expectations of every entrant, many fail to achieve market success. As a result, a shakeout occurs, ridding the industry of weak new entrants and inflexible pioneers. The IBM-compatible personal computer market, which went through such a shakeout in the mid-1980s, is a case in point.
A shakeout that occurs later in an industry's life cycle manifests a different combination of powerful forces. It often signals the industry's movement to the maturity stage, where demand plateaus, as happened in the tire industry, for example. This shift forces some companies to exit the industry because they fail to achieve appropriate profits. Moreover, as industry maturity approaches, product substitutes multiply and cause further declines in demand for an individual company's products. Further, mature industries often invite entry by foreign companies, forcing existing companies to scale down their expectations, exit the industry, or attempt to become globalized themselves.
Shakeouts in a mature industry can also be caused by "de-maturing." A technologically adept competitor can open a new competitive front by deploying technologies from outside the base industry. Typically, firms use some form of information technology to transform their business, by infusing intelligence and other attributes into their products or dramatically altering their production and operating processes. For example, in the late 1980s, Yamaha revived a moribund piano industry by developing a digital piano that could play itself (using instructions stored on a floppy disk), teach a novice how to play, or serve as a traditional piano. As a result, Yamaha altered the needed core competencies in the industry, and many competitors that lacked the requisite technological capabilities were forced to exit.
Industries may experience more than one shakeout. Because of the vast number of producers, the U.S. automobile industry experienced its first shakeout in 1920 and 1921. Having barely recovered, the industry went through a second massive shakeout in the early 1930s, triggered by the Great Depression. A third shakeout occurred in the 1940s as a result of the exorbitant costs of competing in a growing national market. As we will see in chapter 2, this shakeout led to a consolidation in which three firms came to dominate the industry's sales.
The auto industry is not alone in experiencing multiple shakeouts. The financial services industry has undergone similar changes: deregulation in the early 1980s led to a massive shakeout, and near the end of that decade global competitive forces and technological advances led to another.
Clearly, with rapid global and technological changes, executives can no longer accept the folk wisdom that shakeouts result just from industry maturity. Industries may experience multiple shakeouts at different points in their evolution. These shakeouts often require different strategies in order to ensure company survival.
On the one hand, forces of technology and globalization most often cause sudden changes in industry structure. These forces can be linked, as when the diffusion of technology across countries causes seismic changes in an industry. For example, the emergence of global players such as Airbus in the aerospace industry led to industry realignment and drove marginal players, first Lockheed and then McDonnell Douglas, out of the commercial aviation business.
On the other hand, market-driven shakeouts (through mergers and acquisitions) and those induced through gradual regulatory relaxation tend to have an evolutionary impact. In the case of a historically heavily regulated industry such as telecommunications, a complex network of forces is coming into play. All four of the drivers of change discussed in this chapter are having significant effects. Changes in regulatory policies in numerous countries are rapidly creating a highly globalized industry. Technological changes are coming at a rapid pace, driven by the convergent power of digital electronics. Customers' needs are escalating. The combination of these forces is leading to a shakeout in the industry on a global basis.
Early Warning Signs for Shakeouts
Forward-looking firms can anticipate an impending shakeout in their industry by observing the leading indicators of major change. Some of these indicators are industry specific. In the personal computer software business, for example, the sales of software development kits for a new operating system provide a strong signal of coming shifts in the industry. For the majority of industries, we can identify a number of "generic" indicators of major change and possibly an industry shakeout:
- The expiration of crucial patents long held by market leaders, thereby setting the stage for a majority fallacy, followed by a shakeout.
- Sudden and large changes in trade barriers (such as the adoption of free trade agreements), leading to an unsustainable proliferation of rivals in a market.
- Entry of new companies at a rate far in excess of market growth over a lengthy period, such that most players cannot continue to operate profitably in the industry.
- Prolonged capacity underutilization in the industry, suggesting building pressures for voluntary or forced market exit.
- Technology breakthroughs both within an industry and in industries producing substitute products, as well as a flurry of new patent activity in a core technology for an industry.
- The sudden influx of overseas competitors into a market, resulting when key technologies become accessible to new entrants or when an industry has been targeted as "strategic" in another country.
- Shakeouts in upstream or downstream industries.
It does not take a prophet to recognize the signs of coming change, but the signals are many and varied, and they can be misinterpreted. Companies, like individuals, often see and hear what they want to see and hear. They interpret the world in self-serving terms. They view threats to the existing order with great alarm. Rather than investing real and psychological capital in the status quo, they would be far better served by adopting a "crisis imminent" mind-set, one that prepares them for an industry shakeout at any point. We take up later (chapter 8) the primary causes of market disruptions, but first we turn attention in chapter 2 to a deeper analysis of the triumvirates that dominate or that are in the process of forming in major industries throughout the free markets of the world economy. What is so special about the notion of three major players in a competitive market? Why are there sometimes more than or fewer than three in a given industry? And how does their dominance affect the typically smaller niche players that somehow find the means not just of surviving in a highly competitive market, but of doing quite well?
Copyright © 2002 by Jagdish Sheth and Rajendra Sisodia
Product Details
- Publisher: Free Press (May 1, 2010)
- Length: 288 pages
- ISBN13: 9781439172933