A New Strategy Framework for Coping with Turbulence
In the past, strategy researchers have not focused on turbulent environments. Most of the extant frameworks in strategic management implicitly assume a benign environment, one that is simple and not very dynamic. Yet recent advances in technology, coupled with a global political climate that is favorable to free markets, have made parts of many industries, such as financial services, health care, and transportation, more turbulent. In the industries related to information and communication (what I call “Infocom”), traditional industry boundaries have disappeared. For firms competing in these industries, the organizational set, or the number of competitive forces that a firm faces, has expanded, and technological innovations have accelerated the rate of change. Coping with the resulting turbulence calls for a new approach to competitive strategy.
In their seminal paper on the causal texture of organizational environments, Emery and Trist classified a firm’s environment in terms of its complexity and its dynamism.1 The larger the firm’s organizational set, the greater the complexity that it faces.2 Complexity is a measure of the number of competitive configurations that a firm must ideally consider in shaping its own strategy.
The dynamism of the environment, i.e., the rate at which these configurations change over time, is the other key determinant of turbulence. When a business environment is highly complex and changing rapidly, the resulting turbulence makes orderly conduct among competitors more difficult.3
Here I focus on the turbulent environment of the newly emerging mega-industry, Infocom. I present a framework for conceptualizing the merging of its constituent industries and for coping with the resulting turbulence. The existing models of competitive strategy do not help firms seeking to compete in such environments. I offer an alternative framework that conceives of competitive strategy as flexible commitments: a paradoxical blend of early commitments (so vital for competitive success) and timely exits (crucial for managing risks).
I must note at the outset that not all Infocom firms experience turbulence; some are in simple, less dynamic niches. However, firms aspiring to be Infocom leaders are unlikely to operate in these niches. Market leaders must face and cope with turbulence by reconceptualizing strategy, sharing the responsibility for strategy more broadly within the firm, and focusing on organizational capabilities as the real source of competitive advantage.
Growing Turbulence in Infocom
There are four major clusters of Infocom industries: information providers, information processors, communication providers, and communication support (see Figure 1). Information providers include firms in the media, film, music, and publishing industries that are primarily concerned with creating and assembling new information. These firms also seek to add value by selectively archiving some of this information for future use. Information processors are firms that help manipulate information to serve individual user needs; a typical information processor belongs to the computer, office equipment, or software industry. Their traditional focus on data is changing, as information processors begin to attend to voice and moving images as well. Communication providers move information; they include the transporters of voice and data (the telephone companies, including cellular operators), the distributors of books, movies, and other entertainment software, and the broadcasting and cable TV companies. The final cluster, communication support, includes firms that provide equipment and services to support the communication providers: telecommunication equipment manufacturers that provide switching and transmission gear, and consumer electronics companies that provide the customer premises equipment (CPE) required to receive the transmitted information.
Mobility across some of these industries, even within the same cluster, was difficult until recently because of either technological differences or regulatory barriers. Today, it is not unusual for two distinct communication providers, like a cable company and a telephone company, to compete for the same customers. Intercluster competition is just as common, as in the case of a communication support company like Sony and an information processor like Apple. Impressive technological advances in the past decade have torn down many traditional barriers to cross-industry entry in Infocom. The growing political trend toward deregulation and market liberalization has been another contributing factor. In the United States, for example, the Telecommunications Act of 1996 promises to eliminate the barriers that currently separate the communication providers in Infocom. Worldwide, many of the regulations that restricted Infocom companies are also being progressively lifted.
Falling Entry and Mobility Barriers
Technological advances have lowered many entry and mobility barriers that have historically protected Infocom industries. Soon, a single portable handheld device will allow a user to make voice or videophone calls; pick up voice, video, or data mail messages; access and run remote computer application programs; browse through libraries or public databases; watch news broadcasts and films; and perhaps even play a computer video game. Information will be available via a communication utility in much the same way that electricity is today. Unlike electricity, however, the information on tap will be rich, diverse, and capable of being customized to suit a particular user’s needs. The provision of this new service will call for the seamless integration of various information forms like newspapers, books, scientific and business information, and a broadband, interactive communication channel.
Semiconductor devices can now process high-speed images and manage data communication interfaces at hundreds, even thousands, of megabits per second. As technological performance rises, the price of semiconductor devices continues to drop. Integration density and performance per chip have doubled every year and a half, while prices have remained virtually the same. Trends show that, every seven years, the cost of processing has fallen by a factor of ten. Tasks that only large mainframes costing millions of dollars could perform twenty years ago can now be done on machines costing a few thousand dollars; perhaps in another twenty years, the same power will cost only a few hundred dollars. This impressive improvement in performance, coupled with falling prices, has made it commercially feasible to extend the digitization of information beyond data to include voice, video, and moving images.
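As a rough check on the compounding these figures imply (my arithmetic, not a claim drawn from the sources cited), a tenfold drop in the cost of processing every seven years gives:

```latex
% Cost of processing under "10x cheaper every seven years" (illustrative)
\[
  \frac{c(t)}{c(0)} = 10^{-t/7}, \qquad
  \frac{c(20)}{c(0)} = 10^{-20/7} \approx \frac{1}{700}.
\]
```

A task that cost two million dollars twenty years ago would thus cost roughly three thousand dollars today, which matches the mainframe comparison above; sustained for another twenty years, the same trend would, if anything, overshoot the few-hundred-dollar projection.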
Today, with an appropriate add-on tuner card, people can watch TV broadcasts on a home computer screen. Televisions, on the other hand, have been part of many home computer setups and integral to computer-game configurations. Soon, a single telecomputer will be a fax-phone, a radio, a high-definition TV, an audio and video recorder/player, an image copier, and a computer. The television and home computer will increasingly resemble each other, with large data storage, processing, and interactive capabilities. Computer firms, such as IBM, Compaq, and Apple, and consumer electronics giants, such as Sony, Matsushita, and Philips, will increasingly compete with each other. Recent developments in Internet-related technologies will make this battle even more interesting. The telecomputer of the future might actually be a cheap device without large data storage and processing capabilities that relies instead on the Internet.
Similarly, computer and consumer electronics firms are now making telephones, answering machines, PABX systems, and sophisticated transmission and switching gear that were once made exclusively by telecommunication equipment manufacturers. Computer companies such as IBM and Hitachi are competing with Alcatel and AT&T for high-end telecommunication equipment like the asynchronous transfer mode (ATM) switch, and consumer electronics firms such as Sony and Philips compete with the telecommunication equipment manufacturers on telephones, answering machines, and low-end telecommunication systems.
In the communication provider industries, there have been equally impressive advances that make the integration of voice, data, and video transmission possible. Frame relay and cell transfer technologies transmit a mixture of image, data, and voice through self-healing, highly reliable networks. They are capable of providing a bandwidth-on-demand service. B-ISDN (broadband ISDN) has already enhanced the capacity of telephone networks to carry data. Newer technologies like the asymmetric digital subscriber line (ADSL) make it possible to squeeze even more capacity from existing twisted-pair telephone wiring. The growing use of fiber optics in communication networks enhances the broadband capability that they require for full voice-data-video information integration. The cost of producing a meter of fiber has dropped dramatically, from $11.00 in 1978 to an estimated $0.50 by 1998.4 The latest advances in security technologies will facilitate the conduct of commercial transactions electronically over a public switched telephone network. Also, technological advances will allow wireless networks to emerge faster and offer the same capacity that wireline services offer today.
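The fiber figures imply a remarkably steady rate of decline (again, my arithmetic rather than a figure from Gilder):

```latex
% Implied annual price decline for optical fiber, 1978-1998 (illustrative)
\[
  \left(\frac{\$0.50}{\$11.00}\right)^{1/20} \approx 0.86,
\]
```

that is, a price drop of roughly 14 percent per year, sustained over two decades.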
Microsoft is an example of a firm that has exploited the wide range of strategic alternatives now available in Infocom. The company began as an information processor, specializing in operating systems and applications software (see Figure 2). Recently, the company has made forays into other Infocom areas. For example, it has forged alliances with Sega, the video game maker, and DreamWorks SKG, the innovative film producer, to gain special access to distinct information content. The company has reportedly spent $200 million to help NBC launch an interactive news channel to compete with CNN, in exchange for exclusive on-line distribution rights to news produced by NBC. Microsoft has alliances with communication providers on interactive TV and various on-line services. Bill Gates also has an interest in Craig McCaw’s ambitious new venture, Teledesic, which will create a constellation of 840 low-altitude satellites to transmit signals from any point on the planet with the speed and capacity of fiber-optic cable. Finally, Microsoft’s alliance with General Magic makes Microsoft an important player in the CPE market as well.
Other firms such as Sony, Walt Disney, and Philips have also begun exploring multiple niches in Infocom, often in partnership with their competitors. The merits of this diversification strategy and the multiple alliances through which it is initiated are as yet unclear. Nevertheless, competing firms must take these strategic configurations into account in formulating their own strategies. Their competitive environment has become more complex.
Increasing Returns to Scale
The second feature of Infocom is the increasing returns to scale that this industry offers. As Arthur observes, parts of the economy that are resource-based (agriculture, mining, and so on) are subject to diminishing returns to scale.5 Products or companies that get ahead in such a market eventually run into limitations and reach a predictable equilibrium of prices and market shares. In knowledge-based industries, by contrast, there are increasing returns to scale rather than diminishing ones. Increasing returns do not generate equilibria but rather instability. The evolution of the microcomputer industry provides an illustration of Arthur’s thesis (see Figure 3).
The microcomputer or personal computer (PC) industry was born in the mid-1970s. Tandy, Commodore, and Apple all offered complete machines, but Apple’s user-friendly operating system gave it an early lead over its competitors. IBM did not enter the industry until 1981; by then, Apple had established itself as a market leader with a 20 percent market share. To regain lost ground and blunt Apple’s building momentum, IBM built partnerships with both Intel and Microsoft to launch its PC.
IBM’s name recognition gave it early market penetration, and the growing base of IBM/DOS users encouraged independent software vendors to develop new application software for the IBM PC. The greater variety of application software encouraged additional users to try the IBM PC, which led to a virtuous cycle of more users and more application software, making it the industry standard, despite Apple’s arguably more user-friendly operating system.
IBM’s decision to ally with both Intel and Microsoft for critical components and software was crucial to its early success. However, the terms of the alliance did not restrict either supplier from providing components and software to other manufacturers, which gave birth to IBM clones. As the price of the cloned PCs started dropping, more users flooded the market. This, in turn, encouraged Intel, Microsoft, and the independent software vendors to set an ambitious development strategy, increasingly independent of IBM. The availability of user-friendly applications further boosted PC sales, but now the hardware itself had become a mere commodity. The profits had migrated from hardware manufacturers to suppliers like Intel and Microsoft and distributors like Dell.
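The dynamic underlying this history can be made concrete with a deliberately stylized simulation (my own toy model, not Arthur’s published formulation): two technically identical products compete, each new adopter tends to follow the installed base, and small chance events early on decide which product locks in.

```python
import random

def adoption_race(n_adopters=10_000, feedback=0.8, seed=None):
    """Toy model of increasing returns: identical products A and B.

    Each new adopter joins the current market leader with probability
    `feedback` (the pull of a bigger installed base); otherwise the
    choice is a coin flip (the small chance events that tip the race).
    """
    rng = random.Random(seed)
    installed = {"A": 0, "B": 0}
    for _ in range(n_adopters):
        if rng.random() < feedback and installed["A"] != installed["B"]:
            pick = max(installed, key=installed.get)  # follow the leader
        else:
            pick = rng.choice(["A", "B"])             # chance event
        installed[pick] += 1
    return installed["A"] / n_adopters

# Same parameters every run; only the random seed differs. Each run
# locks in, but which product wins is unpredictable in advance:
print([round(adoption_race(seed=s), 2) for s in range(6)])
```

There is no single equilibrium to converge to: the process settles reliably into a lopsided market, but which product ends up on top varies from run to run, exactly the unpredictability that distinguishes increasing-returns competition from the resource-based industries Arthur contrasts it with.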
Innovation Dynamics
It is difficult for any one Infocom firm to maintain its dominance. At various times, Apple, IBM, Compaq, and, more recently, Intel and Microsoft have been the major beneficiaries of the PC revolution. That is about to change as the Internet makes it possible for the PC to borrow tailor-made applications software from the network, thus eliminating the need to have stand-alone computing power and applications versatility — the staples on which Intel and Microsoft have built their success. The World Wide Web and its system software could become the next dominant platform, superseding the personal computer and its operating system. Today, the PC world is filled with many interlinked players, each betting that its next innovation will further boost the industry’s demand and give it a temporary competitive advantage before another innovation comes along.
The digitization of information allows information content providers and communication conduit companies to add value in several ways (Figure 4 shows three such approaches for a content provider). Some firms have focused on speed by providing near-instant business intelligence on a wide range of items from stock market performance, to patent and other scientific information, to presentations at public conferences. Others have customized information and tailored textbooks to suit individual instructors’ preferences. Electronic newspapers combine both speed and customization. Finally, a number of innovations seek to extend the half-life of information, i.e., how long it is valued, through a range of archival products from technologically “freshened” movie classics to new databases stored conveniently and inexpensively on CD-ROMs for personal use.
The information providers and processors will compete to control the lion’s share of the value added through innovation. When the source of information is in the public domain, as in the case of financial, scientific, or technical information, information processing companies can more easily control the value added. In the case of other information, especially that pertaining to entertainment, the information providers will normally control the value added. However, information processing companies will be involved in developing the software needed to customize entertainment. Similarly, the communication providers and their respective support companies will compete to control the value added to the communication conduits (see Figure 5).
The value of a conduit is enhanced if it can be made specific to an individual consumer. Satellite broadcasting, for example, is a one-to-many communication conduit that is not specific to any viewer. However, through the use of antennas and decoders, as well as local communication links, satellites can target individual customers. This is, for example, the intent of the Teledesic project. A conduit is versatile if it can carry all forms of information: voice, data/image, and video. While satellite transmission is extremely versatile, the conventional telephone conduit is not; it was designed primarily for voice transmission. But technological advances have made it capable of handling data/image transmission and, more recently, even video signals, using new signal-compression technologies. The investment of telephone companies in fiber-optic cables and high-speed digital switches such as ATM should allow them to offer a fully versatile conduit. ATM switch technology breaks messages into packets and stuffs data packets into spaces between voice and video packets, thus expanding the carrying capacity of a telephone line.
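The packet-stuffing idea can be sketched in a few lines (a toy illustration of statistical multiplexing; real ATM uses fixed 53-byte cells and far richer traffic-management machinery):

```python
from collections import deque

def multiplex(slot_has_realtime, realtime_cells, data_cells):
    """Toy sketch of ATM-style multiplexing on a single line.

    Delay-sensitive voice/video cells take a slot the moment they are
    ready; queued data cells are stuffed into slots that would otherwise
    go out idle, which is how the switch expands effective capacity.
    """
    rt, data = deque(realtime_cells), deque(data_cells)
    line = []
    for rt_ready in slot_has_realtime:
        if rt_ready and rt:
            line.append(rt.popleft())    # voice/video goes out first
        elif data:
            line.append(data.popleft())  # data fills the gap
        else:
            line.append(None)            # truly idle slot
    return line

# Real-time traffic claims every other slot; data rides in the gaps:
print(multiplex([True, False, True, False], ["v1", "v2"], ["d1", "d2"]))
# -> ['v1', 'd1', 'v2', 'd2']
```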
Cable TV, another conduit alternative, offers specificity and versatility but does not have interactivity, the third important attribute in a conduit. A conduit is interactive when it allows two-way communication. This is not an inherent limitation of the cable network per se but is due to the lack of necessary broadband switches in the network and suitable communication equipment in the home. Radio or cellular transmission is the most specific conduit. The communication link is wherever the subscriber chooses to be — at work, at home, or in the car. Cellular radio communication is also interactive, but its versatility is currently somewhat limited. Future technological advances are expected to remove this limitation.
The communication providers prefer to have the necessary capability for improving the versatility, specificity, and interactivity of their conduits in their central office equipment. The communication support companies, on the other hand, prefer to create these features in CPEs. By going directly to the end user, the equipment manufacturers can bypass the power of well-entrenched communication providers.
The Internet, an open computer network already connecting some 40 million users worldwide, has blurred the distinction between computing and communication. With the help of Java, a programming language specially designed to run on a network, the Internet can potentially bring the power of a supercomputer to every desktop. The Internet is hardware independent and equally accessible to any terminal, whether a PC or Mac. It also liberates the user from the costs and hassles of frequent software updates; the user has access to the latest version of the software on the Internet. More importantly, the Internet will allow users to pay only for the software power that they actually use, rather than pay up front for a host of obscure applications (called “bloatware”) that the average person never uses. Industry watchers see a communications network that will become the primary repository of data and applications software; all a user will need for access is a $500 terminal, which even the game machine companies Nintendo and Sega can make.
The possibilities of the Internet clearly worry some industry leaders. Andrew Grove and Bill Gates, whose strategies have been aimed at increasing the power of the stand-alone desktop computer, point out that the low-cost, high-bandwidth, interactive communication pipelines needed to make the Internet a serious threat will be unavailable to most users globally, even by the year 2000.6 Faster microprocessors and clever software will still be required to harness the Internet’s power.
Actors in each Infocom cluster have different visions for the industry’s future. Each hopes to identify the best value-adding opportunities while reducing its rivals (whether information providers, processors, communication providers, or support companies) to the status of commodity suppliers. Given the matched strengths of the rivals, there are no obvious winners. The ensuing battle for value added will accelerate the rate of change for Infocom firms.
Coping with Turbulence: The Relevance of Existing Frameworks
Falling entry and mobility barriers, increasing returns to scale, and frequent innovations are trends that researchers have observed in other industries. D’Aveni defines “hypercompetition,” a condition similar to the turbulence I described, as an environmental phenomenon in which competitive advantages are rapidly created and eroded because of dynamic strategic interactions in four areas of competition: (1) cost and quality, (2) timing and know-how, (3) strongholds, and (4) deep pockets.7
In a hypercompetitive environment, advantages due to a low-cost position or superior features (what D’Aveni calls quality) are constantly challenged. In the case of Infocom, technological innovation allows features to improve constantly even as costs drop. Also, as I discussed in the previous section, frequent technological innovation quickly makes obsolete competitive advantages accruing from specialized know-how. Traditional barriers built around switching costs and captive distribution are being challenged by an open information architecture and access to a broadband communication utility. While financial resources continue to be a key to success in Infocom, given the many deep-pocket players that participate in it, they are no longer a source of competitive advantage. Infocom would thus appear to have all the features of a hypercompetitive environment.
However, turbulence and hypercompetition are not synonymous. In a turbulent industry, there are multiple, unpredictable equilibrium points. D’Aveni sees hypercompetition as a stop on the inevitable descent to perfect competition, one that firms find preferable to the destination itself:
“Thus, even though perfect competition is treated as the ‘equilibrium’ state in static economic models, it is neither a desired state nor a sustainable state from the perspective of corporations seeking profits. They would prefer low and moderate levels of competition but often settle for hypercompetitive markets because a small number of aggressive foreign corporations won’t cooperate enough to allow the old, more genteel levels of competition that existed in the past.”8
Competitors driving toward a single equilibrium do not cause turbulence. The positive feedback characteristics of Infocom allow several profitable equilibrium points. Companies participate in the hope of reaching one of these many equilibrium points, and the difficulty of distinguishing between winning and losing trajectories delays their exit. Turbulence is caused by multiple competitors behaving unpredictably. Hypercompetition, by contrast, assumes a less complex environment, one involving conflicts within and across the four areas of cost and quality, timing and know-how, strongholds, and deep pockets. Some popular frameworks for formulating competitive strategy assume an even more benign environment (see Table 1).
The Porter Framework
Porter’s framework is useful only if the competitive forces represented by competitors, suppliers, buyers, and substitutes are relatively stable and independent.9 Then a company can find an appropriate strategy for each industry configuration and erect the necessary barriers for protecting this strategy.
The traditional sources of competitive advantage — economies of scale, product differentiation, capital investments, switching costs, access to distribution channels, and government policy — have all lost their importance as barriers to competition in Infocom.10 Technological change quickly makes many of these barriers obsolete, capital resources are easily available to many large Infocom players either individually or in alliances, and government policy assumes a diminishing role in the Infocom industries.
The Hamel and Prahalad Approach
Hamel and Prahalad argue that the role of strategy should be not to accommodate an existing industry structure but rather to change it.11 They see the role of competitive innovation as identifying the orthodoxy in an incumbent’s strategy and redefining the terms of engagement to exploit this orthodoxy. Of particular relevance to the Infocom industry is their example of a poorly resourced competitor, Canon, which successfully attacked Xerox’s dominance in the office products industry. Canon’s innovation of decentralized copying was driven primarily by its unreasonable aspiration, what Hamel and Prahalad term “strategic intent,” to vanquish Xerox, supported by its ability to leverage its available core competencies in optics and microelectronics, and its foresight to layer these with other required competencies.
But underneath Canon’s war cry of “Beat Xerox” was the assumption that Xerox would stay its course. Canon benchmarked the terms of engagement on the quality, reliability, and product variety of Xerox. The Canon innovation, at least initially, was to offer the same or reduced functionality at substantially lower prices than the industry leader’s. But Infocom is not evolving as predictably, and its leaders will not sit still for as long as Xerox did.
The D’Aveni Framework
Frameworks for competitive strategy based on input/output economics, exemplified by Porter’s model, emphasize the important role of industry structure in determining a firm’s strategy. Other competing frameworks question the determinism implicit in this approach and focus instead on the firm — its competencies, organizational capabilities, and, above all, its managerial ingenuity — as the key drivers of its strategy. The appeal of the Hamel and Prahalad concepts of competitive innovation and competence leverage is, in part, their promise for overcoming difficult industry environments and tight resource conditions. However, even these brilliant strategic victories can be short-lived in a hypercompetitive environment.12
D’Aveni proposes the so-called “New 7S Framework” to deal with the fleeting nature of competitive advantage. At its essence, the framework assumes that every new strategy will beget quick competitive retaliation, soon neutralizing the bases for competitive advantage. D’Aveni therefore prescribes a strategy that continually seeks to change the rules of the game, even when a firm is successful and ahead of its competitors. He extends the Hamel and Prahalad notion of competitive innovation into a continual process rather than a one-time breakthrough. Departing from Hamel and Prahalad, however, he proposes that a firm should carefully guard its strategic intent from its competitors and seek instead to confuse them. Speed and surprise are the two competitive advantages in the D’Aveni framework. The primary driver of strategy is strategic soothsaying: anticipating future customer needs through constant monitoring of stakeholder (especially customer) satisfaction.
Whereas the D’Aveni framework is refreshingly new in its emphasis on competitive dynamics, it has two limitations. First, it assumes that strategic soothsaying is possible; second, it assumes that competitive battles are won or lost by the firm’s own actions in the four areas of competition: cost and quality, timing and know-how, strongholds, and deep pockets. Especially troubling is D’Aveni’s reference to Microsoft as a hypercompetitive firm.13 While D’Aveni is right in pointing out that competitive advantage in a hypercompetitive environment is fleeting, his belief that a firm can continuously move from one advantage to the next is misplaced. It is highly unlikely that any firm, even one as capable as Microsoft, can achieve such uninterrupted success in a truly turbulent environment. Arthur points out that, while “corporate maneuvering” (an economist’s term for strategy) can be helpful, winners in a turbulent environment are also picked by external circumstances and luck.14
It is not then a choice between a home run and a series of base hits, as D’Aveni puts it; turbulence will cause firms to strike out as well.15 While some firms may, on average, cope better with turbulence, they cannot do so consistently. When asked how a corporation can survive in a positive-feedback environment, Arthur noted more humbly: “The essence of surviving in a positive-feedback environment is to be highly adaptive. If the flow is in your direction, go with it; if it isn’t, don’t resist — retreat.”16
Companies cannot manage turbulence, but they can cope with it. Next I present a framework for coping with turbulence.
Toward a New Framework for Competitive Strategy
The three elements of my proposed framework are: reconceptualizing strategy, sharing the responsibility for strategy more broadly within the firm, and focusing on organizational capabilities as the real source of competitive advantage.
Reconceptualizing Strategy
· Repeat First Mover.
A strategy for coping with the growing turbulence of Infocom must rely on changing the rules of the game not once but repeatedly. Edward McCracken, president of Silicon Graphics, advocates creating chaos to cope with chaos: “The key to achieving competitive advantage isn’t reacting to chaos; it’s producing that chaos. And the key to being a chaos producer is being an innovation leader.”17
While some would question whether Silicon Graphics has lived up to its philosophy of being an innovation leader, the idea of repeat innovation as a strategy for coping with industry turbulence is appealing. On the face of it, the idea seems to contradict previous theorizing on first-mover advantages. In their thoughtful survey of first-mover advantages (and disadvantages), Lieberman and Montgomery note that a first-mover strategy has severe disadvantages when followers can “free ride” on an innovation, when technological and commercial uncertainties are high, and when technology or customer needs are fluid.18 This would seem to apply to the Infocom industries. However, unlike other previously studied industries, the life cycle of an innovation in Infocom is very short, a few months in some cases. The fundamental support for a first-mover strategy comes from this ever-present threat of substitution.
A strategy is viable if it can be defended by distinct resources. Distinctiveness is defined by the degree of difficulty in procuring, imitating, or substituting these resources.19 In manufacturing industries, the primary resources that are typically used to gain competitive advantage are tangible and intangible assets, which are difficult to either procure or imitate. For example, Caterpillar used its brand name, manufacturing capacities, sales network, and worldwide spare-parts distribution and servicing capabilities as distinct barriers that kept competition at bay. In contrast, in the knowledge-based Infocom industries, the resources that give a firm competitive advantage are its know-how and skills. Because of the fast pace of technological change in Infocom, competencies are often not imitated but, rather, substituted. In the earlier PC industry example, IBM succeeded in displacing Apple by introducing a new PC architecture. Compaq, a player with a substantially smaller asset base, displaced IBM, again through an innovative product design. The more recent challenges to the PC come not from asset-rich competitors but from industry upstarts like Sun and Netscape.
Unlike the competitive context of manufacturing industries, in which the challenger avoids head-on competition until it has assembled an asset base comparable to that of the industry leader, challengers in the Infocom sectors do not show any such courtesy to the leader. Each successive challenger takes the industry leader head-on by offering a product or service that offers the same or better performance at a substantially lower price. Maintaining leadership in a turbulent industry, under such constant assault, requires repeat innovation. Canon, for example, launched its inkjet printers despite the damage to the company’s dominant position in laser printers. If it had not done so, a competitor would have. The failure of Apple, IBM, and, more recently, Compaq to hold on to their leads in the PC industry is due in part to their inability to make their strategies obsolete quickly enough before being attacked by a competitor.
· Managing Network Effects.
Entering a market first is a necessary condition for success, but it is not sufficient; first movers are not guaranteed success. Prodigy, for example, was the first to offer on-line services but has since lost ground to America Online and CompuServe. In addition to being a first mover, a firm aspiring to lead in a turbulent environment must also be quick to build a customer network centered on its product or service offering. The resulting “network effect” can lead to a virtuous cycle beneficial to the nodal firm.
For example, when Novell introduced NetWare, a network operating system for connecting PCs in a local network, it first made sure that its product was technically superior. It then discounted the product heavily to build an installed base of users and also set up special incentives to encourage software developers to write for NetWare. The more software that was available for NetWare, the more popular it became, benefiting both Novell and its software developers. Novell managed the cross-product positive feedbacks actively to lock in its market and, subsequently, made huge profits from upgrades, spin-offs, and applications of its own.
Clearly, larger networks have an ineluctable advantage over smaller ones. The per-user cost to a provider does not increase with the number of users; in fact, it may drop. At the same time, a network with many consumers typically provides more value to each of them than a smaller network does, so consumers may be willing to pay higher prices to join a large network than to join a small one. Profit per user can therefore rise with the number of users. As more and more users flock to a product or service, it becomes “locked in.”
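One common stylization of this logic (a Metcalfe-style sketch of my own, not a formula from the sources cited) makes the asymmetry explicit:

```latex
% Value grows superlinearly in users; cost grows (at most) linearly
\[
  v(n) \approx \alpha n, \qquad c(n) \approx \beta, \qquad
  \pi(n) = n\,\bigl[v(n) - c(n)\bigr] \approx \alpha n^{2} - \beta n,
\]
```

where each of the n users derives value (v) roughly proportional to the number of other users reachable, while the provider’s cost per user (c) stays flat. Value per user rises with n and cost per user does not, which is why adoption feeds on itself once a network pulls ahead.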
Some have argued that the network effects may lead to the lock-in of inferior technologies.20 The lock-in of JVC’s VHS format for videotapes instead of Sony’s Betamax format or the dominance of the Qwerty keyboard over the rival Dvorak keyboard are examples. Liebowitz and Margolis argue that, contrary to some suggestions, VHS and Qwerty were not inferior to the competing technologies then available.21 Despite enjoying a monopoly for nearly two years, Sony lost to JVC because of customer preference for VHS’s longer recording time rather than the easier transportability of Betamax. Similarly, they point out that most of the claims of Dvorak’s superiority are dubious and can, in fact, be traced back to its patent owner, August Dvorak. For the Infocom industries, well-informed consumers are unlikely to have sustained loyalty for an inferior technology.
While it is important for a first mover to build a customer network early in the innovation process (without waiting for the perfect technical offering), its technology must be at least on par with that of its competitors. Moreover, it must continually upgrade its product or service offering to prevent customers from switching to rival offerings. Microsoft, for example, has been able to maintain its lead in operating systems by offering easy upgrades from DOS to Windows to Windows 95 to Windows NT. A customer network is not an asset that a firm owns but is a fickle system of influence that the firm must nurture continuously through its own offerings and the complementary offerings of its alliance partners. As Arthur observes, technologies do not exist alone but in an interlinked web or ecology.22 Left unmanaged, the network can become useless very quickly. On the other hand, as the Microsoft example suggests, a well-managed network can be a powerful launching pad for related products and services.
· Going with the Flow.
If network effects are important to an innovation’s success, it follows that it is very important for a company to sense the flow of each innovation and avoid entry when a product market is close to lock-in. Steve Jobs’s NeXT workstation, for example, was technically good but not distinctive enough to overcome the “network effects” of Sun Microsystems and Hewlett-Packard in this market.
An early exit when the market momentum is unfavorable may be equally important. When Microsoft was devising its own on-line strategy, it saw the Internet as a competitor to the Microsoft Network (MSN). When the company realized that its MSN-centric approach would lead to a proprietary strategy isolating MSN from the rest of the on-line world, it quickly embraced the Internet and adopted its standards. It licensed Java from Sun Microsystems and promised that its own Blackbird suite of on-line development tools would be Web-compatible. While Microsoft may not dominate the Internet as it did the PC market, its quick about-face has at least given the company a fighting chance to compete with Sun and Netscape.
We can compare a first-mover initiative to a financial call option.23 Options create flexibility, and in a turbulent world the ability to value and use flexibility is critical. But commitment is equally important. Being a first mover creates options for a firm, but these must be backed by investments in a customer network for the innovation to pay off. The challenge for a firm is to be both flexible and committed. Ghemawat identifies the trade-off in what he calls the ratio of the learn rate (the rate at which useful feedback is received on whether the chosen course of action is right) to the burn rate (the rate at which commitment to the chosen course of action is accumulating).24 Strategic investments in support of innovative strategies, and in the networks required to sustain them, are acceptable only as long as the ratio of learn rate to burn rate is high. Straightforward as this may sound, abandoning a strategy in midstream is not easy. Looking at strategic investments as call options is useful in this regard; after all, not all options make money. Incentive systems must then begin to reward senior managers for timely exits from markets when the flow is unfavorable to the firm, just as they now reward timely entries when the flow is favorable.
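A toy backward-induction sketch (my construction, not a model from Ghemawat or from Dixit and Pindyck) shows why the option framing rewards timely exit:

```python
def staged_option_value(payoff, p, stage_costs):
    """Value a first-mover bet staged as a chain of call options.

    payoff:      value captured if the innovation ultimately wins
    p:           per-stage probability that feedback stays favorable
                 (held fixed here for simplicity; the 'learn')
    stage_costs: commitment paid at each stage to continue (the 'burn')
    """
    value = payoff
    for cost in reversed(stage_costs):  # work backwards from the payoff
        # Exiting (value 0) beats paying more than the stage is worth:
        value = max(0.0, p * value - cost)
    return value

# Three staged commitments of 10 toward a possible payoff of 100:
print(round(staged_option_value(payoff=100, p=0.8, stage_costs=[10, 10, 10]), 2))
# -> 26.8
```

The max(0, ·) at each stage is the exit decision: once the expected value of continuing no longer covers the cost of staying in, the firm walks away, and its loss is capped at what it has already burned. That cap, not the size of any single payoff, is what makes a portfolio of such options tenable under turbulence.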
Sharing Responsibility for Strategy
· Guiding Philosophy.
I suggested earlier that coping with turbulence calls for repeat innovation. Hamel and Prahalad propose that such innovation in a firm is sparked by a stretching strategic intent.25 However, it is difficult to define an enduring strategic intent in a turbulent environment. Worse still, such an intent may lock the firm into an investment path that is unsuitable to the changed conditions of its environment. For example, was NEC wise in staying with its computers and communications (C&C) mission? The uncoupling of AT&T’s communication and computer businesses raises questions about the synergies between the two. Perhaps NEC would have been better off if it had moved from hardware to the more lucrative software and information sectors of Infocom, as Sony has done.
Collins and Porras distinguish between a firm’s guiding philosophy and its tangible image.26 They define the guiding philosophy as a system of fundamental motivating assumptions, principles, values, and tenets — the purpose of an organization and its core beliefs and values. Purpose is the broad arena in which the firm seeks to contribute to society, whereas its core beliefs and values define how it will achieve its purpose. Tangible image, on the other hand, consists of a mission that clearly focuses the organization’s efforts and a vivid description by which the mission becomes alive and engaging. The concept of “strategic intent” is ill defined, but based on the examples that Hamel and Prahalad provide, strategic intent appears to be what Collins and Porras call a tangible image.
In turbulent environments, a tangible image can be useful if it is seen as provisional. It is a map, although a flawed one, that helps focus an organization’s energies in the face of ambiguity.27 A successful coping strategy in a turbulent environment requires constant experimentation with several images until the firm finds a successful flow. Holding on to one tangible image, no matter how carefully crafted, can trap a firm in a market that it cannot dominate. There is a growing chorus of Infocom CEOs, including Andrew Grove and Lou Gerstner, who question the value of their firms’ strategic intent. What might be more useful is a less structured guiding philosophy. Silicon Graphics’ guiding philosophy, for example, is that the computer screen will become a window into a virtual world. Unlike strategic intent, this vision is vague; there are no explicit or implicit enemies to be vanquished. The vision merely states an industry niche, visual computing, that the company is betting on. It tells employees where they should look for opportunities.
· Context Awareness.
By relying on a guiding philosophy rather than a tangible image or strategic intent, top management is, in fact, democratizing the strategy-making process. Strategy making is no longer a top-down process; the impetus for new strategic innovations will have to come from bottom-up entrepreneurship. Many firms in Infocom have flattened their organizations, not so much to eliminate middle managers or take advantage of advances in corporate information systems, but to create the small, highly focused units that must innovate.
How can a CEO add value to the strategy-making process in a flat organization? The literature on strategy making in new-venture divisions is relevant. Burgelman, in describing the social learning process associated with the development of a new venture, connects the opportunistic actions of frontline managers with the vision of top management.28 The initial uncertainty concerning outcomes and the ambiguity of fit with the corporate vision are progressively reduced through an intense process of communication between frontline managers and senior management. Successful entrepreneurship requires both the willingness to experiment outside a plan and the ability to communicate freely and debate openly the value of the resulting outcomes. A firm’s fuzzy vision and opportunistic actions must be reconciled in an ever-changing array of tangible images.
This continuous reconciliation would be impossible without a top team that is well versed in both the context of the firm’s business and its technologies. Industry leaders like Andrew Grove, Bill Gates, Scott McNealy of Sun Microsystems, James Clark of Netscape, Larry Ellison of Oracle, Steven Case of America Online, and Craig McCaw of Teledesic represent this new breed of CEOs. They may not shape every innovation in their firms, but they certainly have the savvy to evaluate innovative ideas that their frontline managers propose. They also can sense whether the firm’s innovations are in the flow; they are chief vigilance officers. Grove helped move Intel out of the storage device business; and Gates was behind Microsoft’s recognition of the Internet’s possibilities and its decision to trim the company’s ambitions for its own on-line strategy.
Focusing on Organizational Capabilities
The typical bases for a firm’s competitive advantage are its distinctive competencies.29 However, many firms competing in Infocom have deep pockets and are undifferentiated in their distinctive assets or skills. In such an environment, what gives a firm a competitive advantage are its organizational capabilities for leveraging, strengthening, and diversifying these resources.30 Leveraging refers to the firm’s ability to share and exploit its competencies in the pursuit of new opportunities. The current strategy literature has discussed the importance of competence leverage at length.31 What needs more emphasis is the strengthening and diversification of competencies.
Competence strengthening is primarily the articulation of the firm’s tacit know-how and the combining of multiple competencies to build “metacompetencies.” Many successful international competitors such as Sony in consumer electronics have built their successes not on any single distinctive competence but rather on their ability to combine competencies (in which they were not world leaders) into a distinctive metacompetence, such as miniaturization skills. Competence diversification, on the other hand, is the importing of skills, knowledge, and assets from other firms, building new tacit skills and know-how within the firm, and retiring an old competence. Without retiring old competencies, a firm may have difficulty developing or assimilating new competencies. Intel, for example, had to leave the storage device business before it could establish its world-class superiority in microprocessors.
Leveraging, strengthening, and diversification of competencies can often be at odds with each other. Organizational processes that facilitate leveraging do not provide the autonomy and time required for strengthening a competence. Likewise, an emphasis on diversification can distract from competence strengthening. In companies such as Sharp, competence strengthening is the responsibility of top-level functional committees. Central R&D and product group heads are responsible for competence diversification. Every business unit manager is responsible for competence leveraging. Top management balances the relative emphasis on the three competence management capabilities by balancing the power of the various organizational units.
Conclusion
The emerging Infocom mega-industry provides an interesting laboratory for testing the adequacy of existing frameworks for competitive strategy. I suggest that they are not designed to deal with the kind of turbulence that we are witnessing in this industry. Strategists may have to assume a humbler role in dealing with turbulence. While being a first mover and an innovator will help, it is insufficient. Investing in and growing a customer base can enhance the chances of success, but success in the end is determined by industry forces outside the firm’s control. “Go with the flow” is not an inspiring strategy but perhaps the best a firm can do when confronted with turbulence.
While victory is far from assured for every innovation, it appears that market leadership will go to the firm that is a repeat innovator. Each innovation should be treated as an investment in a call option — sometimes it will make money and other times not. But the ability to develop and preserve these options is vital to long-term success and keeps a firm flexible.
To support flexibility, a firm needs to rely more on its frontline entrepreneurs, who are close to the business pulse and can sense the flow of innovation. Constraining them with a top-down strategic intent can be counterproductive, but their innovative ideas need to be channeled within a guiding philosophy: a broad vision of the opportunities in which the firm seeks to participate. Also, the added latitude given to frontline managers must be counterbalanced by a context-aware top management, which must know the business and be familiar with its supporting technologies.
Finally, a firm’s true competitive advantage for coping with turbulence is not in its current distinctive competencies, but in those that it can grow tomorrow. A firm’s organizational ability to leverage and strengthen existing competencies is important, but it must be equally adept at diversifying its competence base. Top management’s skills in managing the tensions among these dynamics are a firm’s real source of competitive advantage. Instead of worrying about designing the right strategic architecture, top management should devote more energy to building a proper organizational architecture.
References
1. F. Emery and E. Trist, “The Causal Texture of Organizational Environments,” Human Relations, volume 18, February 1965, pp. 21–32.
2. W. Evan, “The Organizational Set: Toward a Theory of Interorganizational Relations,” in J. Thompson, ed., Approaches to Organizational Design (Pittsburgh: University of Pittsburgh Press, 1966).
3. R.H. Miles, Macro-Organizational Behavior (Santa Monica, California: Goodyear Publishing, 1980).
4. G. Gilder, “Into the Telecosm,” Harvard Business Review, volume 69, March–April 1991, p. 155.
5. W.B. Arthur, “Increasing Returns and the New World of Business,” Harvard Business Review, volume 74, July–August 1996, pp. 100–111.
6. B. Schlender, “A Conversation with the Lords of Wintel,” Fortune, 8 July 1996, p. 46.
7. R.A. D’Aveni, Hypercompetition: Managing the Dynamics of Strategic Maneuvering (New York: Free Press, 1994).
8. Ibid., p. 29.
9. M.E. Porter, Competitive Strategy: Techniques for Analyzing Industries and Competitors (New York: Free Press, 1980); and
M.E. Porter, Competitive Advantage (New York: Free Press, 1985).
10. Porter (1980).
11. G. Hamel and C.K. Prahalad, “Strategic Intent,” Harvard Business Review, volume 67, May–June 1989, pp. 63–76;
G. Hamel and C.K. Prahalad, “Strategy as Stretch and Leverage,” Harvard Business Review, volume 71, March–April 1993, pp. 75–84; and
C.K. Prahalad and G. Hamel, “The Core Competence of the Corporation,” Harvard Business Review, volume 68, May–June 1990, pp. 79–91.
12. D’Aveni (1994).
13. Ibid., p. 2.
14. W.B. Arthur, “Positive Feedbacks in the Economy,” McKinsey Quarterly, number 1, 1994, pp. 81–95.
15. D’Aveni (1994), p. 15.
16. Arthur (1994), p. 90.
17. S.E. Prokesch, “Mastering Chaos at the High-Tech Frontier: An Interview with Silicon Graphics’s Ed McCracken,” Harvard Business Review, volume 71, November–December 1993, p. 136.
18. M.B. Lieberman and D.B. Montgomery, “First-Mover Advantages,” Strategic Management Journal, volume 9, 1988, pp. 47–58.
19. I. Dierickx and K. Cool, “Asset Stock Accumulation and Sustainability of Competitive Advantage,” Management Science, volume 35, December 1989, pp. 1504–1511.
20. See, for example:
Arthur (1996).
21. S. Liebowitz and S. Margolis, “Don’t Handcuff Technology,” Upside, September 1995.
22. Arthur (1996).
23. A.K. Dixit and R.S. Pindyck, “The Options Approach to Capital Investment,” Harvard Business Review, volume 73, May–June 1995, pp. 105–118.
24. P. Ghemawat, Commitment: The Dynamics of Strategy (New York: Free Press, 1991).
25. Hamel and Prahalad (1993).
26. J.C. Collins and J.I. Porras, “Organizational Vision and Visionary Organization,” California Management Review, volume 34, Fall 1991, pp. 30–52.
27. M.B. McCaskey, The Executive Challenge: Managing Change and Ambiguity (Boston: Pitman, 1982).
28. R.A. Burgelman, “Strategy Making as a Social Learning Process: The Case of Internal Corporate Venturing,” Interfaces, volume 18, May–June 1988, pp. 74–85.
29. Prahalad and Hamel (1990).
30. Y. Doz and B.S. Chakravarthy, “Managing Competence Dynamics” (Mexico City: Strategic Management Society Conference, paper, 1995).
31. Hamel and Prahalad (1993).