Measuring and Managing Technological Knowledge
“Knowledge is power.” — Francis Bacon
As we move from the industrial age into the information age, knowledge is becoming an ever more central force behind the competitive success of firms and even nations. Nonaka has commented, “In an economy where the only certainty is uncertainty, the one sure source of lasting competitive advantage is knowledge.”1 Philosophers have analyzed the nature of knowledge for millennia; in the past half-century, cognitive and computer scientists have pursued it with increased vigor. But it has turned out that information is much easier to store, describe, and manipulate than is knowledge. One consequence is that, although an organization’s knowledge base may be its single most important asset, its very intangibility makes it difficult to manage systematically.2
The goal of this paper is to present a framework for measuring and understanding one particular type of knowledge: technological knowledge, i.e., knowledge about how to produce goods and services. We can use this framework to more precisely map, evaluate, and compare levels of knowledge. The level of knowledge that a process has reached determines how it should be controlled, whether and how it can be automated, the key tasks of the workforce, and other major aspects of its management. Better knowledge of key variables leads to better performance without incremental physical investment.
Two examples illustrate the importance of technological knowledge in the form of detailed process understanding. Chaparral Steel, a minimill, was able to double output from its original electric furnace and caster. Semiconductor companies routinely increase yields on their chip fabrication lines from below 40 percent to above 80 percent during a period of several years. In these cases, the incremental capital investments are minimal. The improvements are instead due to multiple changes in the manufacturing process, including different procedures, adjustments of controls, changes in raw material recipes, etc. Why weren’t these changes implemented at startup? The reason is that the knowledge about the process and how to run it is incomplete and develops gradually through various kinds of learning.
Many authors have noted that there is a difference between data and information. A few have also noted that there is a difference between information and knowledge.3 Although not always clear-cut, the distinction among the three in production processes is very important. Data are what come directly from sensors, reporting on the measured level of some variable. Information is “data that have been organized or given structure — that is, placed in context — and thus endowed with meaning.”4 Information tells the current or past status of some part of the production system. Knowledge goes further; it allows the making of predictions, causal associations, or prescriptive decisions about what to do.
For example, consider a stream of measurements of the critical dimension of a series of supposedly identical manufactured parts — raw data. If the data are plotted on a control chart, they provide information about the status of the production process for those parts. The measurements may have a trend, may be beyond the process control limit, may be out of the allowed tolerance, or may even show no discernible pattern. All of these are information, but not knowledge. Knowledge about the process might include, “When the control chart looks like that, it usually means machine A needs to be recalibrated” (causal association and prescriptive decision), or “When the control chart is in control for the first hour of a new batch, it usually remains that way for the rest of the shift” (prediction). This paper is about technological knowledge, not data or information.
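The data-information-knowledge distinction can be sketched in code. In this illustrative example (all numeric values and the recalibration rule are hypothetical, not taken from any real process), raw measurements become information when placed in the context of control limits, and knowledge is a causal, prescriptive rule that maps a chart pattern to an action:

```python
# Sketch of the data -> information -> knowledge progression for a
# stream of part measurements. All numbers and rules are hypothetical.

def to_information(data, mean=10.0, sigma=0.05):
    """Information: place raw data in context via 3-sigma control limits."""
    ucl, lcl = mean + 3 * sigma, mean - 3 * sigma
    return [{"value": x, "in_control": lcl <= x <= ucl} for x in data]

def apply_knowledge(info):
    """Knowledge: a causal association plus a prescriptive decision."""
    if any(not point["in_control"] for point in info):
        return "Recalibrate machine A"   # hypothetical causal rule
    return "No action needed"

data = [10.02, 9.98, 10.01, 10.21]       # raw sensor readings (data)
info = to_information(data)              # process status (information)
print(apply_knowledge(info))             # prescriptive decision (knowledge)
```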
To explain why some types of knowledge are more complete and useful than others, a colleague and I developed an ordinal scale for describing how much is known about a process. Originally we studied ramp-up of new production in high-tech industries (VLSI fabrication, hard disk drives). Subsequently, we found that the same concepts worked well in traditional industries such as firearms, pulp and paper, and steel cord.5
In the next section, I give a detailed scheme for measuring the extent of technological knowledge and several brief examples, ranging from semiconductors to consulting. The third section examines the implications of the level of knowledge for how to manage production processes. The fourth section looks at learning, i.e., the evolution of knowledge over time. In the penultimate section, I use a familiar technology, baking, as an extended illustration. In conclusion, I look at some of the implications for managing technological knowledge itself.
A Scale for Measuring Knowledge about a Process
A company’s knowledge about its processes may range from total ignorance about how they work to very formal and accurate mathematical models.6 For our purposes, a process is defined as any repetitive system for producing a product or service, including the people, machines, procedures, and software in that system. A process has inputs, outputs, and state variables that characterize what is happening inside it. The inputs are often further broken down into raw materials, control variables, and environmental variables (see Figure 1). For example, environmental variables include temperature, humidity, air pressure, dust, seismic vibration, and electrical power.
Here I define technological knowledge as understanding the effects of the input variables on the output. Mathematically, the process output, Y, is an unknown function f of the inputs, x: Y=f(x); x is always a vector (of indeterminate dimension). Then technological knowledge is knowledge about the arguments and behavior of the function f(x).7 The manager’s or process engineer’s goal is to manipulate the raw materials, controls, and environment to get output that is as good as possible. It is customary to treat the environmental variables as exogenous and uncontrollable. However, with enough knowledge, the environmental variables can be turned into control variables and, therefore, are not exogenous.
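The definition Y = f(x) can be made concrete with a toy stand-in for f. In this sketch, the partition of x into raw materials, controls, and environment follows the text, but the function itself and every variable name are illustrative assumptions:

```python
# Minimal sketch of the process definition Y = f(x), with the input
# vector x split into raw materials, control variables, and
# environmental variables. The function f here is a made-up toy model.

def process_output(raw_materials, controls, environment):
    """A stand-in for the (in practice unknown) function f(x)."""
    quality = 100.0
    # Toy effects: impurities hurt, off-nominal controls hurt,
    # hot ambient conditions hurt.
    quality -= 50.0 * raw_materials["impurity_fraction"]
    quality -= abs(controls["temperature_C"] - 350) * 0.1
    quality -= max(0.0, environment["ambient_C"] - 25) * 0.5
    return quality

y = process_output(
    raw_materials={"impurity_fraction": 0.02},
    controls={"temperature_C": 355},
    environment={"ambient_C": 30},
)
print(y)
```

With enough knowledge, a term could move from the `environment` argument to the `controls` argument, which is exactly the point made above about environmental variables ceasing to be exogenous.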
I start by looking at well-defined manufacturing processes such as building a car door or cooking in a fast-food restaurant. Later I will show how knowledge about less tangible processes, such as marketing and legal services, can be described by the same scale. Whatever the process, better technological knowledge gives the operators better ability to manage the process effectively.
I have identified eight stages of technological knowledge, ranging from complete ignorance to complete understanding. Each stage describes the knowledge about a particular input variable xi’s effect on the process output, Y. Why so many stages? We are used to the idea of a spectrum of knowledge “from art to science,” and intuition suggests that three or four stages should be enough to describe it. Most analyses of production processes, however, look only at things that are already reasonably well understood. Variables in the first three stages are usually considered exogenous, in that it is impossible to control them. Nonetheless, it is important to recognize their existence, since important variables may be at one of those stages and management of the process needs to take that into account. The stages are summarized in Table 1.
In contrast to most approaches for measuring knowledge, the nature of the knowledge changes qualitatively with each stage in this framework. The process of learning from one stage to the next also changes. Each stage is described as follows:
Stage One — Complete ignorance. You do not know that a phenomenon exists, or if you are aware of its existence, you have no inkling that it may be relevant to your process. The history of technology is full of phenomena that were initially not recognized, yet had potentially major effects on a production process (e.g., quantum mechanics, germs in the treatment of wounds, contamination in a number of processes). At stage one, there is nothing you can do with the variable, and its effects on the process appear as random disturbances.
Stage Two — Awareness. You know that the phenomenon exists and that it might be relevant to your process. There is still no way to use the variable in your process, but you can begin to investigate it in order to get to the next stage. Learning from stage one to stage two often occurs by serendipity, by making analogies to seemingly unrelated processes, or by bringing knowledge from outside the organization.
Stage Three — Measure. You can measure the variables accurately, perhaps with some effort. This requires development and installation of specific instrumentation. Stage three variables cannot be controlled. However, if the variable is important enough, you can alter the process in response to the variable in order to exploit or ameliorate its effects. An example of a stage three variable is weather; many outdoor processes are halted or done differently during bad weather.
There are two kinds of learning at stage three. One kind consists of passive, natural experiments to determine the relationship between this variable and the output. A second learning process studies ways of controlling the variable to reach stage four, control. Knowledge about how to control the variable is, in effect, a subprocess with its own inputs and output (the level of the input variable for the main process). For certain variables, knowing how to measure it (stage three) leads almost automatically to knowing how to control it (stage four). These are primarily variables where feedback-based control is feasible, such as furnace temperatures.
Stage Four — Control of the mean. You know how to control the variables accurately across a range of levels, although the control is not necessarily precise. That is, you can control the mean level, but there is some variance around that level. Stage four provides a quantum leap in process control, since, at a minimum, you can now stabilize the process with respect to the mean of that variable. Variables that were previously viewed as exogenous disturbances to the process can now be treated as control variables. Reaching stage four also makes further learning easier, because you can now perform controlled experiments on the variable to quantify its impact on the process.
Stage Five — Process capability (control of the variance). You can control the variables with precision across a range of values. When all of the important variables reach stage five, your process can manufacture products by following a “cookbook,” i.e., a consistent recipe. The product still may not meet quality standards, however, so final inspection will be needed.
Learning from stage four to stage five is a matter of learning to control the various disturbances that affect the input variable. This is a nested subproblem that passes through the stages of knowledge on the way to good control of the input variable. That is, producing the correct level of an input, x, is a process in its own right and must be learned. Fortunately, accumulated technological knowledge gives cookbook methods for controlling many variables. The process engineer can look it up in a catalog or handbook. This means that you do not have to “reinvent the wheel” each time; you just have to learn enough to control the variable using known “wheels.”
Stage Six — Process characterization (know how). You know how small changes in the variable affect the result.8 Now you can begin to fine-tune the process to reduce costs and to change product characteristics. You can also institute some feedback control on the output using any stage six variable that is both easy to change and has a major impact. This increases the quality of the output by reducing its variability. To reach stage six, you run controlled experiments with different levels of the variable to determine its effects.
Stage Seven — Know why. You have a scientific model of the process and how it operates over a broad region, including nonlinear and interaction effects of this variable with other variables. At this stage, you can actually optimize the process with respect to the stage seven variables. Feedback and some feed-forward control are broadly effective. Control can be turned over to microprocessors, which will be able to handle most contingencies. You can even use your knowledge to simulate the process to study settings you have never tried empirically, such as ways of making new products using the same process. Learning from stage six to stage seven involves tapping scientific models, running broad experiments across multiple variables to estimate the models, and finding interactions among input variables.
Stage Eight — Complete knowledge. You know the complete functional form and parameter values that determine the result, Y, as a function of all the inputs. Process and environment are so well understood that you can head off any problems in advance by feed-forward control. Stage eight is never reached in practice because it requires knowing all the interactions among variables. However, it can be approached asymptotically by studying the process in more and more detail.
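The eight stages lend themselves to a simple data structure, so that a team can tag each process variable with its current stage. The stage names below follow the descriptions above; the "feasible actions" mapping is an illustrative reading of those descriptions, not part of the original framework:

```python
# Sketch of the eight-stage scale as an enumeration, with a helper
# that reads off what each stage makes feasible. The action mapping
# is an interpretation of the stage descriptions, not a formal rule.

from enum import IntEnum

class Stage(IntEnum):
    IGNORANCE = 1            # don't know the variable exists
    AWARENESS = 2            # know it exists, suspect relevance
    MEASURE = 3              # can measure, cannot control
    CONTROL_MEAN = 4         # can set the mean level
    PROCESS_CAPABILITY = 5   # can control the variance ("cookbook")
    CHARACTERIZATION = 6     # know how small changes affect output
    KNOW_WHY = 7             # scientific model; can optimize/simulate
    COMPLETE = 8             # full functional form (asymptotic ideal)

def feasible_actions(stage):
    actions = []
    if stage >= Stage.MEASURE:
        actions.append("measure and respond")
    if stage >= Stage.CONTROL_MEAN:
        actions.append("run controlled experiments")
    if stage >= Stage.CHARACTERIZATION:
        actions.append("fine-tune / feedback control")
    if stage >= Stage.KNOW_WHY:
        actions.append("optimize and simulate")
    return actions
```

Note how the feasible actions are cumulative: each stage keeps everything the previous stage allowed and adds a qualitatively new capability.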
The stages of knowledge can be applied to diverse tasks and industries:
- High-tech manufacturing requires rapid learning about multiple variables in new products and processes. We can frame a definition in terms of the stage of knowledge: high-tech processes are those in which many of the important variables are at stage four or below. This makes the process difficult to control and work with, so a lot of effort goes into raising the knowledge level as quickly as possible. Because of customer and competitive pressures, no sooner is knowledge raised for one product than higher performance products are demanded, which brings in new low-stage variables. Thus managing in high-tech industries requires both rapid learning and the ability to manufacture with “immature” (low stage of knowledge) technologies.
- VLSI semiconductor design and fabrication processes are driven by the ability to reproduce very small features with high reliability at high volume. The process is very complex, with multiple layers and hundreds of variables potentially affecting each layer. As feature sizes get smaller with each new generation, new equipment is needed and new variables become important. These new variables start at low stages of knowledge. For example, as feature sizes go below one micron, heat dissipation problems begin to push designers to engineer chips for three volts instead of five volts. This has a number of advantages but requires many changes in both chip design and fabrication. As these changes are made, the variables that were at stage six or seven for the old process “regress” to stage five; engineers know how to control them, but don’t know their effects on the new process.
- Consumer marketing has made many strides toward higher stages of knowledge in the past thirty years. Many of the breakthroughs have been based on developing effective ways to measure variables (stage three). For example, bar-code scanners at supermarket checkouts have provided masses of disaggregated data about who is buying what, whether they use coupons, etc. Some stores are now using customer ID cards to match this data with information about individual households, their demographics, what TV commercials they received, and other environmental variables to allow development of stage six and seven models of the marketing mix’s effects on consumer behavior.9
- Professional services such as legal services run the range of knowledge stages. For example, preparing a will has reached stage six or even seven for many people, so that it can be done by a $30 software program. At the other extreme, high-profile criminal trials used to be at stage three or below. Recently, a number of law firms have attempted to move jury selection to stage six, using methods such as customized polling of population groups from which a particular jury will be drawn. Other aspects of trial strategy, presumably, remain at stage three or four; they can be measured but not controlled well. For example, an important type of “input” to litigation is judicial rulings on motions. Lawyers can use the judge’s ruling to measure whether the judge agrees with them on a motion, but they have only limited control over that decision (stage four). Pretrial aspects of litigation, on the other hand, are generally better understood.
- In strategic consulting, the Boston Consulting Group’s four-quadrant matrix (cash cows, dogs, stars, and question marks) was an attempt to reduce acquisition and divestiture decisions to two quantitative variables — market share and growth rate.10 It is possible to write equations that describe the effects of market share and growth rate on business unit profit, so these two variables are at stage six. But there are many other important variables that also influence the outcome and that are at much lower stages of knowledge. Many consulting firms claim knowledge about these other variables, but they perform strategic analysis using a heavy mix of expertise, implying an awareness that some of their knowledge is at a low stage.
Dynamic Evolution of Knowledge and Performance
Important variables are those that, in fact, have great economic implications for the process. Ideally, a company would like to have a high stage of knowledge about all the important variables and a low stage about all the variables that have negligible effects. But, instead, the organization is likely to know very little about some important variables, especially for immature processes. Conversely, it may have stage six knowledge about unimportant variables, such as the color of paint on the machine and the type of clothing workers wear. Of course, in certain processes, these variables may be important, but there may be little way to know this until you learn enough to bring them to a high stage. For example, paint inside a machine may affect process chemistry, paint outside a machine may affect worker morale, and worker clothing can affect contamination-sensitive processes.
One way to visualize overall technological knowledge is as a tree (see Figure 2). The trunk of the tree is Y, the process output that we want to control. The branches from the trunk are the variables that directly affect Y (x1, x2, …). Branching off from each of these are subvariables (x1.1, x1.2, …) that collectively determine x1, and so on, to any level of detail. The shading of each branch represents the organization’s stage of knowledge, with white (invisible) representing stage one and black representing stage seven. The thickness of the branch represents its importance. Every knowledge tree trails off into a haze of dimly seen but potentially important variables and eventually becomes invisible, because there are always some variables at a still finer level of detail whose existence is unrecognized.
As the tree illustrates, a single process has many variables that are inevitably at different stages of knowledge. As more is learned about part of the process, old variables are brought to higher stages, but new variables also emerge from the mists of ignorance. The process as a whole can do no better than the knowledge about its most important drivers. If even a few key variables are at low stages of knowledge, the process can be considered at a low stage of knowledge overall.
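The knowledge tree and the "weakest important link" rule can be sketched as a recursive structure. In this illustrative rendering (the variable names, importance weights, and cutoff are all assumptions), the overall stage of a process is the lowest stage among its important variables:

```python
# Sketch of the knowledge tree of Figure 2: each variable carries a
# stage of knowledge and an importance weight, plus subvariables.
# The rule that a process can do no better than the knowledge about
# its most important drivers is rendered as a floor over the tree.
# All names, weights, and the cutoff are illustrative assumptions.

class Variable:
    def __init__(self, name, stage, importance, children=()):
        self.name = name
        self.stage = stage            # 1..8
        self.importance = importance  # relative economic weight
        self.children = list(children)

def effective_stage(var, importance_cutoff=0.2):
    """Overall stage = lowest stage among important variables in the tree."""
    stages = [var.stage] if var.importance >= importance_cutoff else []
    for child in var.children:
        stages.append(effective_stage(child, importance_cutoff))
    return min(stages) if stages else 8  # unimportant subtrees don't bind

process = Variable("Y", stage=6, importance=1.0, children=[
    Variable("x1 oven temperature", stage=7, importance=0.8),
    Variable("x2 raw material purity", stage=4, importance=0.5, children=[
        Variable("x2.1 supplier batch", stage=2, importance=0.1),
    ]),
])
print(effective_stage(process))  # bound by x2 at stage 4
```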
Relationship to Theories of Organizational Learning
Experience in conducting a task generally leads to improvement, a concept formalized in the literature on learning curves.11 Most learning curve models skip the intermediate stages of causality and statistically link cumulative production directly to costs (see Figure 3, part A). But it is clear that how the production and learning processes are managed has a big impact on whether and how fast learning occurs.12 Indeed, the large literature on quality improvement concerns systematic learning methods for achieving more improvement in a shorter period of time. Thus learning can be a directed activity, not just a by-product of normal production. Part B in Figure 3 shows a more complete model of technological learning, with explicit recognition of knowledge.13
It is no coincidence that the knowledge tree of Figure 2 resembles causal trees like those used in quality improvement efforts.14 These trees, also called fishbone or Ishikawa diagrams, are often used as a way of listing potential causes of problems. A process engineer may have fifty variables (or corresponding problems) at stages two through four that are potentially important. Various methods can be used to guess which ones will turn out to be the most important.15 The stages of knowledge provide a way of mapping current knowledge and estimating how hard it will be to go further on particular variables. That is, they provide a detailed scorecard for process improvement efforts.
How to Manage at Each Stage of Knowledge
The knowledge stage of different process variables is important because it determines how to manage both the knowledge and the production process. The higher the stage of knowledge, the closer the process is to “science,” and the more formally it can be managed. Conversely, low-stage processes, such as creative endeavors, do not do well under formal management methods, and should be treated more as “art.”
One of the most basic system-design decisions is the degree of procedure. There are different ways of performing a given task, requiring different kinds of people, training, and tools. At one extreme is pure procedure, i.e., a completely specified set of rules about what to do under every possible set of circumstances. At the other extreme is something we can call pure expertise or pure art — a style of action in which every situation is dealt with as if it were new and unique. This requires experienced and skilled people who use their own judgment at each moment. These people have tacit knowledge, meaning that although they can carry out a task, they are not able to explain how they do it.
Managers can attempt to operate a process anywhere along the spectrum from pure expertise to pure procedure. The microprocessor has made it possible to execute very complex procedures at very low cost.16 But this does not mean that procedural approaches are always best. There is a natural relationship between degree of procedure and stage of knowledge (see Figure 4). For example, in order to automate a process, all key variables should be understood at least to stage six, and preferably to stage seven. If they are not, unanticipated problems will crop up frequently, and the system will not be able to deal with them effectively. Those portions of processes that are at low stages of knowledge should be done using a high degree of expertise and little automation. Locations above the diagonal in Figure 4 correspond to inexpensive but ineffective processes, which do not produce consistently good output.
Conversely, if a process or portion of a process is at a high stage of knowledge, it is inefficient to use lots of expertise to carry it out. An expertise-based process may still work (although people lose attentiveness in purely repetitive situations), but you will pay extra for experts who are not really needed. This is the area below the diagonal in Figure 4.
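The diagonal of Figure 4 can be read as a rough mapping from stage of knowledge to recommended degree of procedure. The sketch below is one illustrative rendering of that mapping and of the two off-diagonal failure modes; the cutoffs are assumptions, not values from the paper:

```python
# One illustrative reading of the Figure 4 diagonal: match the degree
# of procedure to the stage of knowledge. Stage cutoffs are assumed.

def recommended_approach(stage):
    """Map a variable's stage of knowledge (1-8) to a management style."""
    if stage <= 3:
        return "pure expertise (art): skilled judgment, little automation"
    if stage <= 5:
        return "mixed: procedures plus expert oversight"
    return "procedure (science): automate, codify, delegate control"

def off_diagonal_risk(stage, degree_of_procedure):
    """Flag the two failure modes on either side of the diagonal."""
    if degree_of_procedure == "high" and stage <= 5:
        return "above diagonal: automation without understanding"
    if degree_of_procedure == "low" and stage >= 6:
        return "below diagonal: paying for experts who are not needed"
    return "on diagonal"
```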
Why do companies find themselves off the diagonal of Figure 4? A common reason during the early 1980s was hubris: overoptimism about the firm’s knowledge of production processes and its associated ability to build, debug, and operate new factories. This led to numerous attempts to solve manufacturing competitiveness problems by automation, as exemplified by the slogan, “automate, emigrate, or evaporate.” When automation was undertaken without a solid base of process knowledge, the results were counterproductive: “The automation of a large, complex, poorly understood, conventional manufacturing process leads to a large, complex, poorly understood, unreliable, expensive, and automated manufacturing process.”17 Perhaps one of the most conspicuous and expensive examples of this syndrome was General Motors, which, in the early 1980s, invested approximately $40 billion to build a number of automated auto assembly plants, many of which never worked properly.
At the other, perhaps less common extreme are companies that use expensive labor to perform repetitive tasks, leading to inefficiency. Examples include information-based services such as manual letter sorting (e.g., the U.S. Postal Service) and routine telephone services (directory assistance). Although human judgment is very useful in these processes for handling exceptions, the bulk of the work is routine, well understood, and uses mainly the pattern recognition abilities of the human brain. Industries have taken several approaches to dealing with the resulting inefficiency, including heavy proceduralization of workers’ tasks, which risks dehumanizing the work and suppressing their expertise (e.g., United Parcel Service industrial engineering and automated monitoring of telephone operators), and finding ways of getting data into machine-readable form so that human operators do not have to keypunch it (optical character recognition and bar coding).
The degree of procedure is not the only managerial decision affected by the stage of knowledge. Methods of organizing, methods of problem solving, learning, and training, and many other aspects of the process should also be adjusted (see Table 2).
Yet, as shown in Figure 2, most processes have important variables at widely differing stages of knowledge. The ideal management style for the process as a whole is an uncomfortable hybrid. The traditional approach to this issue was to segregate work into different functional departments, which are then managed according to their own needs. A common example of this in traditional manufacturing companies is R&D (low stages) versus manufacturing functions (high stages, or so it was believed). This Taylorist approach has broken down in modern manufacturing, especially for technologies that are evolving rapidly, because the less mature portions of the process are inevitably at low stages of knowledge.18
There are at least two other approaches to this paradox. One is to use microprocessors (or other automation) to execute procedures, but with human oversight to select the appropriate program and to recognize unprogrammed contingencies and take control. Examples include accounting, continuous manufacturing processes such as paper mills, and commercial aviation. A final approach is to use low-skilled workers to execute the better understood tasks, with experts monitoring and directing them. The low-skilled workers may be apprentices to the experts or on a separate career track. For example, law offices use both junior associates (apprentices) and paralegals.
All three approaches have weaknesses. For example, it is difficult for pilots who merely monitor autopilots during long flights, without taking an active role themselves, to respond quickly and appropriately in emergencies.19 If lower skilled workers perform the better understood and therefore more procedural tasks, this can lead to excessive division of labor, poor coordination, and lost opportunities for learning. In addition, cultural conflict is a common result when an organization is split into sections operating at different stages of knowledge. Thus there is no ideal solution to the problem of working at multiple stages of knowledge, or if there is one, we don’t yet know it. Nonetheless, this situation is increasingly common.
A Simple Example of Knowledge Progression over Time
Knowledge increases through learning. Much learning is simply increasing the precision and accuracy of parameter estimates within a single stage, but sometimes learning shifts the knowledge to the next stage. To illustrate, using familiar technology, suppose you are baking cookies for the first time. You hope to make chocolate chip cookies, but have only a vague idea of a good recipe (raw materials) and procedure (control variables). You have a standard oven, which you were told to set at 350 degrees.20
The first step is to define your output measure, Y. It consists of a combination of taste, texture (hard or soft), and appearance.
Stage One — Complete Ignorance. You don’t even know what influences cookie characteristics, so when the results change, you consider it “random.”
Stage Two — Awareness. You rack your memory, observe others in the kitchen, and begin to build a list of possibly relevant input variables, including the list of ingredients, baking time, outdoor weather (rainy, cloudy, clear), time of day, amount and brand name of each ingredient, and a vaguely defined “mixing procedure.”
Stage Three — Learning to measure key variables. You use your watch to measure cooking time, measuring cups to measure raw materials, an outdoor thermometer and hygrometer for the weather, and a clock for the time of day. You have no detailed metric for mixing procedure, so you throw everything into one bowl and count strokes of the mixing spoon.
Stage Four — Control of the mean. You get a count-down timer and develop a procedure to take the cookies out of the oven after a set amount of time. You can control outdoor weather only crudely, by baking on days when the weather is of a particular type. You decide not to bother controlling for time of day since it does not seem to make any difference. Control of the ingredients is straightforward, using a standard measuring cup; that is, for the raw materials, stage three leads immediately to stage four.
Stage Five — Process capability and a recipe. You practice measuring ingredients until you can do it with 95 percent repeatability. You write down a set of instructions (recipe) that seems to produce “adequate” cookies. Your cookies now have a reasonably consistent taste, but texture and appearance are still variable and some cookies are burned.
Stage Six — Process characterization. You run a series of experiments on many variables, including baking time, baking temperature, mixing time, and the exact amounts of flour, sugar, and liquid ingredients. You discover the effects of a 10 percent change in each of these variables on the cookie characteristics. If a friend asks for a better baked cookie, you can now achieve it by varying either the time or the temperature. You discover that some variables, including weather and time of day, have no detectable effect on the output.
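The stage six experiments just described can be sketched as one-factor-at-a-time perturbations: vary a single input by 10 percent, hold the others at their nominal values, and record the effect on the output. The baking model below is a made-up stand-in, and all nominal values are assumptions:

```python
# Sketch of the stage-six baking experiments: +10% perturbations of
# one variable at a time. The quality model is a hypothetical toy.

NOMINAL = {"bake_minutes": 12, "oven_F": 350, "flour_cups": 2.25}

def cookie_quality(inputs):
    """Toy model: quality peaks at the nominal recipe, falls off linearly."""
    return 100 - sum(abs(inputs[k] - NOMINAL[k]) / NOMINAL[k] * 40
                     for k in NOMINAL)

def sensitivity(variable):
    """Effect on quality of a +10% change in one variable, others nominal."""
    perturbed = dict(NOMINAL)
    perturbed[variable] *= 1.10
    return cookie_quality(perturbed) - cookie_quality(NOMINAL)

for var in NOMINAL:
    print(var, round(sensitivity(var), 1))
```

A variable whose sensitivity comes out near zero (like weather or time of day in the text) is a candidate for dropping from further experiments.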
Stage Seven — Know why, including interactions among input variables. You go to the local university library and take out textbooks on baking, which give mathematical formulas for outcome variables such as sweetness and surface texture. You calibrate those models using data from your own baking process. You can now produce a “near perfect” chocolate chip cookie. If someone asks for a healthier cookie (less sugar), you can produce it, and you know how much to adjust the baking temperature. Similarly, if you are in a hurry, you know how to increase the temperature and decrease the baking time without burning the cookies.
Repeat for secondary variables. Although you now have stage five control (a recipe) for about ten variables and a stage seven understanding (know why) of five of them, there will always be a host of secondary variables in your knowledge tree that have smaller effects. And there is no guarantee that you will learn about the most important variables first. For example, you may not realize that cookie size is important (stage two) until you are well into stage five for other variables. You can subject these additional variables to the same progression through the stages of knowledge. Variables include the brand and characteristics of raw materials (butter versus margarine versus inexpensive margarine, types of flour), the importance of sifting dry ingredients together before mixing, type of baking tray (aluminum versus glass versus iron), and use of a scale instead of measuring cups for more accurate measurement of raw materials. For casual baking, you would never bother to learn about some of these variables, but if you wanted to reduce costs or improve consistency, you would have to delve much deeper into these secondary variables.
Stage Eight — Complete knowledge. Since there is an infinitude of potential secondary variables, you can never have complete knowledge of the cookie-making process.21 But for practical purposes, you can say that you have reached stage eight when you have a model that will predict output (cookie) characteristics to an accuracy of one-tenth of the tolerance band, for changes in inputs across a 2:1 range, and including all interactions.
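The stage-eight criterion can be stated as a concrete test. In this hypothetical sketch, a model passes if its predictions stay within one-tenth of the tolerance band at both ends of a 2:1 input range; a model that misses the time-temperature interaction fails:

```python
# Stage-eight criterion as a test: predictions must fall within one-tenth
# of the tolerance band across a 2:1 range of the inputs. Both the "true"
# process and the candidate models here are hypothetical.

def meets_stage_eight(model, process, inputs, tolerance):
    """Check the model at the endpoints of a 2:1 range of the inputs."""
    limit = tolerance / 10.0                  # one-tenth of tolerance band
    for point in (inputs, [2.0 * v for v in inputs]):
        if abs(model(*point) - process(*point)) > limit:
            return False
    return True

process = lambda t, temp: 0.01 * t * temp      # "true" process behavior
good_model = lambda t, temp: 0.01 * t * temp   # captures the interaction
bad_model = lambda t, temp: 3.5 * t            # ignores temperature

ok = meets_stage_eight(good_model, process, [10.0, 350.0], tolerance=5.0)
bad = meets_stage_eight(bad_model, process, [10.0, 350.0], tolerance=5.0)
```

The `bad_model` matches the process at the baseline point but diverges badly when both inputs double, which is exactly the kind of hidden interaction the stage-eight criterion is meant to expose.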
Amateurs may stop when they have stage five knowledge about the primary variables that affect taste. They can then bake decent cookies and throw away batches ruined by low knowledge about secondary variables. But professional bakeries must track down additional secondary variables, especially those that influence costs. Here is a description of the situation at one famous baking company:
Since early this decade, Nabisco has been worried about its bakery technology, which, according to a 1981 study, had fallen far behind that of even some tiny rivals. . . . The biscuit company, to this day, uses a lot of equipment made decades ago at Nabisco’s former Evanston, Illinois, machine shop.
And to this day, baking at Nabisco remains something of an art. Oreos have uneven swaths of cream filling. The exact number of Ritz crackers in a box is anybody’s guess. Some 5 percent to 7 percent of Nabisco’s cookies and crackers emerge from its ovens broken.
Similarly, the company still has poor inspection methods for the tons of commodities it purchases, such as flour and cocoa, according to a former executive of the baking unit. The bakers must repeatedly test-bake batches of cookies and crackers to adjust ovens and other gear to slight variations in commodity composition. [In our terms, they had stage four knowledge of raw materials and were attempting to compensate for it by using stage six knowledge about how to adjust the ovens.] Such trial-and-error methods make quality control, among other things, difficult.
So, sixteen months ago, . . . the company planned to spend some $1.6 billion on complete retrofitting of four existing bakeries and close five other plants.
The plan called for a microchip revolution in Nabisco’s bakeries. At least one-third of the project’s cost was to be for the purchase of computerized weighing, mixing, packaging, and process-control equipment, says a senior Nabisco manufacturing engineer who recently resigned.
Such high-tech gear would eventually halve the company’s 8 percent “give-away” rate — the overweight amount in an average package of Nabisco biscuits — and sharply reduce its 5 percent to 7 percent breakage.22
Nabisco’s automation will be most effective with stage seven knowledge (know why) about all of the key variables. It is possible that Nabisco’s equipment vendors sell machines that already embody that knowledge, but it is likely that some of it (including the specific variables uniquely affecting Nabisco’s cookies) would have to be developed as part of the automation program.
Applying the Stages of Knowledge
Now that we have a framework for measuring and understanding technological knowledge, we can look at some principles for managing knowledge to improve production processes.
Understand How Much You Know and Don’t Know
In order to understand how much you already know about a process, you need to ask a number of questions:
- What are the important variables for the process?
- At what stages are these variables? Which variables in the process would give the most leverage if you could get them to a higher stage?
- How can you manage the process well at these stages of knowledge? What limits and opportunities does the process impose? Are your management methods consistent with knowledge levels (Figure 4 and Table 2)? How should you handle the inevitable variables that you know less about yet are still important?
- How can you learn to reach higher stages of knowledge?
You also need to beware of what you think you know about a process that you really don’t. One of the most painful forms of ignorance is false knowledge. If your company believes that it has stage six or higher knowledge about a variable, but in fact that knowledge is based on past experience and is incorrect for the present process, you will operate the process in an inferior way. A common version of this is the belief that “variable x does not matter.” It may not have mattered ten years ago because of a small contribution to process variance. But what was considered small ten years ago may be quite important today. A newer competitor, unburdened with this false knowledge, can control or change the level of x to get superior quality or lower cost.
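One way to re-test a "variable x does not matter" belief is to compare x's share of output variance in old versus current process data. The data below are hypothetical; the computation is an ordinary R² from a linear fit:

```python
# Re-testing a "variable x does not matter" belief: estimate x's share of
# output variance in old and current data. All data here are hypothetical.

def variance_share(xs, ys):
    """Fraction of variance in ys explained by a linear fit on xs (R^2)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy * sxy / (sxx * syy)

# Ten years ago x barely moved the output; in today's process it dominates.
old_x, old_y = [1, 2, 3, 4], [10.0, 10.1, 9.9, 10.05]
new_x, new_y = [1, 2, 3, 4], [10.0, 11.0, 12.1, 12.9]
old_share = variance_share(old_x, old_y)
new_share = variance_share(new_x, new_y)
```

A large gap between `old_share` and `new_share` is the signal that yesterday's "x does not matter" has become today's false knowledge.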
The countermeasure for this problem is to realize that as your company’s process changes, its effective knowledge regresses to earlier stages. In particular, stage six knowledge, which is generally derived by empirical observation, often regresses to stage five for a new process. You still know how to measure and control the variable, but you no longer know its true impact.
Understand and Manage the Locations of Knowledge
Knowing where knowledge resides for the process you are managing is important for effectively managing and using that knowledge. It has implications for accessibility, transmission to new locations, and ability to extend the knowledge, among other things. Technological knowledge may be located in people’s heads, in word of mouth and other informal mechanisms, in formal procedure sheets for operators, handbooks, and other written documentation, or embodied in machinery, firmware, and software. For each location, ask: How well is the knowledge documented? How easy is it to change? How much do users know about how to use it?
As I have discussed, the feasible and desirable locations of knowledge depend on its stage. There are also broader issues surrounding more general forms of organizational memory.23
Be Wary of Deskilling the Workforce and Freezing Processes
The Taylorist model of manufacturing, as it is commonly applied, moves technological knowledge about the process away from line workers and puts it in the heads of staff engineers. These engineers will be less available when problems come up, or they may leave the company. If workers do not understand the process, they cannot handle unanticipated situations, nor can they do much to improve the process, even if they are motivated. Therefore, one of the revolutionary effects of the total quality management movement has been to return knowledge to the workers and make them capable of doing process improvement in small groups, without relying on the traditional staff experts.
Even if you fully understand a process today, the world will change in a few years. Some of your current knowledge will be obsolete, and it will be important to reevaluate it. Once a firm assumes, for whatever reason, that it has nothing more to learn about a production process, it tends to “lock in” the present production methods by specifying rigid procedures that can deskill the workforce and cut back on product and process engineering. A firm may use time and motion studies to find the “one best way” to produce and lose interest in root cause analysis.24 While this may work well in the short run, five years from now the company may find competitors making superior products at two-thirds its cost.
For example, Jaikumar compared the development and use of flexible manufacturing systems (FMS) in the United States and in Japan.25 He found that the U.S. systems had been developed with overly ambitious goals for flexibility, up-time, labor use, etc. These goals were not achieved by the initial designs; the knowledge base was not adequate to make them possible. Yet the projects were often declared complete, and workers with much lower skills were brought in to run the FMS. The result was that the users were afraid to experiment and learn about the systems, and the systems were in fact used in a very inflexible way. In contrast, in the successful Japanese systems, the original developers stayed with the system for the first year or more of operation, and continued to improve it during that time. The result was systems that were very flexible and robust enough to run unattended.
Learn Carefully and Systematically
As we have seen, different stages of knowledge require very different methods of learning. For example, Chew and others recommend sequential use of four different methods of learning about problems that occur during the installation of new technology:
- Vicarious learning — learning from other organizations with similar situations.
- Simulation — building a model of your process and experimenting with the model.
- Prototyping — taking a subset of your process and using it for testing and refining.
- On-line learning — experimenting systematically on the full process.26
Many organizations become proficient at only one or a few methods of learning, which makes it difficult for them to deal with variables that are at different stages of knowledge.27 For example, many plants avoid the use of pilot lines and simulators to pretest process changes.
Conclusion
Lord Kelvin, in the 1890s, commented on the value of knowledge:
When you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot measure it, when you cannot express it in numbers, your knowledge is of a meager and unsatisfactory kind: it may be the beginning of knowledge, but you have scarcely, in your thoughts, advanced to the stage of science.
In terms of my framework, Kelvin was advocating the value of stage three knowledge (measure) over stage two knowledge (awareness). As I have shown, being able to measure is only the beginning; the stages of knowledge beyond stage three (control, capability, characterization, and know why) give additional power and economic value to a company’s processes. The stages-of-knowledge framework provides powerful leverage for process improvement efforts and guidance on how to manage at each stage. A company can make explicit decisions about which portions of the knowledge tree to pursue most vigorously.
For example, a high-volume, forty-year-old, continuous process was controlled using incremental extensions of the original sensors. These operated on a time scale from seconds to hours. A consultant recognized that the company did not have knowledge of the variables at time scales below a second. Once it learned how to measure events in the millisecond range, a large new subtree of variables became visible. By learning about these variables and their implications for the process, the process engineers were able to reduce quality problems by a factor of three within a few months. Development and exploitation of the new variables continues today.
References
1. I. Nonaka, “The Knowledge-Creating Company,” Harvard Business Review, November–December 1991, pp. 96–104.
2. Peter Drucker has commented, “In fact, knowledge is the only meaningful resource today. The traditional ‘factors of production’ have not disappeared, but they have become secondary.” See:
P.F. Drucker, Post-Capitalist Society (New York: Harper Business, 1993), p. 42.
3. Harlan Cleveland distinguishes data, information, knowledge, and wisdom. However, he then intermixes the four concepts. See:
H. Cleveland, “The Knowledge Dynamic,” The Knowledge Executive (New York: Truman Talley Books, 1985).
4. R. Glazer, “Marketing in an Information-Intensive Environment: Strategic Implications of Knowledge as an Asset,” Journal of Marketing 55 (1991): 1–19.
5. R. Jaikumar, “From Filing and Fitting to Flexible Manufacturing: A Study in the Evolution of Process Control” (Boston: Harvard Business School, working paper, 1988); and
A.S. Mukherjee, “The Effective Management of Organizational Learning and Process Control” (Boston: Harvard Business School, doctoral dissertation, 1992).
6. R.E. Bohn and R. Jaikumar, “The Structure of Technological Knowledge in Manufacturing” (Boston: Harvard Business School, working paper 93–035, 1992); and
R.E. Bohn and R. Jaikumar, “The Development of Intelligent Systems for Industrial Use: An Empirical Investigation,” in Research on Technological Innovation, Management and Policy, ed. R.S. Rosenbloom (London and Greenwich, Connecticut: JAI Press, 1986), pp. 213–262.
7. This formalism is pursued in Bohn and Jaikumar (1992).
8. ∂f/∂xi in a local region.
9. Glazer (1991); and
N.R. Kleinfield, “Targeting the Grocery Shopper,” New York Times, 26 May 1991.
10. J.A. Seeger, “Reversing the Images of BCG’s Growth/Share Matrix,” Strategic Management Journal 5 (1984): 93–97.
11. J. Dutton and A. Thomas, “Treating Progress Functions as a Managerial Opportunity,” Academy of Management Review 9 (1984): 235–247.
12. P.S. Adler and K.B. Clark, “Behind the Learning Curve: A Sketch of the Learning Process,” Management Science 37 (1991): 267–281.
13. R. Jaikumar and R.E. Bohn, “A Dynamic Approach to Operations Management: An Alternative to Static Optimization,” International Journal of Production Economics 27 (1992): 265–282.
14. J.M. Juran and F.M. Gryna, eds., Juran’s Quality Control Handbook (New York: McGraw-Hill, 1988), Chapter 22.
15. These methods include Pareto charts, use of analogies to similar but better understood processes, screening experiments, and other methods discussed in the quality control literature. Notice that screening experiments are possible only if the variable is already at stage four or higher.
16. G.V. Shirley and R. Jaikumar, “Turing Machines and Gutenberg Technologies: The Post-Industrial Marriage,” ASME Manufacturing Review 1 (1988): 36–43.
17. J. Flanagan, “GM Saga a Lesson for America,” Los Angeles Times, 27 October 1992, p. A1.
18. Bohn and Jaikumar (1992).
19. K.E. Weick, “Organizational Culture as a Source of High Reliability,” California Management Review, Winter 1987, pp. 112–127.
20. Experienced bakers will realize that the following account is highly simplified. A case simulation of some of the following issues is provided in:
R.E. Bohn, “Kristen’s Cookie Company (B)” (Boston: Harvard Business School, Case 9-686-015, 1986).
21. For example, eggs, flour, and chocolate are relatively complex agricultural products, of imperfect consistency over time.
22. P. Waldman, “Change of Pace: New RJR Chief Faces a Daunting Challenge at Debt-Heavy Firm,” Wall Street Journal, 14 March 1989.
23. J.P. Walsh and G.R. Ungson, “Organizational Memory,” Academy of Management Review 16 (1991): 57–91.
24. Bohn and Jaikumar (1992).
25. R. Jaikumar, “Postindustrial Manufacturing,” Harvard Business Review, November–December 1986, pp. 69–76.
26. W.B. Chew, D. Leonard-Barton, and R.E. Bohn, “Beating Murphy’s Law,” Sloan Management Review, Spring 1991, pp. 5–16.
27. Learning is obviously of central importance in knowledge-based competition, but detailed analysis is beyond the scope of this paper. A very interesting study of how machine developers become aware of new variables (stage two) through field use is provided by:
E. von Hippel and M. Tyre, “How Learning by Doing Is Done: Problem Identification in Novel Process Equipment,” Research Policy, forthcoming.
For a description of how one company manages learning as an integral part of the manufacturing process, see:
D. Leonard-Barton, “The Factory as a Learning Laboratory,” Sloan Management Review, Fall 1992, pp. 23–38.
For a discussion of the characteristics of organizations that learn successfully, see:
D.A. Garvin, “Building a Learning Organization,” Harvard Business Review, July–August 1993, pp. 78–91.
For a general typology of methods of technological learning, see:
R.E. Bohn, “Learning by Experimentation in Manufacturing” (Boston: Harvard Business School, working paper 88–001, 1987).