Determining the life-cycle of a product or service is important because the next generations should be identified in advance so that they can be planned for at design conception. Modular design, among many other practices, helps prepare the system for the incremental step improvements along that product or service's S-curve. The multi-generation product plan (MGPP) benefits from technology forecasting and better prepares the organization for the strategic and tactical resources required to improve that particular portfolio item. A systematic method of predicting system evolution is found in the TRIZ methodology. Maturity Mapping and the Patterns of Evolution (each pattern is decomposed into lines of evolution, which are more detailed directions within each pattern) are a powerful combination that can be used to identify the next generations or versions of a product or service on its life-cycle curve, which are then incorporated into the MGPP.
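To make the S-curve idea concrete, here is a minimal sketch, assuming a logistic form for the life-cycle curve; the parameter values and the 80% planning trigger are illustrative placeholders of my own, not part of any MGPP standard:

```python
import math

def s_curve(t, limit=100.0, rate=0.8, midpoint=5.0):
    """Illustrative logistic S-curve: performance of one product
    generation over time t, saturating at `limit`."""
    return limit / (1.0 + math.exp(-rate * (t - midpoint)))

# A common heuristic is to plan the next generation while the
# current one is still maturing, e.g. at ~80% of its limit.
for year in range(11):
    p = s_curve(year)
    note = "  <- start planning next generation" if p >= 80.0 else ""
    print(f"year {year:2d}: performance {p:5.1f}{note}")
```

The trigger threshold is only a placeholder; the point is that a fitted curve gives the MGPP a quantitative cue for when the next generation must be ready.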
There are eight main patterns of evolution in the TRIZ methodology.
Systems evolve according to specific principles that have been derived by decomposing the historical progression of many products and services. One such principle is the Mono-Bi-Poly pattern of evolution (I also call it the Reduction Principle). The Reduction Principle may be used to identify the next generations of a particular system. This principle states that a system provides a function or a set of functions to a customer. The type and number of functions may be modulated over time to improve the product or service.
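As a rough illustration of the classification this principle uses (the class and field names below are my own shorthand, not TRIZ terminology), a system can be tagged by how many functions it delivers and whether its combined subsystems are of the same kind:

```python
from dataclasses import dataclass, field

@dataclass
class System:
    """Toy model of the Mono-Bi-Poly classification."""
    name: str
    functions: list = field(default_factory=list)
    homogeneous: bool = True  # are the combined subsystems alike?

    def classification(self) -> str:
        n = len(self.functions)
        count = "mono" if n == 1 else "bi" if n == 2 else "poly"
        kind = "homogeneous" if self.homogeneous else "heterogeneous"
        return f"{kind} {count}-system"

plain_pencil = System("pencil", ["write"])
pencil_eraser = System("pencil + eraser", ["write", "erase"], homogeneous=False)
print(plain_pencil.classification())   # homogeneous mono-system
print(pencil_eraser.classification())  # heterogeneous bi-system
```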
Within the TRIZ world, another way of looking at this pattern of evolution (increasing complexity, then simplicity) is in terms of hybridization. As a system gets more functionality and gets more complex, it splinters into more and different parts. But over time, the added functionality collapses or hybridizes back into a simpler design.
You see this happen in manufacturing repeatedly as systems add part counts during the period of increasing functionality, then reduce part counts as designs are simplified to provide the same functionality.
Take a pencil for example. In the beginning it was just a piece of wood with a length of black lead. In the language of a TRIZ practitioner who understands the dynamic progression of hybridization, this simple writing system is a “homogeneous mono-system.” Then when someone added an eraser to the pencil, it transformed from a homogeneous mono-system to a “heterogeneous bi-system.” That is, the pencil performed two different functions, writing and erasing, within the same system.
After more time, mechanical pencils could incorporate different colors of lead so you could write in black, red, green, and blue — using the same instrument. In TRIZ language, you can say that the pencil with the eraser became a heterogeneous poly-system. It performs more than one function by adding more parts and complexity.
But now you have a multi-colored lead pencil that collapses the colors back into one length of lead which, depending on the angle at which it is held, writes in different colors. With this advancement, you have a new heterogeneous mono-system that writes all the different colors with nearly the simplicity of writing with one color, while still providing the eraser, of course.
So with the pencil, you see a very simple illustration of how systems evolve toward increasing complexity and added functionality, and then toward increasing simplicity with no erosion of function. The evolutionary principle of hybridization is universal — meaning it has been validated over and over again. Naturally, in the drive to be better, you try to take the best of one system or technology and mix it with the best of another to get the best of both. This is the perennial drive of the fittest, and it won’t be denied.
Another example of hybridization is the work of biologists who engineer the best properties of one system into the best properties of another, while simultaneously canceling the drawbacks of each as they relate to the objective at hand. An old but good example is the cross-breeding of two different plant seeds, one that survives in dry climates, the other that survives in cold places. Although neither could live in a cold, dry place, the hybrid can.
So because of intentional hybridization, the plant is more robust to temperature and moisture, and you can use this principle to explain what happened next in the progression of quality and TQM. After the practice of statistical quality control (SQC) became solidified as a viable means for improving and controlling manufacturing processes, why couldn't it be applied to other processes as well — in procurement, administrative departments, distribution, and marketing?
Why can't the seed grow in more than one climate? Why can't the pencil write in blue and green and not just black? Why can't the principles and practices of quality control be applied outside of manufacturing?
By the force of evolution, SQC expanded and diversified into all departments and functions of an organization, and the Japanese drove it there until Armand Feigenbaum coined the term Total Quality Control, or TQC. The homogeneous mono-system (SQC) became a homogeneous poly-system (TQC) as it expanded the function of quality improvement and control to everything an organization does.
Not incidentally, around this same time, when the industrial economy was growing rapidly across the globe, others were working to develop many other aspects of what you now take for granted as the underpinnings of business success.
In Japan in the 1950s, the forefathers of Lean Manufacturing were pioneering the methods of flow, waste reduction, inventory control, and operational speed. In Russia, a team of engineers was developing the empirical basis for product, process, and organizational innovation. Also in Japan, others were developing Hoshin Kanri methods, which quantitatively connect the functions and processes of an organization around strategic priorities.
Fast forward to the 1970s, 80s, and 90s, when certain families of management tools collapsed into themselves, forming simpler and more integrated versions of formerly fragmented systems. Tracing the development of TQM, you arrive at a point in the United States when all the tools of SQC were packaged together, for ease of deployment and application, into a set of standards known as the Baldrige criteria. This was the defining moment when the system known as TQM became a big tool itself: a homogeneous poly-system that reduced defects and variation and improved the quality of products and services with a focus on customer needs.
Still later, the components of TQM were dovetailed with other key systems and practices, such as David Norton and Robert Kaplan's Balanced Scorecard. After all, what good is quality improvement if you can't trace its impact? It was around this time that Motorola began driving hard with a methodology called Six Sigma, which had its beginnings as a big hammer for pounding the nails of product quality down to no more than 3.4 defects per million opportunities at the quality characteristic level.
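For readers who want the arithmetic behind that figure, here is a minimal sketch, assuming the conventional 1.5-sigma shift used in the Six Sigma literature, that converts defects per million opportunities into a sigma level:

```python
from statistics import NormalDist

def dpmo(defects: int, units: int, opportunities_per_unit: int) -> float:
    """Defects per million opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

def sigma_level(dpmo_value: float, shift: float = 1.5) -> float:
    """Short-term sigma level, assuming the customary 1.5-sigma shift."""
    return NormalDist().inv_cdf(1 - dpmo_value / 1_000_000) + shift

print(round(sigma_level(3.4), 2))  # ~6.0, the Six Sigma target
print(round(dpmo(12, 1_000, 5)))   # 12 defects in 5,000 chances -> 2400
```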
After some evolutionary momentum, Six Sigma extended its data-driven reach to focus on creating significant financial return, first in the form of cost reductions born of process improvements, and later in the form of growth by its application in sales and marketing. In addition, Six Sigma injected the agenda of quality into the top executive level of corporations, materializing TQM’s former lip service to top management involvement.
With a connected system of performance metrics, hard accountability at the executive level, top- and bottom-line impact, and large-scale deployment, Six Sigma achieved the dream of TQM and became the world-class mono-system for performance capability.
But for all this, Six Sigma is still just an extension of the quality movement, with new functionality first made more complex but now made simpler and more commoditized through programmed deployment designs, e-learning, and other software aids and technologies. The evolutionary trend of “complex to simple” was augmented with the evolutionary trend of “decreased human involvement,” another TRIZ tenet, and you arrive today at a place where Six Sigma can be implemented in an organization with less effort and greater return on investment than ever before.
David Silverstein, Neil DeCarlo, and Michael S. Slocum, Insourcing Innovation, Taylor & Francis, to be published November 2007.
December 19, 2008
Innovation has always been thought of as the creation of an idea or concept that is novel: the pursuit of a new way to perform a required function. The search for what was previously undiscovered has been the goal of ideation activities for as far back as can be remembered, and it has fueled ingenuity, but that perspective is changing. The way people innovate is changing as the field evolves. Innovation has historically been practiced as an art, meaning smart people applying black-box thought to generate solutions to problems. There was no recognizable or algorithmic approach to the field, and the concept of innate innovativeness was promulgated. The Theory of Innovation Scarcity was formed on this erroneous presumption that you were either born innovative or you weren't. It is now understood that the practice of ideation can be reduced to a set of principles, which may embody sets of algorithms that describe the innovation function. Scientific components are being applied to the field, complementing the existing artistic components. This is bringing repeatability, predictability, and reliability to a field where fiat and eureka moments ruled the day (or didn't). The reduction of innovation to an exact science (or at least progression in that direction) has produced methods that are trainable, and this has been validated across many organizations. The innovative capability of each person who is exposed to these methods is increased, and this universal increase in the organization's innovation quotient has profound implications: it is the father of the Theory of Innovation Plenty.

The impact that innovation has on the world has changed dramatically as well. The impact of innovation is now global and capable of creating a multitude of sub-innovations. The discovery of scientific phenomena has been the driving force for the creation of entire fields of commerce and innovation. The internet, for example, has spawned countless international pockets of innovation and invention. The field of e-commerce, with all the innovations created to match sellers and buyers, is one example; advances in transaction security are another set of inventions derived since the advent of the internet.

The pace of change has increased as well. Societal (VOS) and customer (VOC) needs are changing faster than ever before in history, and this demands that businesses respond faster as well. The increase in the rate of change in societal needs and the changes in the field of innovation force us to look at discovery in a new light.
Some have made the observation that all that can be invented has been invented. This is not my position, although you could argue that the number of high-level inventions has diminished over time, with the overwhelming majority of patents issued being on a significantly lower innovative level. Johann von Goethe observed that innovation has changed such that the solution to your problem may already exist, and therefore your task as the problem solver is to find those pre-existing solutions and ADAPT them to suit your purpose(s). This makes for a different approach to the search for an innovative solution: it isn't novelty that is pursued, but analogy. That is why the discussions about non-linear problem solving are so important. The non-linear approach takes the problem solver from the specific to the generic, and this generalization is the key to adaptive innovation. Adaptive innovation is the key to an organization's ability to respond quickly to the needs of society and the customer. I call this approach to innovation, which leverages the ability to adapt an existing solution to fit the purpose, Adaptivation. Adaptivation should be pursued as part of any problem solving activity. This is possible because of the work that has been achieved in the field, where patents and invention disclosures have been analyzed in order to catalogue the efforts of previous problem solvers. These efforts are made available to the problem solver primarily through the field of TRIZ (the Theory of Inventive Problem Solving). TRIZ is much more than an Adaptivation methodology; however, this aspect of TRIZ will be the focus of this discussion.
TRIZ as Adaptive
A fundamental aspect of the TRIZ methodology is the analysis of previous problem solving records. These records are invention disclosures, patent records, and historical observations classified as heuristics. Observations were made concerning each specific problem and its specific solution, and analysis affinitized the specific problems and specific solutions into abstract groups. These abstract groups become usable when your specific problem is converted to an abstract problem that is analogous to one of the abstract problems from the researched group. A matrix then provides the appropriate solution concepts from the solution group. This empirical approach allows the problem solver to practice systematic adaptation. It reduces innovation process cycle-time and prevents the generation of previously discovered solutions, thereby eliminating the waste of duplication from innovation. This increases both the efficiency and the effectiveness of the process, and it yields improvements in ideation that bring the organization's response time onto the same scale as the rate at which societal and customer needs evolve. This process doesn't preclude the search for the novel but is used in conjunction with it. The non-linear approach is both convergent and divergent and therefore appeals to the different problem solving styles present in the organization. Adaptivation will allow the organization to increase performance across several problem solving metrics and integrate the brilliant work of past problem solvers into your organization. The process is relatively easy to learn, and competency can be established in a few months.
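To show the shape of that specific-to-abstract-to-specific workflow, here is a toy sketch; the parameter pairs and principle entries below are placeholders of my own, not actual cells from the TRIZ contradiction matrix (the real matrix relates 39 engineering parameters and points into 40 inventive principles):

```python
# Placeholder matrix: (improving, worsening) -> inventive principles.
MATRIX = {
    ("weight", "strength"): ["segmentation", "composite materials"],
    ("speed", "accuracy"): ["preliminary action", "feedback"],
}

def adaptivate(specific_problem: str, improving: str, worsening: str):
    """Abstract the problem to a contradiction between two generic
    parameters, look up solution concepts that past problem solvers
    used, and return them as starting points for adaptation."""
    principles = MATRIX.get((improving, worsening), [])
    return [f"{specific_problem}: consider '{p}'" for p in principles]

for hint in adaptivate("lighter bicycle frame", "weight", "strength"):
    print(hint)
```

The value of the lookup is exactly what the paragraph above describes: the solver never starts from a blank page, but from concepts already proven on analogous contradictions.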
Adaptivation needs to become a key competency in the Preservation strategy of an organization, and it should also be the first step in the Evolution strategy. This allows for an ambidextrous application of Adaptivation across the organization. Streamlining the demands placed on limited resources is also a step in the right direction as part of the Third Wave (the systematization of the field of innovation). Doing more with the resources you have is an ideal solution, and leveraging the pre-existing work of others expands the corporate intellect. This is a robust method of expanding the search space during problem solving. The expansion of the search space integrates another key component necessary in today's evolving competitive environment: Open Innovation.
Adaptivation as an Open Approach
Adaptivation provides insight into solutions to analogous problems from industries, technologies, and even scientific fields that may not traditionally overlap with yours. This is an open approach to solution generation, and it provides additional benefits to the problem solver. Not only are pre-existing approaches considered, but the search space for these solutions is considerably larger than the organization's typical search space. Typically the organization would look inward at existing patents and competitive intelligence, or rely on team members to create the solution to the problem based on previous experience. Collectively this describes the closed approach to innovation, whose narrow focus is no longer an acceptable ideation approach given the evolving natures of competition and innovation.