Determining the life-cycle of a product or service is important because the next generations should be identified in advance so that they may be planned for at design conception. Modular design, among many other practices, helps prepare the system for the incremental step improvements along that product or service S-curve. The multi-generation product plan (MGPP) benefits from technology forecasting and better prepares the organization for the strategic and tactical resources required to improve that particular portfolio item. A systematic method of predicting system evolution is found in the TRIZ methodology. Maturity Mapping and the Patterns of Evolution (each pattern is decomposed into lines of evolution, which are more detailed directions within that pattern) are a powerful combination that may be used to identify the next generations or versions of a product or service on its life-cycle curve, which can then be incorporated into the MGPP.
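Maturity mapping typically treats the cumulative performance of a technology over its life cycle as an S-curve, and the logistic function is one common way to model it. The following is a minimal sketch of that idea; the function name and parameter values are illustrative assumptions, not taken from the text:

```python
import math

def s_curve(t, limit=100.0, midpoint=5.0, rate=1.0):
    """Logistic S-curve: performance approaches `limit` as the
    technology matures; growth is fastest at `midpoint`."""
    return limit / (1.0 + math.exp(-rate * (t - midpoint)))

# Early in the life cycle performance is low, growth then accelerates,
# and it finally saturates -- the signal to plan the next generation.
for t in (1, 5, 9):
    print(round(s_curve(t), 1))
```

The flattening of the curve near `limit` is what the MGPP anticipates: by the time one generation saturates, the next generation's curve should already be under development.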
There are eight main patterns of evolution from the TRIZ methodology.
Systems evolve according to specific principles that have been derived by decomposing the historical progression of many products and services. One such principle is the Mono-Bi-Poly pattern of evolution (I also call it the Reduction Principle). The Reduction Principle may be used to identify the next generations of a particular system. This principle states that a system provides a function or a set of functions to a customer. The type and number of functions may be modulated over time to improve the product or service.
Within the TRIZ world, another way of looking at this pattern of evolution (increasing complexity, then simplicity) is in terms of hybridization. As a system gets more functionality and gets more complex, it splinters into more and different parts. But over time, the added functionality collapses or hybridizes back into a simpler design.
You see this happen in manufacturing repeatedly as systems add part counts during the period of increasing functionality, then reduce part counts as designs are simplified to provide the same functionality.
Take a pencil for example. In the beginning it was just a piece of wood with a length of black lead. In the language of a TRIZ practitioner who understands the dynamic progression of hybridization, this simple writing system is a “homogeneous mono-system.” Then when someone added an eraser to the pencil, it transformed from a homogeneous mono-system to a “heterogeneous bi-system.” That is, the pencil performed two different functions, writing and erasing, within the same system.
After more time, mechanical pencils could incorporate different colors of lead so you could write in black, red, green, and blue — using the same instrument. In TRIZ language, you can say that the pencil with the eraser became a heterogeneous poly-system. It performs more than one function by adding more parts and complexity.
But now you have a multi-colored lead pencil that collapses the colors back into one length of lead which, depending on the angle it is held, will write different colors. With this advancement, you now have a new heterogeneous mono-system that writes all the different colors with nearly the simplicity of writing with one color, while providing the eraser, of course.
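The mono/bi/poly and homogeneous/heterogeneous labels used above can be read as a simple classification over the functions a system delivers. Here is a toy sketch of that reading; the naming scheme is an illustration, not a formal TRIZ definition:

```python
def classify(functions):
    """Label a system by the number and diversity of its functions.

    Mono/bi/poly reflects how many functions the system bundles;
    homogeneous vs. heterogeneous reflects whether those functions
    are all the same or different.
    """
    n = len(functions)
    arity = {1: "mono", 2: "bi"}.get(n, "poly")
    kind = "homogeneous" if len(set(functions)) == 1 else "heterogeneous"
    return f"{kind} {arity}-system"

print(classify(["write"]))                    # the plain pencil
print(classify(["write", "erase"]))           # pencil with an eraser
print(classify(["write", "write", "write"]))  # a bundle of identical pencils
```

Under this reading, the plain pencil is a homogeneous mono-system, the pencil with an eraser is a heterogeneous bi-system, and the multi-color mechanical pencil is a heterogeneous poly-system, matching the progression described above.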
So with the pencil, you see a very simple illustration of how systems evolve toward increasing complexity and added functionality, and then toward increasing simplicity with no erosion of function. The evolutionary principle of hybridization is universal — meaning it has been validated over and over again. Naturally, in the drive to be better, you try to take the best of one system or technology and mix it with the best of another to get the best of both. This is the perennial drive of the fittest, and it won’t be denied.
Another example of hybridization is the work of biologists who engineer the best properties of one system into the best properties of another, while simultaneously canceling the drawbacks of each as they relate to the objective at hand. An old but good example is the cross-breeding of two different plant seeds, one that survives in dry climates, the other that survives in cold places. Although neither could live in a cold, dry place, the hybrid can.
So because of intentional hybridization, the plant is more robust to temperature and moisture, and you can use this principle to explain what happened next in the progression of quality and TQM. After the practice of SQC became solidified as a viable means for improving and controlling manufacturing processes, why couldn’t it be applied to other processes as well — in procurement, administrative departments, distribution, and marketing?
Why can’t the seed grow in more than one climate? Why can’t the pencil write in blue and green and not just black? Why can’t the principles and practices of quality control be applied outside of manufacturing?
By the force of evolution, SQC expanded and diversified into all departments and functions of an organization, and the Japanese drove it there until Armand Feigenbaum coined the term Total Quality Control, or TQC. The homogeneous mono-system (SQC) became a homogeneous poly-system (TQC) as it expanded the function of quality improvement and control to everything an organization does.
Not incidentally, around this same time when the industrial economy was growing rapidly across the globe, others were working to develop so many other aspects of what you now take for granted as the underpinnings of business success.
In Japan in the 1950s, the forefathers of Lean Manufacturing were pioneering the methods of flow, waste reduction, inventory control, and operational speed. In Russia, a team of engineers was developing the empirical basis for product, process, and organizational innovation. Also in Japan, others were developing Hoshin Kanri methods, which quantitatively connect the functions and processes of an organization around strategic priorities.
Fast forward to the 1970s, 80s, and 90s, when certain families of management tools collapsed into themselves, forming simpler and more integrated versions of formerly fragmented systems. By tracing the development of TQM, you arrive at a point in the United States when all the tools of SQC became packaged together for ease of deployment and application into a set of standards known as the Baldrige criteria. This was really the defining time when the system known as TQM became a big tool itself: a homogeneous poly-system that reduced defects and variation and improved the quality of products and services focused on customer needs.
Still later, the components of TQM were dovetailed with other key systems and practices, such as David Norton and Robert Kaplan’s Balanced Scorecard. After all, what good is quality improvement if you can’t trace its impact? It was around this time that Motorola began driving hard with a methodology called Six Sigma, which had its beginnings as a big hammer for pounding the nails of product quality to the point of no more than 3.4 defects per million opportunities at the quality characteristic level.
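The 3.4 figure follows from assuming a normally distributed quality characteristic running at six sigma short-term capability with the conventional 1.5-sigma long-term mean shift, leaving 4.5 sigma between the process mean and the nearest specification limit. A minimal sketch of that arithmetic, using only the standard library:

```python
import math

def dpmo(sigma_level, shift=1.5):
    """Defects per million opportunities for a one-sided normal tail,
    assuming the conventional long-term mean shift (default 1.5 sigma)."""
    z = sigma_level - shift                    # effective distance to the spec limit
    tail = 0.5 * math.erfc(z / math.sqrt(2))   # P(Z > z) for a standard normal
    return tail * 1_000_000

print(round(dpmo(6.0), 1))  # -> 3.4
```

Without the 1.5-sigma shift, the same calculation at six sigma yields roughly two defects per billion opportunities, which is why the shift convention matters to the quoted figure.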
After some evolutionary momentum, Six Sigma extended its data-driven reach to focus on creating significant financial return, first in the form of cost reductions born of process improvements, and later in the form of growth by its application in sales and marketing. In addition, Six Sigma injected the agenda of quality into the top executive level of corporations, materializing TQM’s former lip service to top management involvement.
With a connected system of performance metrics, hard accountability at the executive level, top- and bottom-line impact, and large-scale deployment, Six Sigma achieved the dream of TQM and became the world-class mono-system for performance capability.
But for all this, Six Sigma is still just an extension of the quality movement, with new functionality first made more complex but now made simpler and more commoditized through programmed deployment designs, e-learning, and other software aids and technologies. The evolutionary trend of “complex to simple” was augmented with the evolutionary trend of “decreased human involvement,” another TRIZ tenet, and you arrive today at a place where Six Sigma can be implemented in an organization with less effort and greater return on investment than ever before.
David Silverstein, Neil DeCarlo, and Michael S. Slocum, Insourcing Innovation, Taylor & Francis, to be published November 2007.
Archeologists sometimes find artifacts that are not consistent with the collection of technologies discovered for a certain period and/or culture. These finds are termed “out-of-place artifacts” (OOPARTS). The pyramids are often cited as a fantastic example of this phenomenon. Also classified as OOPARTS are the voltaic cells found in Babylon, the electron tubes found in Dendera, the Ashoka Pillar found in Delhi, India, and the navigational calculator found in Antikythera. These items demonstrate capabilities thought to be beyond the technological level of their cultures and, therefore, not possible. They demonstrate a discontinuity in our understanding of a particular capability set.
Continuous innovation is the logical progression of what is. It may be the addition of features or functions based on an existing platform. Typically, the development and manufacture of these derivative devices are only slightly more complex than for the previous generation. These types of innovations are great for preserving or extending the profitability period of a portfolio. As the portfolio matures, the need for a discontinuity increases. The organization must innovate in new areas that are unrelated to previous work. The type of creativity that generates great derivative products is not suited for the generation of a discontinuity. Strategic Systematic Innovation (SSI) is required. SSI is the set of capabilities that will empower an organization to produce a concept that is discontinuous from all others. By leveraging core competencies, the organization can create a novel portfolio. This discontinuous portfolio creates a new profitability curve and minimizes the risk associated with a limited and mature portfolio. The discontinuity is the way to evolve your organization.
Without a systematic approach for the intentional creation of discontinuities (or disruptive innovations), the organization is left to the mercy of fiat and chance. This is not an enviable position to be in. You want your innovation to be a regular and habitual product of the systematic application of principles, methods, and discipline (just like your quality and production capabilities). You don’t want innovation at your organization to be classified as an OOPART by the business archeologists of tomorrow, do you?