					     Taguchi Methods
     Achieving Quality

             Project 1
ENGR 801 - Engineering Management
   San Francisco State University
       School of Engineering

           Submitted by:
         Mike Eiklenborg
          Stavros Ioannou
          Gregory King II
          Mark Vilcheck

       The term Taguchi Methods refers to a collection of principles which make up the
framework of a continually evolving approach to quality. This system of quality
engineering takes its name (at least in the United States) from Genichi Taguchi, who
along with Deming, Juran, and Ishikawa, is considered a pioneer of the modern quality
movement.
       In order to gain a fuller understanding of Taguchi’s philosophy, it is beneficial to
examine its roots and the conditions that led to its development, and also to look closely
at what is meant by “quality”.
       In the 1940’s and 1950’s W. Edwards Deming, often referred to as “the father of
the modern quality movement”, proposed an innovative approach to quality management.
His approach, including statistical measures, stressed the importance of the “voice of the
customer”, winning the confidence of co-workers, reduction of variation, and continual
improvement in terms of manufacturing process and product. Deming’s approach was
enthusiastically studied and applied in Japan, where in 1951, the Japanese Union of
Scientists and Engineers named their prestigious quality award the “Deming Prize”. In
the U.S., however, Deming’s theories were for the most part ignored. This fact was to
become very significant for manufacturing in later years.
       American manufacturers ruled over U.S. markets in monopolistic fashion until
roughly 1970. During the 1950’s and 1960’s, companies were concerned mainly with
profit in the short term. Selection of suppliers was based entirely on reducing cost. In this
environment, high quality and low cost were not compatible concepts. Upper-level
management was increasingly adversarial toward workers at all levels, and companies were
isolated from customers, as evidenced by the dealer networks developed by automobile manufacturers to
handle sales and service. As a consequence of these developments, American
manufacturers suffered substantial losses in domestic and worldwide market share in
automobiles and in such profitable areas as consumer electronics. This same period,
however, saw Japan make major gains in the areas lost by U.S. manufacturers. The
Japanese stressed the importance of customer opinion and focused on increased
communication between management, workers, vendors, and consumers.

        It was this competitive crisis in manufacturing during the 1970’s and 1980’s that
gave rise to the modern quality movement, leading to the introduction of Taguchi
methods to the U.S. in the 1980’s. While Deming’s approach deals with management and
Taguchi’s is a system of design engineering, the two philosophies share a common goal:
to increase quality.
        It was mentioned before that Taguchi’s philosophy is continually evolving. The
ever-changing nature of the Taguchi methods is a natural and necessary extension of the
concept called “Kaizen” by the Japanese. Simply defined, Kaizen means improvement,
but it is more than that. It means ongoing improvement, collectively involving managers
and workers. Taguchi methods seek to improve quality and, in light of Kaizen, are
themselves subject to continual change and improvement.
        This brings us to a question that is central to a discussion of Taguchi Methods.
What is meant when we say “quality”? Quality can be defined many ways. For instance,
one simple way to define it is by customer satisfaction. Consumers provide a gauge of a
product’s quality through their wallets. Another way to define a product’s quality is
through its performance when “rapped, overloaded, dropped, or splashed”. Put another
way, quality is a product’s (or design’s) ability to cope with variation and conditions of
use in the customer’s hands. This will be discussed in more detail later. Perhaps one of
the best ways to define a product’s quality is by the product’s “fitness for use”, as stated
by Juran. For purposes of Taguchi Methods, quality (or lack thereof) is determined in
relation to a loss suffered by society due to a product’s failure.
        Taguchi Methods of Quality Engineering design are built around three integral
elements: the loss function, the signal-to-noise ratio, and orthogonal arrays, each of which
is closely related to the definition of quality.

The Loss Function

       As stated above, Taguchi defines quality in terms of a loss to society caused by a
product failure. For instance, such a loss might be a loss of product function, time,
property, a negative environmental effect, or more significantly, a financial loss.
       A product that begins to bring losses at the production stage, and continues to do
so at later stages (in the consumer’s hands), can be called a poorly designed product.
Actions are required to improve such a product’s functionality in order for it to be
considered a good design.
The various types of losses caused by such products can be summarized in the following
two categories:
       1. Loss incurred through negative effects on society, for example water pollution.
       2. Loss caused by the variable performance of the product.
       The second category has significance for our discussion. This loss to society is
quantified by the Taguchi loss function. The loss function is an attempt at reconciling
customer demands with specific targets that designers/manufacturers can shoot for. The
loss function takes the form of the following quadratic equation:

                                      L = C·D²

where L = loss, D = (y − m), y = the measured value of the characteristic, m = its nominal
(target) value, and C is a constant for the process (the cost of rework, for example).
       For example, let us consider a gun manufacturer whose guns are being produced
with a misaligned extractor. It costs $10 to rectify the manufacturing error. The quality
loss function says that the manufacturer will end up paying the cost for alignment times
the square of standard deviation from the target alignment. For example for three
standard deviations from the target it will cost ($10)*(3)^2=$90.
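The arithmetic of the gun example can be sketched in a few lines of Python. This is a minimal illustration of the quadratic loss function, with the deviation expressed in standard deviations from the target alignment, as in the text:

```python
def taguchi_loss(y, m, C):
    """Quadratic quality loss L = C * (y - m)**2, where y is the measured
    characteristic, m the target value, and C a process constant
    (e.g. the cost of rework at the tolerance limit)."""
    return C * (y - m) ** 2

# Gun-extractor example: C = $10 of rework, deviation of 3 standard
# deviations from the target alignment.
print(taguchi_loss(y=3, m=0, C=10))  # 90
```

Note that the loss grows with the square of the deviation: a unit twice as far from target costs four times as much.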
       In the article, “Robust Quality”, Taguchi and Clausing present a very good
example. It involves the performance of Sony televisions manufactured at two different
locations, San Diego and Tokyo. The Sony engineers noticed that customers prefer
pictures with a particular color density, for example 10. Sony then set specification limits
of 10 ± 3. The sets coming out of San Diego were spread across the density range of
9.2–12.6, while most of the sets from Tokyo were on target and only 0.3% fell outside
the limits. The difference arose because the Japanese plant aimed to keep deviation from
the target as close to zero as possible, while the Americans aimed merely to stay within
the limits of specification. As a result, the TVs built in San Diego were more likely to be
returned as defective than the ones from Tokyo, because of their wide spread in density,
even though they were within specification limits. It cost more to repair the TVs returned
as defective than to exchange the 0.3% from Tokyo that fell outside the required range.
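The economics of the two plants can be illustrated with a small simulation. The distributions below are assumptions for illustration only, not Sony's actual data: units spread roughly uniformly across the spec window stand in for San Diego, and units clustered around the target (σ = 1 assumed) stand in for Tokyo, with the loss constant K set to 1:

```python
import random

random.seed(0)
TARGET, TOL, K = 10.0, 3.0, 1.0   # color density target 10 +/- 3; K arbitrary

def loss(y):
    """Quadratic quality loss relative to the target density."""
    return K * (y - TARGET) ** 2

# San Diego (assumed): units spread uniformly across the spec window.
san_diego = [random.uniform(TARGET - TOL, TARGET + TOL) for _ in range(100_000)]
# Tokyo (assumed): units clustered tightly around the target.
tokyo = [random.gauss(TARGET, 1.0) for _ in range(100_000)]

def avg_loss(units):
    return sum(loss(y) for y in units) / len(units)

print(avg_loss(san_diego))  # close to TOL**2 / 3 = 3.0
print(avg_loss(tokyo))      # close to sigma**2 = 1.0
```

Even though both populations are (almost) entirely within spec, the average loss of the uniformly spread population is about three times that of the population concentrated on the target, which is the point of the example.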
       A product can be considered successful if it is of good quality, but we rarely have
successful products that have the same exact quality. In other words products can be of
slightly different quality but still perform well. Taguchi includes this variation within the
upper and lower acceptable limits (UAL and LAL, respectively). All products falling
within this region are functionally acceptable and are not expected to bring any loss to
society. If a product falls outside these limits, there will be a loss to society.
This product will then need to be discarded or repaired. The goal now is to control the
manufacturing process in such a way that the products fall within the LAL and UAL,
minimizing losses.
       Taguchi’s target is customer satisfaction by developing products which meet the
target value on a consistent basis. Thus, the important message from this philosophy is
that the variation around the target value should be minimized. In other words, quality is
best achieved by minimizing deviation from the target, not by mere conformance to
specification limits.

Signal-to-Noise Ratio

           The signal-to-noise concept is closely related to the robustness of a product
design. Robustness has to do with a product’s ability to cope with variation and is based
on the idea that quality is a function of good design. A robust design or product delivers a
strong “signal”. It performs its expected function and can cope with variations (“noise”),
both internal and external.
           Since a good manufacturing process will be faithful to a product design,
robustness must be designed into a product before manufacturing begins. According to
Taguchi, if a product is designed to avoid failure in the field, then factory defects will be
simultaneously reduced. This is one aspect of Taguchi Methods that is often
misunderstood. There is no attempt to reduce variation, which is assumed to be
inevitable, but there is a definite focus on reducing the effect of variation. “Noise” in
processes will exist, but their effect can be minimized by designing a strong “signal” into
the product.
           This is antithetical to the “Zero Defects” policy that has been prevalent in
American manufacturing. Under Zero Defects, strict on-line controls are imposed on
manufacturing processes in order to minimize losses in the factory. The idea is, an effort
to minimize process failure in a factory will lead to minimization of product failure in the
field. Quality losses are seen in terms of costs incurred in the factory due to products that
cannot be shipped, costs of rework, etc. A product whose components exhibit wide
variations within spec and is shipped, but then fails to perform its function properly under
varied conditions in the field is not considered a loss. For Taguchi, such a product would
be a loss.
           The dimensionless signal-to-noise ratio is used to measure controllable factors
that can have such a negative effect on the performance of a design. It allows for the
convenient adjustment of these factors. Provided that a process is consistent, adjustments
can be conveniently made using the signal-to-noise ratio to achieve the desired target.
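One common form of the ratio, the "nominal-the-best" signal-to-noise ratio, can be sketched in Python. The two data sets below are invented for illustration; the formula, 10·log₁₀(mean²/variance), is the standard nominal-the-best form, with higher values indicating a more robust design:

```python
import math

def sn_nominal_the_best(values):
    """Nominal-the-best signal-to-noise ratio in decibels:
    S/N = 10 * log10(mean**2 / variance).  Higher means the "signal"
    (average performance) dominates the "noise" (variation)."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)  # sample variance
    return 10 * math.log10(mean ** 2 / var)

# Two hypothetical designs with the same average output; the second
# varies less, so its S/N ratio is higher.
print(sn_nominal_the_best([9.0, 10.0, 11.0]))   # 20.0 dB
print(sn_nominal_the_best([9.8, 10.0, 10.2]))   # about 34 dB
```

The ratio is dimensionless (in decibels), so designs measured in different units can be compared on the same scale.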

Orthogonal Arrays

       Given that a maximized signal-to-noise ratio is crucial, how do companies go
about achieving it? Most world-class companies follow a three-step process:
       1. They define and specify the objective, selecting or developing the most
           appropriate signal and estimating the concomitant noise.
       2. They define feasible options for the critical design values, such as dimensions
           and electrical characteristics.
       3. They select the option that provides the greatest robustness, or the greatest
           signal-to-noise ratio.
Sounds simple, right? It really isn’t. It has been said that in order to optimize the
steering mechanism of a car using this method, a set of 13 design variables must be
analyzed. If you used the conventional method of comparing each set of variables to the
others, you would have to make 1,594,323 experimental iterations to observe every
possible combination. Clearly this is not acceptable in today’s marketplace. What, then,
can be done to reduce the total number of iterations necessary? Sir Ronald Fisher
developed the solution: orthogonal arrays. “The orthogonal array can be thought of as a
distillation mechanism through which the engineer’s experiment passes.” (Ealey, 1988)
The array allows the engineer to vary multiple variables at one time and obtain the effects
which that set of variables has on the average and the dispersion. From this the engineer
can track large numbers of variables and:
       1. Determine the contribution of individual quality-influencing factors in the
           product design stage.
       2. Find the best, or optimum, condition for a process or a product, so that good
           quality characteristics can be sustained.
       3. Approximate the response of the product design parameters under the
           optimum conditions.
       The benefits being abundantly clear, the next question that should come to mind
is “How do I use this powerful tool?” While it is not possible to cover OAs in much
detail in this paper, the key points for constructing one can be identified. First and
foremost, one must remember the main objective: to determine the optimal

condition of a system of variables. The procedure for using OA’s can be broken down
into seven main steps. These are as follows.

        The experimenter first needs to determine what the primary role of the system is.
This can be cooling air from 95°C to 10°C, accelerating a car from 0 to 60 mph, producing
high-speed IC chips, or allowing beam deflection of no more than a tenth of an inch.
Each of these may have several parameters within which it must operate: cost, size,
weight, speed, etc.

        Once the main factors have been established, the noise factors must be
determined. Noise factors are uncontrollable, either by their nature or because of the cost
of controlling them. Obviously, a refrigeration system would operate well if the
environmental temperature never exceeded 50°C or so; however, maintaining a home or
industrial setting at such a temperature is very costly and is not ideal for the working
conditions of employees. Some likely noise factors are external vibration, cost,
temperature, environmental conditions, material quality, and manufacturing quality.

        There are generally a few factors to be optimized, such as footprint size, cost,
efficiency, etc. Each of these must be clearly identified and an objective function
established. Once the function is established, the objective is to optimize it. Keep in
mind that the engineer is generally not concerned with the specific values yielded by
each experiment, but rather with distilling the effect which each of the various settings
has on the system as a whole.

        For each factor, two or three levels or settings may need to be observed: for
instance, a slightly rich and a slightly lean fuel-to-air ratio in an automobile engine, the
minimum and maximum input voltage in an IC circuit, or even variation in soil condition
for the placement of a foundation. It is important to identify at least the high and low
values, taking the noise into consideration, and to have as few levels as possible.

           Having determined the levels for the control factors, the proper OA must be
selected for both the main factors and the noise factors. OAs are identified according to
the number of configurations and levels which can be accommodated. Table I lists the
common OAs with their factors and levels and the equivalent number of individual
experiments.

Orthogonal Array               Factors and Levels                   No. of Experiments
L4                             3 factors at 2 levels                8
L8                             7 factors at 2 levels                128
L9                             4 factors at 3 levels                81
L16                            15 factors at 2 levels               32,768
L27                            13 factors at 3 levels               1,594,323
L64                            21 factors at 4 levels               4.4 × 10^12

TABLE I: Common orthogonal arrays, with the number of equivalent full-factorial
experiments given in the right column.
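The structure of the smallest array in Table I, the L4, can be sketched and checked in a few lines of Python. "Orthogonality" here means that every pair of columns is balanced: each combination of levels appears the same number of times, so the effect of one factor can be separated from the others:

```python
from itertools import combinations, product

# Standard L4 array: 4 runs covering 3 two-level factors.
L4 = [
    (1, 1, 1),
    (1, 2, 2),
    (2, 1, 2),
    (2, 2, 1),
]

# Balance check: in every pair of columns, each of the four level
# combinations (1,1), (1,2), (2,1), (2,2) appears exactly once.
for i, j in combinations(range(3), 2):
    pairs = [(row[i], row[j]) for row in L4]
    assert sorted(pairs) == sorted(product([1, 2], repeat=2))

# 4 runs stand in for the 2**3 = 8 runs of the full factorial design.
print(f"L4 uses {len(L4)} runs instead of {2 ** 3}")
```

The same balance property holds for the larger arrays in Table I, which is how an L27 with 27 runs can stand in for the 1,594,323 runs of the 13-variable steering example.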

           The noise and the control arrays can then be combined to form a simulation
algorithm which allows the experimenter to study the control factors against the noise
factors.

           Now the actual experiment must be conducted. While it is possible to conduct
physical experiments, doing so is often very costly. Hence, many manufacturers opt to
use mathematical models which closely approximate the system parameters. In this way
a controlled matrix experiment can be conducted at little cost.

       Once all of the data has been collected, an analysis of means (ANOM) or an
analysis of variance (ANOVA) can be used to determine the optimal signal-to-noise ratio
and thus the optimized design parameters for the system.
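As a sketch of the ANOM step, the main effect of each factor in an L4 experiment can be computed by averaging the response at each level. The response values below are invented for illustration, not from any real experiment; the level with the better average at each factor is the candidate optimum setting:

```python
# ANOM sketch for an L4 matrix experiment: average the response at
# each level of each factor.  Higher response (e.g. a signal-to-noise
# ratio) is taken to be better.
L4 = [(1, 1, 1), (1, 2, 2), (2, 1, 2), (2, 2, 1)]
response = [20.0, 24.0, 26.0, 31.0]  # hypothetical S/N ratios, one per run

for factor in range(3):
    means = {}
    for level in (1, 2):
        runs = [r for row, r in zip(L4, response) if row[factor] == level]
        means[level] = sum(runs) / len(runs)
    best = max(means, key=means.get)
    print(f"factor {factor}: level means {means}, best level {best}")
```

Because the array is balanced, each level mean averages over both levels of the other factors, which is what lets four runs yield a main effect for every factor.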


       The central idea behind Taguchi’s approach to quality engineering design is that
variations in a product’s performance can result in poor quality and monetary losses
during the life span of the product. These variations can be classified as either
controllable parameters or uncontrollable (noise) parameters. Controllable parameters
are those that can be specified and modified by the designer, while noise consists mainly
of environmental factors and natural laws.
       The distinction between these types of parameters has been and always will be
with us, although as technology increases some noise factors will become controllable. A
good example of this, as well as a distinction between the two types, can be seen with a
hypothetical example of the invention of the wheel. The wheel began as a square causing
a terrible ride on the old carriages. After many complaints, an engineer began analyzing
the problem. The engineer realized that the ride discomfort was caused by the variation
in distance between the axle and the earth when the square wheel was on an edge and
when it was on a flat surface. This distance is shown in figure 1 as h.

                            Figure 1. Variation of axle height

The engineer deduced that h decreases as the number of sides, n, of the wheel increases.
Thus,
                                     h ∝ 1 − cos(π/n).
The engineer now realized that n is a controllable parameter. As he increased n, h
decreased causing a smoother ride. He now ran into two noise factors, or uncontrollable

parameters. The first was that the technology of his era allowed only straight cuts. He
was not able to make an infinite number of cuts and therefore could not minimize h by
making the wheel round. The second noise factor was that he could not control the
contour of the land on which the riders chose to commute. Eventually he, or another
engineer, realized that he could achieve an infinite number of sides with only two cuts:
he could cut the wheel out of a tree.
       Taguchi’s approach can be broken down into a few different steps: problem
formulation, experimental planning, analysis of experimental results, and confirmation
of the improvement. This is essentially a closed-loop process, as shown in figure 2. If
the objective is not met, the procedure must begin again with modified parameters.

                        Figure 2. Design Process Block Diagram

Problem Formulation
       This step involves clearly defining the problem by stating the problem objectives
and parameters. First an overall system must be designed. System design consists of
brainstorming as many different systems as possible that could achieve the problem
objectives. Possible systems for the above example could be modifying the square wheel

as proposed, or eliminating the wheel altogether by smoothing the base of the carriage to
minimize friction. The height variation h could also be minimized by simply reducing
the size of the wheel, but this would cause the wheel to become stuck more easily. The
original design is clearly
the best solution but other systems must be considered regardless of how strange they
may seem.
       The parameters must now be defined for the system. These include controllable
parameters, noise and tolerance parameters.       A few of the controllable and noise
parameters were discussed above for the wheel problem. Tolerance parameters may be
defined by the roundness of the wheel, or h ≤ t, where t is a specified tolerance. In
addition to the previously prescribed parameters, cost must also be considered. Cost may
put an upper or lower limit on the given parameters. In this case, the cost per straight cut
may be an issue. If the manufacturer of the improved wheel can only make eight cuts per
wheel without losing money, the world must live with an octagonal wheel. Although this
design is not optimized, it is an improved design over the square wheel.

Experimental Planning
       Experimental planning involves designing and carrying out the experiment. The
experiment can be based on the loss function or a matrix experiment using orthogonal
arrays as discussed earlier. For the example of the wheel, a loss function experiment
might be employed. Using Taguchi’s loss function,
                                      L(y) = K(y − m)²
m would be set to zero (the goal for h), and h would be substituted for y. In this situation
K is insignificant and can be set to unity. As stated above, h ∝ 1 − cos(π/n), so the loss
function now becomes
                                   L(h) = (1 − cos(π/n))².
Carrying out the experiment would now require measuring L(h) for various values of n.
The data is now ready for analysis.

Experimental Results
       Analysis of the experimental results will determine the effects of various
parameters on the product quality and can help predict the parameter requirement for

product optimization.     Use of analysis of mean (ANOM) and analysis of variance
(ANOVA) is helpful in this step.
         For the ongoing wheel problem, this step involves minimizing L(h) over the
variable n. It is obvious that n = ∞ would minimize the loss function, but the technology
of the time did not allow this. The loss function is also minimized twice between n = 0
and n = 1, but for this system 3 ≤ n ≤ f, where f is some maximum number of cuts. The
cost parameter above limited the number of sides to eight, so for this example
L = 5.8 × 10⁻³ at n = 8, before the idea of using a tree trunk. Figure 3 shows a plot of
the loss function L vs. n for the wheel problem.
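The analysis can be reproduced with a short script, assuming K = 1 and m = 0 as in the text. The loss falls off rapidly as sides are added, and at the eight-cut cost limit it matches the value quoted above:

```python
import math

def wheel_loss(n):
    """Loss L(h) = (1 - cos(pi/n))**2 for an n-sided wheel (K = 1, m = 0)."""
    return (1 - math.cos(math.pi / n)) ** 2

# Loss for a few wheel designs, from square to many-sided.
for n in (4, 6, 8, 16, 64):
    print(f"n = {n:2d}: L = {wheel_loss(n):.2e}")
```

Running this confirms L ≈ 5.8 × 10⁻³ at n = 8 and shows the loss still shrinking with every additional side, which is why the tree-trunk (n = ∞) solution wins.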

                         Figure 3. Loss function L vs. n for the wheel problem

Confirmation of Improvement
        In this stage the new design must be shown to meet the design criteria or to be
optimized within the given limitations and parameters. If the new design is optimal
within the specified limitations, it will be adopted as the new design. If it does not meet
the specified criteria, the process must be reiterated with new systems until the criteria
are met.


Following the lead of their Japanese counterparts, U.S. manufacturers have only recently
begun to adapt the Taguchi Method to their manufacturing methods. In true American
fashion, the Taguchi method is used under the guise of Total Quality Control. With its
basis in the “competitive manufacturing crisis of the 1970s and 1980s” between the U.S.
and Japan, the U.S. is enveloped in the “modern quality movement”. Turning from their
“company knows what’s best for the customer” attitude, GM, Ford, Chrysler, and
American Motors have led the current manufacturing revolution. Manufacturers are
scrapping the hierarchical approach to management in exchange for a closely networked
team of laborers, managers, engineers, and sales staff. This highly versatile, interlaced
working community is more adept at focusing on high quality with low cost. These two
components are essential if the U.S. is going to continue to win market share in cameras,
televisions, automobiles, computers, and microelectronics.

The application of the Taguchi method to the automobile industry has brought about a
dramatic change. The prior mindset was a blatant disregard for design defects until the
final product was developed. In fact, most engineering teams worked independently of
one another, without any cross-talk, until attempts to put the final design together proved
dismally unsuccessful. Only then would engineers and designers collaborate, in a less
than wholehearted fashion, to identify, research, and correct design flaws. Although this
scenario was repeated again and again, U.S. manufacturers continued to attempt to fit the
final product into the design specifications. Following the new thinking in quality
control, manufacturers are now learning to “focus on designing with minimum loss, with
the product being designed as close to optimum as is feasibly possible”. New
philosophy, technology, and advanced statistical tools must be
employed to design high quality products at low cost. Robust design, as this method is
called, is a systemic and efficient approach for finding near optimum combinations of
design parameters. Adherence to this principle ensures that the “financial loss to society”
is kept to a minimum. What significance does this have? Try to recall what the U.S. was
like twenty or thirty years ago. How many recycling bins did you put out for the garbage
collector? How often did you see air quality reports in the newspaper, or hear about
companies being fined for emitting pollutants? If you owned a car twenty years ago, how
often did you take it in for a smog check? Did you use recycled paper back then? The
point is that “a poorly designed product begins to impart losses to society from the very
start of the production stage”. Wastewater contamination, industrial noise, smog, and
acid rain are all pollutants which can result from a poor-quality product. As U.S.
manufacturers begin to understand this first part of the Taguchi method, we see more and
more concern for the environment. Thus the government has passed smog certification
laws and strict controls on the pollutants which may be emitted from factories, and as a
society we are recycling more and more.

While many manufacturers still consider these restrictions financial losses rather than
social advances, they have not been slow in adopting the second impact of Taguchi’s
method: decreasing excessive variation in functional performance. Taguchi suggests
that the functional performance of a design can be optimized throughout the design
process. This is done by the use of the Design Of Experiments (DOE) approach. This
approach leads the designer away from the “design within tolerance method” to the
“design for optimized performance” method. The utilization of this technique has
resulted in a well-networked manufacturing environment. The ability of engineers,
designers, marketing agents, and managers to communicate with each other efficiently
allows the product to be optimized throughout the design process. IC circuits are now
developed so that they not only perform at the necessary frequency and within the
desired volume, but are also designed to fit compactly into a laptop, or to allow the most
efficient fluid flow throughout a computer casing. Automobiles are
not simply designed to have power and good looks. A lot of time and energy goes into
the development of heating systems which are integrated into the engine compartment.
Catalytic converters and mufflers are developed in conjunction with the engine and the
body to produce a compact sporty design while producing a minimum of pollutants.
Even buildings are designed in this manner. No longer do we build sprawling buildings.
Instead the building is developed with a minimum footprint. The heating systems are

optimized to minimize energy waste. Even the lighting is optimized to provide superior
lighting conditions while using as little electricity as possible.

Thorough design to prevent loss has many results which we as a nation are just
beginning to discover. As we develop and apply these techniques, we gain the added
benefits of higher-quality final products and a reduction in manufacturing cost. There is
a reduced loss to society as well as less wasted material. The use of the Taguchi method
optimizes product and producer at the same time.


References

1. Genichi Taguchi and Don Clausing, “Robust Quality,” Harvard Business Review,
   January–February 1990.

2. K. M. Ragsdell, “Total Quality Management,” Manufacturing Review, Vol. 7, No. 3,
   September 1994.

3. Don Clausing and Bruce H. Simpson, “Quality by Design,” Quality Progress, January.

4. Web source, accessed 10/12/98, 9:55 PM.

5. Web source, accessed 10/12/98, 9:35 PM.

6. Web source, accessed 10/12/98, 2:11 PM.

