optimal plan for probabilistic parameter design

Proceedings of the 16th Annual International Conference on Industrial Engineering Theory, Applications and Practice Stuttgart, Germany September 20-23, 2011

OPTIMAL PLAN FOR PROBABILISTIC PARAMETER DESIGN Manuel R. Piña-Monarrez1, Manuel A. Rodríguez-Medina2, and Salvador A. Noriega-Morales1 1

Department of Industrial and Manufacturing Engineering Engineering and Technology Institute Universidad Autónoma de Ciudad Juárez Cd. Juárez Chih. México, C.P. 32310 2

Division of Research and Graduate Studies Ciudad Juárez Institute of Technology Corresponding author’s e-mail: [email protected]

Abstract: Since the sample used to determine the normal parameters in probabilistic design is random, the estimated parameters in the probabilistic method are also random. As a consequence, the designed parameter generally does not comply with the expected Six Sigma quality indices. In this paper, we propose modeling the significant environmental variables as a complement to the design methodology. The application shows a significant advantage of the proposed methodology over standard probabilistic design.

1. INTRODUCTION

In general, the goal of every industrial process should be to simultaneously improve quality and reduce cost within a short development time. Optimizing performance improves quality, and improving quality through optimization and variation reduction lowers cost, because waste such as scrap, rework, and costly inspection is reduced. The most widely accepted means of improvement are statistical process control (SPC), which has proven effective for variation reduction, and the well-known Taguchi method (TM), which has made experimental design popular because of its effectiveness for both optimization and variation reduction. However, neither SPC nor TM are the only tools applied within the Six Sigma methodology. According to Taylor (1991), there are other methods, among them multi-vari charts, analysis of means, component swapping, and variation transmission studies. The trick is to make the product robust to variation in the design phase of the product and process, and to tighten only the critical tolerances, and only to the degree required. On the other hand, tolerances on non-critical specifications can be relaxed to achieve cost reduction.

The most common optimization functions in robust analysis are larger-the-better, smaller-the-better, nominal-the-best, target function, uniform around the average, and dynamic characteristics. For details see Taylor (1991), Wu (1996) and Yang (2003), among others. No matter which optimization function is used, all try to achieve the most desirable average and minimize the variation around it: no matter how well the average is optimized, excessive variation can result in poor quality, and no matter how much the variation is reduced, a poor average can result in poor quality too.
However, in any robust optimization process, optimizing the mean while reducing the variance aims for each unit of product to perform as close to the average as possible. Robust means that the performance of the product is minimally affected by variation in the manufacturing process, variation in its use, and performance variation (deterioration). Usage variation refers to the manner and conditions under which the product is used. It is outside the control of manufacturing, SPC, and the variation reduction program; the plant cannot do anything to reduce it. Usage variation must therefore be addressed during product design for the product to be robust, possibly by simulating the usage conditions in the plant. Since performance variation appears only after the product has been used, the objective is not how well the product performs the first time, but how well it functions over its total useful life; measuring variation due to deterioration requires simulating aging, as in accelerated life testing.

In order to incorporate the effects related to usage into the analysis, in this paper the significant variables are modeled by associating a parametric distribution with each of them and then optimizing the joint performance against the Six Sigma requirement (3.4 ppm). The structure of the paper is as follows. Section 2 presents the necessary normal distribution theory, Section 3 gives generalities on the Six Sigma methodology, Section 4 shows an application of probabilistic parameter design to hypothetical data, Section 5 discusses some conclusions, and the paper ends in Section 6 with the references.


2. NORMAL DISTRIBUTION FUNCTION

The normal distribution was first proposed by Abraham de Moivre (1667-1754); later, Carl Friedrich Gauss (1777-1855) formulated its equation. The normal distribution is represented by the Gaussian bell curve. Many univariate tests and confidence intervals are based on the univariate normal distribution; similarly, most multivariate procedures have the multivariate normal distribution as their underpinning. The following are some useful features of the univariate normal distribution: 1) the distribution is completely described by the mean (μ) and variance (σ²); 2) the location measures median, mean, and mode have the same value (the distribution is symmetric around the mean); 3) it is asymptotic to the abscissa; 4) it has inflection points at μ ± σ; 5) its shape depends on its parameters. That is, there is not only one normal distribution; rather, a family exists, among which the most common is the standard normal distribution, with mean zero and variance one. This property is useful in practice to determine the probability of interest; for details see Rencher (2002). The normal probability density function is given by equation (1):

f(x) = [1/(σ√(2π))] exp[−(x_i − μ)²/(2σ²)],  −∞ < μ < ∞ and σ > 0    (1)

The standard normal statistic for (1) is given by

Z = (x_i − μ)/σ    (2)

The statistic given in (2), in its sample form, is given by:

Z_x = (x_i − X̄)/(σ/√n)    (3)

Where X̄ is the sample mean and σ/√n is the standard deviation of the sampling distribution (Rencher 2002). Finally, if a random variable y, with mean μ and variance σ², is normally distributed, its density is given by (1). When y has the density in (1), we say that y is distributed as N(μ, σ²), or simply that y is N(μ, σ²).
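For illustration, the density in (1) and the statistics in (2) and (3) can be sketched in Python; the function names here are ours, chosen for clarity:

```python
import math

def normal_pdf(x, mu, sigma):
    """Normal density of N(mu, sigma^2), equation (1)."""
    coeff = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    return coeff * math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

def z_score(x, mu, sigma):
    """Standard normal statistic, equation (2)."""
    return (x - mu) / sigma

def z_sample(x, xbar, sigma, n):
    """Sample form of the statistic, equation (3)."""
    return (x - xbar) / (sigma / math.sqrt(n))

# The standard normal density at its mean is 1/sqrt(2*pi) ≈ 0.3989
print(round(normal_pdf(0.0, 0.0, 1.0), 4))  # 0.3989
print(z_score(37.0, 35.0, 2.0))             # 1.0
```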

2.1. Normal Reproducibility Property

Let X_1, X_2, ..., X_n be n independent normal random variables with mean μ_i and variance σ_i², that is:

X_i ~ N(μ_i, σ_i²)    (4)

If we define a new variable

Z = a_0 + a_1X_1 + a_2X_2 + ... + a_nX_n    (5)

Then, by the normal reproducibility property, Z follows a normal distribution, Z ~ N(μ_z, σ_z²), where

μ_z = a_0 + Σ_{i=1}^{n} a_i μ_i    (6)

σ_z² = Σ_{i=1}^{n} (a_i σ_i)²    (7)
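The reproducibility property in (5)-(7) can be checked numerically; a minimal sketch (the helper name is ours):

```python
def linear_combination_params(a0, coeffs, means, sigmas):
    """Mean and variance of Z = a0 + sum(a_i * X_i) for independent
    normal X_i ~ N(mu_i, sigma_i^2), per equations (6) and (7)."""
    mu_z = a0 + sum(a * m for a, m in zip(coeffs, means))
    var_z = sum((a * s) ** 2 for a, s in zip(coeffs, sigmas))
    return mu_z, var_z

# Example: Z = R - S with R ~ N(35, 2^2) and S ~ N(5, 1^2),
# anticipating the stress-strength function used in Section 4.
mu_z, var_z = linear_combination_params(0.0, [1.0, -1.0], [35.0, 5.0], [2.0, 1.0])
print(mu_z, var_z)  # 30.0 5.0
```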


For two normal random variables we can define any function, for example g = R − S, where g(R, S) > 0 implies that R is greater than S and g(R, S) < 0 implies that R is smaller than S. Let us define the limit state of both variables as g(R, S) = 0. If, for example, R represents the strength and S represents the stress per unit of load, then the failure probability is the probability that g < 0; as a function of R and S it is written p[g(R, S) < 0]. Now, let us define a new function as

Z = g(x) = R − S    (8)

Then, the failure probability can be computed as

p_f = p(Z < 0) = p[(Z − μ_z)/σ_z ≤ (0 − μ_z)/σ_z] = Φ(−μ_z/σ_z)    (9)

If we define the safety index as

β = μ_z/σ_z    (10)

Then the probability that the system reaches this limit state, under the normal function, can be computed as:

p_f = Φ(−β)    (11)

Where Φ is the standard normal cumulative distribution function corresponding to the density in (1).

3. DESIGN FOR SIX SIGMA

Delivering reliable, high-quality products and processes at low cost has become key in the present global economy. Strategies now focus on optimization, mainly in the design phase; thus, new philosophies, technologies, and advanced statistical tools must be employed to optimize product and process designs. The most widespread philosophy is the well-known Six Sigma methodology. Six Sigma provides the tools that companies need to improve the capability of their business processes through its DMAIC process. In Six Sigma, a process is the basic unit for improvement, and the purpose is to increase its performance and decrease its variation, leading to defect reduction and improved profits. Six Sigma was developed by Motorola in the mid-1980s, and the methodology became well known after Jack Welch of GE made it a central focus of the business strategy in 1995. The term "Six Sigma" derives from statistical terminology: sigma (σ) means standard deviation. For the normal distribution, the probability of falling within a six-sigma range around the mean is 0.9999966. In a production process, the "Six Sigma standard" means that the defect rate of the process will be 3.4 defects per million units. Clearly, Six Sigma indicates extremely high consistency and extremely low variability. In particular, robust design is the main focus of the Six Sigma methodology; it consists of determining the best set of settings for the control factors of a product (or process). Among others, the main methodologies used in robust design are the well-known dual response surface methodology (DRSM) and the Taguchi method (TM), where TM is the most widely used strategy and depends heavily on the statistical theory of design of experiments. Observe that in TM, the normality assumptions are generally not questioned.
According to the TM philosophy this is not a problem, because the objective of TM is to find significance of, or interaction among, the control and noise factors rather than interaction among control factors. In particular, interactions among control factors are minimized by using the well-known orthogonal arrays, first introduced by R. A. Fisher and L. H. C. Tippett in England (Fisher 1925; Tippett 1934) and later simplified and tabulated by Dr. Genichi Taguchi, who provided the corresponding linear graphs. Control factors are those factors that we can control during experimentation, and noise factors are those parameters that are uncontrollable or too expensive to control. The robust design method provides a systematic and efficient approach for finding the near-optimum combination of design parameters so that the product is functional, exhibits a high level of performance, and is robust to noise factors.

Clearly, robust design must be applied during the design phase, when the product or process has the greatest potential impact on life-cycle cost and quality. The three major steps in robust design are system design, parameter design, and tolerance design (Taguchi 1986). System design consists of generating a basic functional prototype design, which defines the configuration and attributes of the designed product; generally, the initial design may be functional but far from optimum in terms of quality and cost. The purpose of parameter design is to identify the settings of the parameters that optimize performance and reduce sensitivity to the sources of variation. In particular, parameter design requires some experimentation to evaluate the effect that noise factors have on the quality characteristic; it also requires identifying the factors that affect the mean, those that affect the variance, and those that affect both, in order to select the set of parameter levels for which the product is functional, exhibits a high level of performance under a wide range of conditions, and is robust to noise factors. Finally, tolerance design is the process of determining tolerances around the nominal settings. Tolerance design is required if robust design cannot produce the required performance without costly special components or high process accuracy (Bendell, 1988). Typically, tightening tolerances leads to higher cost, so the recommendation is to tighten only the critical tolerances (Six Sigma performance, Cp > 2) and only to the degree required (a specific percentage of defective parts). Tolerances on non-critical specifications can be loosened to achieve cost reduction.
The steps for robust design, according to Phadke (1989), are:
1. Identify the main function.
2. Identify the noise factors and testing conditions.
3. Identify the quality characteristic to be observed and the objective function to be optimized.
4. Identify the control factors and their alternative levels.
5. Design the experiment.
6. Conduct the experiment.
7. Analyze the data and determine the optimum levels for the control factors.
8. Predict the performance at these levels.

4. APPLICATION

The details of the proposed steps for robust design are explained using the simple structural design depicted in Figure 1.

[Figure 1 schematic labels: Load Ld ~ N(5,1); L = 10; h = 4; Strength R ~ N(35,2); W]
Figure 1. Schematic representation of the structural element

In Figure 1, L represents the length of the structural element in inches; Ld is a random variable representing the total load, with normal behavior N(5,1) Newtons; R is also a random variable representing the expected operational resistance (strength) of the structural element, following a normal distribution N(35,2) Newtons; h is a constant parameter representing the height of the structural element (in inches); and W is the width to be determined, in inches. W will be determined so that the designed structural element is functional under the design load (robust to variation), exhibits high performance (less than 1% of the designed structural elements will fail), and complies with the Six Sigma requirements (3.4 ppm, Cpk > 1.5). Step by step, the analysis is as follows:
• The main function of the designed structural element is to withstand a random load distributed as N(5,1) Newtons, exhibit high performance (less than 1% of the designed structural elements will fail), and comply with the Six Sigma requirements (3.4 ppm, Cpk > 1.5).
• The noise factors considered are the random load (Ld ~ N(5,1) Nw) and the material variation that determines the variation in the resistance R ~ N(35,2) Nw.
• The quality characteristic to be observed is the final resistance of the designed structural element, and the objective function to be optimized is high resistance with low variation.
• The control factors in this simple example are the parameters h = 4 in, L = 10 in, Ld = 5 ± 1 Nw and R = 35 ± 2 Nw.


• The experimentation, based on the objectives of step one, consists of determining a transfer function relating the applied load to the resistance, in order to test the probability of defective parts and the Six Sigma performance indices. Since R and S comply with (4), and considering that they are independent, then, based on (5), the transfer function is defined as a new variable, as in (8): Z = g(x) = R − S. The defective proportion, expected to be less than 1%, can be measured using (9) or (11). The mean and variance are estimated by applying (6) and (7); to do this, let us relate the total load (Ld) to the corresponding stress per unit (S), as follows:

S = [Ld·L·(h/2)] / [W·h³/12] = 6·Ld·L / (W·h²) = 60·Ld / (16W) = 3.75·Ld / W

With the stress S so defined, the mean of the transfer function, according to (6), is given by μ_z = 35 − 18.75/W, and the corresponding variance, according to (7), is σ_z² = 4 + 14.0625/W². Based on (11), the critical value of the normal distribution that corresponds to 1% (or fewer) defects is β = 2.326347874 ≈ 2.33.
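The minimum width can be recovered numerically from (10); a sketch of this step (the bisection search and function names are ours):

```python
import math

# Stress-strength margin Z = R - S with R ~ N(35, 2^2) and
# S = 3.75*Ld/W, Ld ~ N(5, 1^2).  From equations (6) and (7):
def mu_z(W):
    return 35.0 - 18.75 / W                    # 35 - 3.75*5/W

def sigma_z(W):
    return math.sqrt(4.0 + 14.0625 / W ** 2)   # sqrt(2^2 + (3.75/W)^2 * 1^2)

def beta(W):
    return mu_z(W) / sigma_z(W)                # safety index, equation (10)

# beta(W) increases with W, so bisect for the smallest W with beta >= 2.33.
lo, hi = 0.5, 2.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if beta(mid) < 2.33:
        lo = mid
    else:
        hi = mid
print(round(hi, 4))  # ≈ 0.8075, matching the paper's W = 0.807525 in
```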



• Using β = 2.33 in (10) and solving for W, the structural element must be designed with a minimum width of W = 0.807525 in. Finally, the tolerance that complies with the Six Sigma requirement of 3.4 ppm or less is determined by taking (2) with Z = 4.5, μ = 0.807525 in, and σ = 0.01 (representing the requirement of no more than 1% defects), and solving for X = W; the width that complies with both the 1% and the 3.4 ppm requirements is W = 0.852525 ± 0.0450 in. With R ~ N(35,2) Nw and Ld ~ N(5,1) Nw, the critical design value of R is its lower limit, R = 33 Nw. A simulation was performed with parameters R = 33 Nw, Ld ~ N(5,1) Nw and W = 0.855 ± 0.0450 in; the expected indices after 50,000 simulated runs using the function S = 3.75·Ld/W are presented in Figure 2.
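The tolerance step can be reproduced directly from (2), under the paper's values Z = 4.5, μ = 0.807525 in and σ = 0.01 in:

```python
# Solving equation (2) for X = W gives W = mu + Z * sigma.
mu, sigma, Z = 0.807525, 0.01, 4.5
W_target = mu + Z * sigma   # nominal shifted up by the 4.5-sigma margin
tolerance = Z * sigma       # half-width of the tolerance band
print(round(W_target, 6), round(tolerance, 4))  # 0.852525 0.045
```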

Figure 2. Six Sigma indexes for W = 0.855 ± 0.0450 in


• As can be seen in Figure 2, for W = 0.855 ± 0.0450 in we can expect a Cpk of less than 1.5, with 6,583.21 ppm instead of the designed 3.4 ppm. Because the parameter W = 0.855 ± 0.0450 in clearly does not comply with the design requirements (Figure 2), and given that the noise factors affect both the mean and the standard deviation, a complementary simulation was performed to find the W parameter for which the design levels exhibit less than 1% defects and the manufacturing process has 3.4 ppm or less. The Six Sigma indices found for a random W with uniform behavior in the interval [0.50, 2.0] are presented in Figure 3.
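The 50,000-run check summarized in Figure 2 can be approximated with a short Monte Carlo sketch. The sampling assumptions here are ours, since the paper does not state them explicitly: Ld ~ N(5, 1²) and W ~ N(0.855, 0.01²), reading the ±0.0450 tolerance as 4.5σ with σ = 0.01:

```python
import random

random.seed(1)  # fixed seed so the sketch is reproducible
R_crit = 33.0   # critical strength: lower limit of R ~ N(35, 2^2)
runs, failures = 50_000, 0
for _ in range(runs):
    Ld = random.gauss(5.0, 1.0)     # random load, assumed N(5, 1^2)
    W = random.gauss(0.855, 0.01)   # width, assumed N(0.855, 0.01^2)
    if 3.75 * Ld / W > R_crit:      # stress exceeds the critical strength
        failures += 1
ppm = 1_000_000 * failures / runs
print(ppm)  # thousands of ppm, on the order of the ~6,583 ppm reported
```

The exact count depends on the seed, but the failure rate lands in the thousands of ppm, consistent with the paper's conclusion that this W does not meet the 3.4 ppm target.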


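A hedged reconstruction of the search summarized below in Figure 3. The model here is our assumption, not stated by the paper: strength fixed at its critical lower limit R = 33 Nw, Ld ~ N(5, 1²), and W ~ N(w, 0.01²), with the 3.4 ppm target taken as a 4.5σ margin; failure means 3.75·Ld/W > 33, i.e. Ld − 8.8·W > 0:

```python
import math

def sigma_level(w):
    """Sigma level of the margin G = 8.8*W - Ld, with W ~ N(w, 0.01^2)
    and Ld ~ N(5, 1^2); failure corresponds to G < 0."""
    mu_g = 8.8 * w - 5.0
    sd_g = math.sqrt(1.0 + (8.8 * 0.01) ** 2)
    return mu_g / sd_g

# Bisect for the smallest nominal width with a 4.5-sigma margin (3.4 ppm).
lo, hi = 0.5, 2.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if sigma_level(mid) < 4.5:
        lo = mid
    else:
        hi = mid
print(round(hi, 4))  # ≈ 1.0815, close to the paper's simulated W = 1.08008 in
```

Under these assumptions the analytic answer is about 1.0815 in; the small gap to the paper's 1.08008 in is compatible with simulation noise in the original sweep.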

Figure 3. Six Sigma indexes for W under uniform random behavior

The robust W parameter was found to be W = 1.08008 ± 0.0450 in, with the Six Sigma performance indices shown in Figure 3.

5. CONCLUSION

Optimizing a product or process requires reaching the best suitable mean with low variance around it. Since the sample taken from a process is random, the estimated parameters are also random. Robust products need to be tested in the environment where they will be used, and the deterioration caused by use must be estimated; incorporating these sources of variation during the design phase is the key to robust design. Reducing variation decreases the number of defects and thereby improves quality, resulting in wider operating windows that make the process easier to control and raise its capability, as in Six Sigma (Cpk > 1.5). To select the most suitable analysis we must use the best-fitted probability function, such as the normal, lognormal, or exponential function. The best analysis is made with a focus on high quality and reliability from the design phase. The project gave the expected results; in particular, the designed parameters must be subjected to the noise factors in order to determine the best set of robust parameters, and for the analysis with experimental data we propose the use of the well-known TM and DRSM for data collection and analysis.

6. REFERENCES

Bendell, A. (1988). "Introduction to Taguchi Methodology", Taguchi Methods: Proceedings of the 1988 European Conference, Elsevier Applied Science, London, England, pp. 1-14.
Fisher, R. A. (1925). Statistical Methods for Research Workers, Oliver and Boyd, London.
Phadke, S. M. (1989). Quality Engineering Using Robust Design, Prentice Hall, Englewood Cliffs, N.J.
Rencher, A. C. (2002). Methods of Multivariate Analysis, John Wiley and Sons, New York. ISBN: 0-471-41889-7.
Taguchi, G. (1986). Introduction to Quality Engineering, Asian Productivity Organization, American Supplier Institute Inc.
Taylor, W. A. (1991). Optimization and Variation Reduction in Quality, American Society for Quality and McGraw-Hill Inc. ISBN: 0-9635122-1-8.
Tippett, L. C. H. (1934). Application of Statistical Methods to the Control of Quality in Industrial Production, Journal of Manchester Statistical Society, England.
Wu, Y. and Wu, A. (1996). Diseño Robusto Utilizando los Métodos Taguchi [Robust Design Using Taguchi Methods], Díaz de Santos. ISBN: 84-7978-305-2.
Yang, K. and El-Haik, B. (2003). Design for Six Sigma: A Roadmap for Product Development, McGraw-Hill. ISBN: 0-07-143599-9.
