
ARTICLE IN PRESS
Reliability Engineering and System Safety 94 (2009) 830–837
Contents lists available at ScienceDirect. Reliability Engineering and System Safety journal homepage: www.elsevier.com/locate/ress

An efficient particle swarm approach for mixed-integer programming in reliability–redundancy optimization applications

Leandro dos Santos Coelho
Industrial and Systems Engineering Graduate Program, LAS/PPGEPS, Pontifical Catholic University of Parana, PUCPR, Imaculada Conceição, 1155, 80215-901 Curitiba, Parana, Brazil

Article history: Received 12 November 2007; Received in revised form 29 August 2008; Accepted 1 September 2008; Available online 16 September 2008.

Keywords: Reliability–redundancy optimization; Particle swarm optimization; Evolutionary algorithm; Meta-heuristics

Abstract: Reliability–redundancy optimization problems involve the selection of components with multiple choices and redundancy levels that produce maximum benefits, subject to cost, weight, and volume constraints. Many classical mathematical methods fail to handle the nonconvexities and nonsmoothness of these problems. As an alternative to classical optimization approaches, meta-heuristics have received much attention from researchers because of their ability to find near-globally optimal solutions. One such meta-heuristic is particle swarm optimization (PSO), a population-based heuristic optimization technique inspired by the social behavior of bird flocking and fish schooling. This paper presents an efficient PSO algorithm based on a Gaussian distribution and a chaotic sequence (PSO-GC) for solving reliability–redundancy optimization problems. Two examples of reliability–redundancy design problems are evaluated. Simulation results demonstrate that the proposed PSO-GC is a promising optimization technique.
PSO-GC performs well for the two examples of mixed-integer programming in reliability–redundancy applications considered in this paper. The solutions obtained by PSO-GC are better than the previously best-known solutions available in the recent literature.
© 2008 Elsevier Ltd. All rights reserved.

1. Introduction

In 1952, the Advisory Group on the Reliability of Electronic Equipment defined reliability in a broad sense: reliability indicates the probability that a product implements a specific performance or feature function and successfully achieves its objectives within a time schedule under a certain environment [1]. A design engineer often tries to improve the reliability of a basic system design to the largest extent possible, subject to constraints on component attributes (cost, weight, and volume) of the system [2]. The problem is to select the optimal combination of components and redundancy levels that meets system-level constraints while maximizing system reliability.

Recently, many meta-heuristics [3,4], such as evolutionary algorithms [5–13], tabu search [14,15], ant colony optimization [16–19], artificial immune systems [20], fuzzy systems [21], and artificial neural networks [22], have been employed for reliability–redundancy optimization problems. One of these modern meta-heuristics is particle swarm optimization (PSO). PSO, first introduced by Kennedy and Eberhart [23,24], is a stochastic global optimization technique inspired by the social behavior of bird flocking and fish schooling, which it simulates to configure its heuristic learning mechanism. PSO is initialized with a population of random solutions within the feasible range, called particles (individuals). The learning procedure of PSO modifies the solution of each individual particle on the basis of its own best experience and the best experiences of other individuals. In other words, the particles fly through the search space influenced by two factors: one is the individual's best position ever found (personal best); the other is the group's best position (global best). Each particle thus flies through the search space with a velocity that is dynamically adjusted according to its own cognitive and social behaviors.

In canonical PSO, a uniform probability distribution is used to generate random numbers. However, the use of other probability distributions may improve the ability to fine-tune the search or even to escape from local optima. The use of Gaussian [25–27], Cauchy [28,29], exponential [30], and Lévy [31] probability distribution functions, as well as chaotic sequences [32–35], has been proposed for generating the random numbers that update the velocity equation in PSO.

Abbreviations: PSO, particle swarm optimization; PSO-CA, canonical particle swarm optimization; PSO-CO, particle swarm optimization with constriction factor; PSO-GC, Gaussian probability distribution and chaotic sequences in particle swarm optimization.
Tel./fax: +55 41 327113 45. E-mail address: leandro.coelho@pucpr.br
0951-8320/$ - see front matter © 2008 Elsevier Ltd. All rights reserved. doi:10.1016/j.ress.2008.09.001
Nomenclature

a, b  constants of the Hénon map
c1  the cognitive learning rate
c2  the social learning rate
f(·)  the objective function for the overall system reliability
g  the set of constraint functions
gbest  the global best particle
gi  the ith constraint function
k  the iteration number in the Hénon map
l  the vector of resource limitations
m  the number of subsystems in the system
n = (n1, n2, n3, …, nm)  the vector of the redundancy allocation for the system
ni  the number of components in the ith subsystem
pbest  the personal best particle
pi = [pi1, pi2, …, pin]^T  the best previous position of the ith particle
r = (r1, r2, r3, …, rm)  the vector of the component reliabilities for the system
ri  the reliability of each component in subsystem i
t  the iterations (generations)
tmax  the maximum number of allowable iterations
ud, Ud  uniformly distributed random numbers within the range [0,1]
vi  the volume of each component in subsystem i
vmax  the maximum velocity that each particle can take at each iteration
wi  the weight of each component in subsystem i
xi = [xi1, xi2, …, xin]^T  the position of the ith particle of the population
y1  the output of the Hénon map
y2  a state signal of the Hénon map
C  the upper limit on the cost of the system
F  the feasible region
Rs  the system reliability
S  the search space
V  the upper limit on the sum of the subsystems' products of volume and weight
W  the upper limit on the weight of the system
λi = [λi1, λi2, …, λin]^T  the velocity of the ith particle
φ  a design parameter
ω  the inertia weight
χ  the constriction coefficient

This paper employs a Gaussian probability distribution and chaotic sequences in a PSO design (PSO-GC) to solve reliability–redundancy optimization problems. In this context, two examples of reliability–redundancy design are evaluated, and the results of the proposed PSO-GC algorithm, the canonical PSO (PSO-CA), and the PSO with constriction factor (PSO-CO) are compared. The novel PSO-GC algorithm outperforms PSO-CA, PSO-CO, and other techniques presented in the literature for the two reliability–redundancy optimization examples [36–38].

The remaining content of this paper is organized as follows. Section 2 introduces the reliability–redundancy optimization problem, while the concepts of the PSO approaches are explained in Section 3. Section 4 presents the simulation results for two reliability–redundancy optimization problems. Finally, Section 5 contains the concluding remarks and directions for further research.

2. Description of the reliability–redundancy optimization problem

The goal of reliability engineering is to improve system reliability. Reliability–redundancy optimization is useful for system designs that are largely assembled and manufactured using off-the-shelf components and that have high reliability requirements [39].

A reliability–redundancy optimization problem can be formulated with system reliability as the objective function or in the constraint set. In this work, the reliability–redundancy allocation problem of maximizing system reliability subject to multiple nonlinear constraints is stated as a nonlinear mixed-integer programming model of the general form

Maximize Rs = f(r, n),  (1)

subject to

g(r, n) ≤ l,  0 ≤ ri ≤ 1, ri ∈ R, ni ∈ Z+, 1 ≤ i ≤ m,  (2)

where Rs is the reliability of the system; g is the set of constraint functions, usually associated with system weight, volume, and cost; r = (r1, r2, r3, …, rm) is the vector of the component reliabilities for the system; n = (n1, n2, n3, …, nm) is the vector of the redundancy allocation for the system; ri and ni are the reliability and the number of components in the ith subsystem, respectively; f(·) is the objective function for the overall system reliability; l is the vector of resource limitations; and m is the number of subsystems in the system. The goal is to determine the number of components and the components' reliability in each subsystem so as to maximize the overall system reliability. The problem belongs to the category of constrained nonlinear mixed-integer optimization problems.

2.1. Example 1: complex (bridge) system

The first example problem used to demonstrate the efficiency of the PSO approaches was proposed in [38,40,41]. Fig. 1 represents the complex (bridge) system analyzed in this paper.

Fig. 1. Representation of the complex (bridge) system.

The complex (bridge) system optimization problem can be stated as follows [38]:

maximize f(r, n) = R1 R2 + R3 R4 + R1 R4 R5 + R2 R3 R5 − R1 R2 R3 R4 − R1 R2 R3 R5 − R1 R2 R4 R5 − R1 R3 R4 R5 − R2 R3 R4 R5 + 2 R1 R2 R3 R4 R5,  (3)

subject to

g1(r, n) = Σ_{i=1}^{m} wi vi^2 ni^2 ≤ V,  (4)

g2(r, n) = Σ_{i=1}^{m} αi (−1000 / ln ri)^{βi} [ni + e^{0.25 ni}] ≤ C,  (5)

g3(r, n) = Σ_{i=1}^{m} wi ni e^{0.25 ni} ≤ W,  (6)

where Ri = 1 − (1 − ri)^{ni} is the reliability of subsystem i; V is the upper limit on the sum of the subsystems' products of volume and weight; C is the upper limit on the cost of the system; and W is the upper limit on the weight of the system. In other words, the constraint of Eq. (4) combines weight, redundancy allocation, and volume; Eq. (5) is a cost constraint; and Eq. (6) is a weight constraint. In this context, ni ∈ Z, where Z is the discrete space of integers, and 0 ≤ ri ≤ 1, ri ∈ R, where R is the set of real numbers, 1 ≤ i ≤ m. The parameters βi and αi are physical features of the system components. The input parameters defining the complex (bridge) system are shown in Table 1; the data are also available in [38,40,41].

Table 1. Data used in the complex (bridge) system.

Stage | 10^5 αi | βi | wi vi^2 | wi
1 | 2.330 | 1.5 | 1 | 7
2 | 1.450 | 1.5 | 2 | 8
3 | 0.541 | 1.5 | 3 | 8
4 | 8.050 | 1.5 | 4 | 6
5 | 1.950 | 1.5 | 2 | 9

Resource limits: V = 110, C = 175, W = 200.

2.2. Example 2: overspeed protection system for a gas turbine

To evaluate the performance of the PSO approaches on the mixed-integer nonlinear reliability design problem, the reliability–redundancy optimization problem of the overspeed protection system for a gas turbine [36–38] is also considered. Overspeed detection is continuously provided by electrical and mechanical systems. When an overspeed occurs, it is necessary to cut off the fuel supply using control valves [36].

This problem is formulated as the following mixed-integer nonlinear programming problem:

Maximize f(r, n) = Π_{i=1}^{m} [1 − (1 − ri)^{ni}],  (7)

subject to

g1(r, n) = Σ_{i=1}^{m} vi ni^2 ≤ V,  (8)

g2(r, n) = Σ_{i=1}^{m} C(ri) [ni + e^{0.25 ni}] ≤ C,  (9)

g3(r, n) = Σ_{i=1}^{m} wi ni e^{0.25 ni} ≤ W,  (10)

1 ≤ ni ≤ 10, ni ∈ Z+, where Z+ is the discrete space of positive integers, and 0.5 ≤ ri ≤ 1 − 10^{−6}, ri ∈ R. Here vi is the volume of each component in subsystem i; V is the upper limit on the sum of the subsystems' products of volume and weight; C is the upper limit on the cost of the system; C(ri) = αi (−T / ln ri)^{βi} is the cost of each component with reliability ri in subsystem i; T is the operating time during which the component must not fail; and W is the upper limit on the weight of the system. The input parameters defining the overspeed protection system for a gas turbine are shown in Table 2; the data are also available in [36–38].

Table 2. Data used in the overspeed protection system of a gas turbine.

Stage | 10^5 αi | βi | vi | wi
1 | 1.0 | 1.5 | 1 | 6
2 | 2.3 | 1.5 | 2 | 6
3 | 0.3 | 1.5 | 3 | 8
4 | 2.3 | 1.5 | 2 | 7

Resource limits: V = 250, C = 400, W = 500; operating time T = 1000 h.
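The two formulations above can be sketched in Python as follows. This is an illustrative sketch, not the paper's (Matlab) code; the function and constant names are our own, while the formulas and data come from Eqs. (3)–(10) and Tables 1 and 2.

```python
import math

# Table 1 data for the complex (bridge) system.
BRIDGE_ALPHA = [2.330e-5, 1.450e-5, 0.541e-5, 8.050e-5, 1.950e-5]  # alpha_i
BRIDGE_BETA = [1.5] * 5                                            # beta_i
BRIDGE_WV2 = [1, 2, 3, 4, 2]                                       # w_i * v_i^2
BRIDGE_W = [7, 8, 8, 6, 9]                                         # w_i
BRIDGE_V, BRIDGE_C, BRIDGE_WMAX = 110, 175, 200                    # resource limits

def bridge_reliability(r, n):
    """Eq. (3): overall reliability of the five-subsystem bridge network,
    with R_i = 1 - (1 - r_i)^n_i the reliability of subsystem i."""
    R1, R2, R3, R4, R5 = (1 - (1 - ri) ** ni for ri, ni in zip(r, n))
    return (R1 * R2 + R3 * R4 + R1 * R4 * R5 + R2 * R3 * R5
            - R1 * R2 * R3 * R4 - R1 * R2 * R3 * R5 - R1 * R2 * R4 * R5
            - R1 * R3 * R4 * R5 - R2 * R3 * R4 * R5
            + 2 * R1 * R2 * R3 * R4 * R5)

def bridge_slacks(r, n):
    """Slacks of Eqs. (4)-(6): limit minus g_i; negative means violated."""
    g1 = sum(wv2 * ni ** 2 for wv2, ni in zip(BRIDGE_WV2, n))
    g2 = sum(a * (-1000.0 / math.log(ri)) ** b * (ni + math.exp(0.25 * ni))
             for a, b, ri, ni in zip(BRIDGE_ALPHA, BRIDGE_BETA, r, n))
    g3 = sum(w * ni * math.exp(0.25 * ni) for w, ni in zip(BRIDGE_W, n))
    return BRIDGE_V - g1, BRIDGE_C - g2, BRIDGE_WMAX - g3

def overspeed_reliability(r, n):
    """Eq. (7): a series system of parallel-redundant subsystems."""
    out = 1.0
    for ri, ni in zip(r, n):
        out *= 1 - (1 - ri) ** ni
    return out
```

Evaluating `bridge_reliability` and `bridge_slacks` at the PSO-GC solution later reported in Table 4 (n = (3, 3, 2, 4, 1), r ≈ (0.826678, 0.857172, 0.914629, 0.648918, 0.715291)) reproduces the reported objective value of about 0.9998896 and slacks close to (5, 0.000339, 1.560466).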
3. Optimization using PSO algorithms

PSO is a population-based heuristic global search algorithm based on social interaction and individual experience. In essence, PSO mimics the collective learning of individuals in groups, as observed in the natural behavior of bird flocks and fish schools. In these groups there is a leader who guides the movement of the whole swarm, and the movement of every individual is based on the leader and on its own knowledge. In general, the model that inspires PSO assumes that the behavior of every particle is a compromise between its individual memory and a collective memory.

In PSO, a point (individual) in the problem space, referred to as a particle, is a candidate solution to the optimization problem at hand. Each particle in the population (named a swarm in PSO) is initialized with a random position and search velocity. Each particle flies through the problem space and keeps track of its position, velocity, and fitness. Its position (i.e., a solution) and velocity (i.e., the change pattern of the solution) are adjusted according to its own experience and to social cooperation, driven by its fitness to the environment.

During this iterative process, the behavior of a particle is a compromise among three possible alternatives: (i) following its current pattern of exploration; (ii) going back towards its best previous position; and (iii) going towards the historic best of all the particles. A representation of the procedure of a classical PSO is presented in Fig. 2.

Fig. 2. Procedure of a classical PSO approach.

In this section, the PSO approaches validated in this work are described. The first is the canonical PSO, presented in Section 3.1. In Section 3.2, the PSO based on the constriction coefficient approach is detailed. The third, the PSO technique based on a Gaussian distribution and a chaotic sequence, is described in Section 3.3.

3.1. Standard or canonical PSO approach (PSO-CA)

The procedure for implementing the global version of the canonical PSO is given by the following steps [33,34]:
Step 1. Random initialization of positions and velocities: Initialize a population of particles with random positions and velocities in the n-dimensional problem space, using a uniform probability distribution, at iteration t = 1.

Step 2. Evaluation of each particle's fitness: Evaluate each particle's objective function value (fitness). In this paper, the goal of PSO-CA is the maximization of the objective function.

Step 3. Comparison with pbest (personal best): Compare each particle's fitness with the particle's pbest. If the current value is better than pbest, then set the pbest value equal to the current value and the pbest location equal to the current location in the n-dimensional space.

Step 4. Comparison with gbest (global best): Compare the fitness with the population's overall previous best. If the current value is better than gbest, then reset gbest to the current particle's array index and value.

Step 5. Updating of each particle's velocity and position: Each particle tries to modify its position using its current velocity and its distances from its own best position and from the global best particle. The modification of the velocity, λi, and the position of the particle, xi, can be represented by Eqs. (11) and (12):

λi(t + 1) = ω λi(t) + c1 ud (pi(t) − xi(t)) + c2 Ud (pg(t) − xi(t)),  (11)

xi(t + 1) = xi(t) + Δt λi(t + 1),  (12)

where ω is the inertia weight; i = 1, 2, …, N indicates the index of the particles; t = 1, 2, …, tmax indicates the iterations (generations); tmax is the maximum number of allowable iterations; λi = [λi1, λi2, …, λin]^T stands for the velocity of the ith particle; xi = [xi1, xi2, …, xin]^T stands for the position of the ith particle of the population; and pi = [pi1, pi2, …, pin]^T represents the best previous position of the ith particle. The positive constants c1 and c2 are the cognitive and social learning rates, respectively; they are the acceleration constants responsible for varying the particle velocity towards pbest and gbest. The index of the best particle among all the particles in the population is represented by the symbol g. The factors ud and Ud are uniformly distributed random numbers within the range [0,1].

The first relation, Eq. (11), calculates the ith particle's new velocity from three terms: the particle's previous velocity; the distance between the particle's best previous position and its current position; and, lastly, the distance between the swarm's best experience (the position of the best particle in the swarm) and the ith particle's current position. The velocity in Eq. (11) is also limited by a maximum, vmax, meaning the maximum jump that each particle can make in one iteration (generation). The selected value of vmax should not be too high, to avoid oscillations, nor too low, lest the search explore insufficiently and the particle become trapped in a local optimum.

In Eq. (11), the value given to the inertia weight affects the type of search in the following way: a large ω directs the PSO towards a global search, while a small ω directs the PSO towards a local search. This parameter can vary linearly from a larger value to a smaller value, making the search global early in the run and local at the end of the run. Considering these concerns, Shi and Eberhart [42] found improvements in the performance of the PSO with an inertia weight varying linearly over the generations. The mathematical representation of this concept is

ω = (ω1 − ω2)(tmax − t)/tmax + ω2,  (13)

where ω1 and ω2 are the initial and final values of the inertia weight, respectively. Through empirical studies, Shi and Eberhart [42] observed that good solutions can be obtained by varying the value of ω from 0.9 at the beginning of the search to 0.4 at the end of the search for most problems.

Then, by Eq. (12), the ith particle flies towards a new position according to its previous position and its velocity, considering Δt = 1. Fig. 3 shows a representation of the modification of a search point by the PSO algorithm.

Fig. 3. Representation of the modification of a search point by the PSO algorithm.

Step 6. Repetition of the evolutionary cycle: If a predefined stopping criterion is met, usually a sufficiently good fitness or a maximum number of iterations (generations), then output gbest and its objective value; otherwise set t = t + 1 and go back to Step 2.

3.2. PSO using the constriction coefficient approach (PSO-CO)

Eq. (11), the velocity update of the canonical PSO algorithm, can be considered a kind of difference equation. A particle's velocity is an important parameter because it determines the resolution with which solution regions are searched. Choosing too small a value for vmax can cause very small updates of the particles' velocities and positions at each iteration; the algorithm may then take a long time to converge and face the problem of getting stuck in local minima. To overcome these situations, researchers [43,44] have recently proposed improved velocity update rules employing a constriction factor χ.
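One iteration of the canonical update, Eqs. (11)–(13), can be sketched as follows. This is an illustrative sketch, not the paper's implementation; the function names and the per-dimension velocity clamping are our own choices.

```python
import random

def inertia_weight(t, t_max, w1=0.9, w2=0.4):
    """Eq. (13): inertia weight decreasing linearly from w1 to w2."""
    return (w1 - w2) * (t_max - t) / t_max + w2

def pso_ca_step(x, v, pbest, gbest, w, c1=2.05, c2=2.05, v_max=1.0):
    """Eqs. (11)-(12) with Delta t = 1: return the updated position and
    velocity of one particle; each velocity component is clamped to v_max."""
    new_x, new_v = [], []
    for xd, vd, pd, gd in zip(x, v, pbest, gbest):
        ud, Ud = random.random(), random.random()  # uniform in [0, 1]
        vel = w * vd + c1 * ud * (pd - xd) + c2 * Ud * (gd - xd)
        vel = max(-v_max, min(v_max, vel))         # limit the jump to v_max
        new_v.append(vel)
        new_x.append(xd + vel)
    return new_x, new_v
```

With pbest = gbest, the particle is pulled straight towards the common best position, and `inertia_weight(t, 150)` reproduces the 0.9 to 0.4 schedule later used in Section 4.1.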
In doing so, the velocity equation is updated according to

λi(t + 1) = χ [λi(t) + c1 ud (pi(t) − xi(t)) + c2 Ud (pg(t) − xi(t))],  (14)

using a constriction coefficient χ expressed as

χ = 2 / |2 − φ − sqrt(φ^2 − 4φ)|,  (15)

with the parameter φ = c1 + c2, φ > 4, so that χ is a function of c1 and c2. Typically, c1 and c2 are both set to 2.05; φ is then 4.1 and the constriction coefficient χ is 0.729. Clerc and Kennedy [43] found that the system behavior can be controlled so that it has the following features: (i) the system does not diverge in a real-valued region and can finally converge; and (ii) the system can search different regions efficiently by avoiding premature convergence. PSO-CO uses Eqs. (14) and (15), together with Eq. (12), to update the positions of the particles in the swarm. This modification is simplistic in the sense that it only (gradually) reduces the parameter values without affecting the structure of the PSO.

3.3. Framework of the novel PSO method using a Gaussian distribution and chaotic sequences (PSO-GC)

In the PSO-CA and PSO-CO methods, a uniform probability distribution is adopted to generate the random numbers ud and Ud in the velocity equation. The use of a Gaussian probability distribution [25–27] can improve the ability to fine-tune the search or even to escape from local optima in a PSO design. The mechanisms of Gaussian mutation operations have been studied by Yao et al. [45] and Chellapilla [46], who pointed out that Gaussian mutation in evolutionary algorithms is promising for fine-grained search.

On the other hand, the PSO-CA and PSO-CO methods cannot entirely ensure the ergodicity of the optimization in phase space, because the numbers they use are merely random. An alternative is the implementation of a PSO approach based on chaotic sequences. Chaos is a characteristic that often exists in nonlinear dynamic systems and has been studied and applied in many fields, such as engineering and mathematics [47–49]. Chaotic motion is a kind of highly unstable motion of deterministic systems in finite phase space, and it is especially useful as a component of effective optimization algorithms [32–35,50–54], relying on the universality, randomness, and sensitive dependence on initial conditions of chaotic mappings.

Inspired by the Gaussian probability distribution and chaotic motion ideas, this paper provides a novel combined optimization method, which introduces a chaotic mapping based on the Hénon map [55] into PSO so as to improve the global convergence, together with a Gaussian probability distribution to improve the local convergence. This is a promising way to achieve a trade-off between exploration and exploitation and, moreover, an effective way of dealing with the velocity update of PSO-CO.

The Hénon map is a simplified version of the Poincaré map of the Lorenz system [47]. The Hénon map [55] is given by

y1(k) = 1 − a (y1(k − 1))^2 + y2(k − 1),  (16)

y2(k) = b y1(k − 1),  (17)

where k is the iteration number. The Hénon map is used in this work with a = 1.4 and b = 0.3 (the values for which the Hénon map has a strange attractor).

The proposed approach, called PSO-GC, is given by the following pseudocode:

If t > 1
  If {1 − (min(f) − fi(r, n)) / (min(f) − max(f))} < ξ
    % chaotic sequence based on the Hénon map in the velocity update
    λi(t + 1) = χ [λi(t) + c1 hi (pi(t) − xi(t)) + c2 Hi (pg(t) − xi(t))]  (18)
  Else
    % Gaussian distribution in the velocity update
    λi(t + 1) = χ [λi(t) + c1 Gi (pi(t) − xi(t)) + c2 Ud (pg(t) − xi(t))]  (19)
  End If
Else
  % use Eq. (19) in iteration t = 1
  λi(t + 1) = χ [λi(t) + c1 Gi (pi(t) − xi(t)) + c2 Ud (pg(t) − xi(t))]
End If

where min(f) and max(f) are the worst and best objective function values of the swarm at iteration t, respectively (maximization problem); fi(r, n) is the objective function value of the ith particle; ξ is a constant positive value in the range [0,1]; Gi denotes a Gaussian random number scaled into the range [0,1] and generated anew for each ith particle; and hi and Hi are generated by the Hénon map of Eqs. (16) and (17), normalized to the range [0,1], for use in Eqs. (18) and (19). Occasionally, the swarm of particles converges onto one point, so that min(f) = max(f); in this case, the value {1 − [min(f) − fi(r, n)]/[min(f) − max(f)]} of every particle is set to 1. The best particle presents a smaller value of the expression {1 − [min(f) − fi(r, n)]/[min(f) − max(f)]} than the other particles in the swarm.

Based on the pseudocode of PSO-GC, it can be noted that good particles in the swarm tend to perform exploitation, refining results through local search with the Gaussian distribution, while bad particles tend to perform large modifications, exploring the space with large steps using the chaotic sequence, since the essence of keeping diversity is to perturb the swarm. In other words, PSO-GC provides a way to maintain population diversity and to sustain good convergence capacity.
3.4. Constraint handling in the evaluated PSO approaches

A key factor in the application of PSO algorithms to reliability–redundancy optimization problems is how the algorithm handles the constraints of the problem. Over the last few decades, several methods have been proposed to handle constraints in evolutionary algorithms [56–59]. These methods can be grouped into four categories: methods that preserve the feasibility of solutions; penalty-based methods; methods that clearly distinguish between feasible and infeasible solutions; and hybrid methods.

When PSO algorithms are used for constrained optimization problems, it is common to handle constraints using the concept of penalty functions, which penalize infeasible solutions. That is, one attempts to solve an unconstrained problem in the search space S using a modified objective function f (here we are maximizing the overall system reliability), such as

max f(xi) = { f(xi), if xi ∈ F; f(xi) − penalty(xi), otherwise, }  (20)

where xi are the solutions obtained by the PSO approaches, i.e., the positions of the ith particle of the population, and penalty(xi) is zero when no constraint is violated and positive otherwise. The penalty function is usually based on a distance measure to the nearest solution in the feasible region F, or on the effort needed to repair the solution.

In this work, the penalty-based method proposed in [38] (details in Section 2.3 of [38]) was used in all PSO designs for infeasible solutions (constraint violations). The adopted approach converts a constrained problem to an unconstrained one by modifying the search space: a penalty value is defined to take the constraint violation into account. In the method of [38], the terms l are subtracted from the objective function f(r, n) (maximization problem) if g(r, n) > l.

4. Computational results

This section turns to the description and analysis of the results obtained in the optimization tests.

4.1. Parameter settings

The two examples described in Sections 2.1 and 2.2 are employed. Each particle of the swarm in PSO-CA, PSO-CO, and PSO-GC uses the variable vectors n and r, whose boundaries are given in Section 2. In this paper, during the evolution process, the integer variables ni are treated as real variables, and in evaluating the objective function the real values are rounded to the nearest integers.

Each optimization method was implemented in Matlab (MathWorks). All the programs were run on a 3.2 GHz Pentium IV processor with 2 GB of random access memory (RAM). In order to eliminate stochastic discrepancy, in each case study 50 independent runs were made for each optimization method, involving 50 different initial trial solutions.

For each test problem, the parameters of PSO-CA are set as follows: c1 = c2 = 2.05, and ω decreases linearly from ω1 = 0.9 to ω2 = 0.4. Moreover, the maximum velocity vmax of each particle is set to 20% of the search space. The parameters of PSO-CO are set as c1 = c2 = 2.05. In PSO-GC, c1 = c2 = 2.05 and ξ = 0.2 are adopted.

4.2. Results and discussion for the complex (bridge) system

The swarm size is set to 90 and the stopping criterion tmax is 150 iterations in all PSO algorithms for the complex (bridge) system. In other words, all PSO algorithms use 13,500 cost function evaluations in each run.

Simulation results of all PSO schemes for the complex (bridge) system are listed in Table 3. In terms of the mean and best (gbest over all runs) f(r, n) results, the PSO-GC approach outperforms PSO-CA and PSO-CO. The best result obtained for the complex (bridge) system using PSO-GC was 0.99988957, as shown in Table 4.

Table 5 compares the results obtained in this paper for the complex (bridge) system with those of other studies reported in the literature. Note that for the complex (bridge) system, the best result reported here using PSO-GC improves on the recent studies presented in the literature. The Maximum Possible Improvement (MPI) index, given by MPI (%) = [Rs(PSO-GC) − Rs(other)]/[1 − Rs(other)] and presented in Table 5, shows that PSO-GC improved on the reliability values found by the other optimization approaches in the literature.

Table 3. Convergence results of f(r, n) (50 runs) for the complex (bridge) system using PSO algorithms.

Optimization method | Maximum (best) | Minimum (worst) | Mean | Standard deviation
PSO-CA | 0.99988891 | 0.99944273 | 0.99981782 | 0.00002290
PSO-CO | 0.99988946 | 0.99980434 | 0.99988464 | 0.00000377
PSO-GC | 0.99988957 | 0.99987750 | 0.99988594 | 0.00000069

Table 4. Best result (50 runs) for the complex (bridge) system.

Parameter | PSO-CA | PSO-CO | PSO-GC
f(r, n) | 0.99988891 | 0.99988946 | 0.99988957
n1 | 3 | 3 | 3
n2 | 3 | 3 | 3
n3 | 3 | 2 | 2
n4 | 3 | 4 | 4
n5 | 1 | 1 | 1
r1 | 0.815457 | 0.830911 | 0.826678
r2 | 0.872681 | 0.857333 | 0.857172
r3 | 0.856412 | 0.912827 | 0.914629
r4 | 0.703959 | 0.647359 | 0.648918
r5 | 0.768451 | 0.697767 | 0.715291
MPI (%) (a) | 0.594 | 0.099 | –
Slack (g1) (b) | 18 | 5 | 5
Slack (g2) (b) | 0.000778 | 0.030411 | 0.000339
Slack (g3) (b) | 4.264769 | 1.560466 | 1.560466

(a) MPI (%) = [Rs(PSO-GC) − Rs(other)]/[1 − Rs(other)]. (b) Slack is the unused resource.

Table 5. Comparison of the best result for the complex (bridge) system with other results presented in the literature.

Parameter | Hikita et al. [40] | Hsieh et al. [41] | Chen [38] | This paper (using PSO-GC)
f(r, n) | 0.9997894 | 0.99987916 | 0.99988921 | 0.99988957
n1 | 3 | 3 | 3 | 3
n2 | 3 | 3 | 3 | 3
n3 | 2 | 3 | 3 | 2
n4 | 3 | 3 | 3 | 4
n5 | 2 | 1 | 1 | 1
r1 | 0.814483 | 0.814090 | 0.812485 | 0.826678
r2 | 0.821383 | 0.864614 | 0.867661 | 0.857172
r3 | 0.896151 | 0.890291 | 0.861221 | 0.914629
r4 | 0.713091 | 0.701190 | 0.713852 | 0.648918
r5 | 0.814091 | 0.734731 | 0.756699 | 0.715290
MPI (%) (a) | 47.564 | 8.615 | 0.325 | –
Slack (g1) (b) | 18 | 18 | 19 | 5
Slack (g2) (b) | 1.854075 | 0.376347 | 0.001494 | 0.000339
Slack (g3) (b) | 4.264770 | 4.264770 | 4.264770 | 1.560466

(a) MPI (%) = [Rs(PSO-GC) − Rs(other)]/[1 − Rs(other)]. (b) Slack is the unused resource.
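The penalty scheme of Eq. (20) and the MPI index used in Tables 4 and 5 can be sketched as follows. This is an illustrative sketch: the actual penalty magnitude follows the method of [38], so the fixed penalty weight below is our assumption.

```python
def penalized_fitness(f_value, slacks, weight=1.0e3):
    """Eq. (20): the fitness is f itself for a feasible particle; otherwise a
    positive penalty, growing with the total constraint violation, is
    subtracted (maximization). 'slacks' are limit-minus-g values, so a
    negative slack marks a violated constraint."""
    violation = sum(-s for s in slacks if s < 0)
    return f_value if violation == 0 else f_value - weight * violation

def mpi_percent(rs_new, rs_other):
    """MPI (%) = [Rs(new) - Rs(other)] / [1 - Rs(other)] * 100: the share of
    the remaining unreliability 1 - Rs(other) removed by the new solution."""
    return 100.0 * (rs_new - rs_other) / (1.0 - rs_other)
```

For example, `mpi_percent(0.99988957, 0.99988891)` recovers the 0.594% improvement of PSO-GC over PSO-CA reported in Table 4.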
Coelho / Reliability Engineering and System Safety 94 (2009) 830–837 Table 6 Convergence results of f(r, n) (50 runs) for the overspeed protection system for a gas turbine using PSO algorithms Optimization method f(r, n) Maximum (best) Minimum (worst) Mean Standard deviation PSO-CA 0.999943 0.999733 0.999901 0.000008 PSO-CO 0.999944 0.999746 0.999878 0.000010 PSO-GC 0.999953 0.999638 0.999907 0.000011 Table 7 of the results by PSO approaches in 50 independent runs is also Best result (50 runs) for the overspeed protection system for a gas turbine very small. The best results obtained for the overspeed protection system Parameter PSO-CA PSO-CO PSO-GC using PSO-GC was 0.999953, as shown in Table 7. From Table 8, it f(r, n) 0.999943 0.999944 0.999953 can conclude that a best solution found by PSO-GC for the n1 5 5 5 overspeed protection system has a slight advantage over the other n2 6 5 6 solvers reported in the literature. n3 4 4 4 n4 5 6 5 r1 0.897525 0.885764 0.902231 5. Conclusion and further research r2 0.857242 0.878177 0.856325 r3 0.935091 0.955408 0.948145 r4 0.885802 0.865562 0.883156 PSO is one of the most recent stochastic methods developed for MPI (%)a 17.0 16.071 – solving optimization problems. The current study investigates the Slack (g1)b 55 55 55 combination of Gaussian distribution and chaotic sequence in PSO Slack (g2)b 15.467870 0.173677 0.975465 Slack (g3)b 24.801883 15.363463 24.801882 design, and the performance of the proposed PSO-GC is compared with PSO-CA, PSO-CO, and other results presented in literature for a MPI (%) ¼ [Rs(PSO-GC)ÀRs(other)]/[1ÀRs(other)]. two cases studies including discrete and continuous decision b Slack is the unused resources. variables in reliability engineering ﬁeld. 
Simulation results presented in Tables 3–8 reveal that the PSO-GC scheme improved both effectiveness and efficiency, achieving a good trade-off between exploitation (through the Gaussian distribution) and exploration (through the chaotic sequences). PSO-GC was demonstrated to be a promising and viable tool for solving reliability–redundancy optimization problems. Furthermore, a number of improvements and extensions related to benchmark problems in the reliability engineering field are currently being investigated by the author.

Table 8
Comparison of best result for the overspeed protection system for a gas turbine with other results presented in the literature

Parameter     Dhingra [36]  Yokota et al. [37]  Chen [38]   This paper (using PSO-GC)
f(r, n)       0.99961       0.999468            0.999942    0.999953
n1            6             3                   5           5
n2            6             6                   5           6
n3            3             3                   5           4
n4            5             5                   5           5
r1            0.81604       0.965593            0.903800    0.902231
r2            0.80309       0.760592            0.874992    0.856325
r3            0.98364       0.972646            0.919898    0.948145
r4            0.80373       0.804660            0.890609    0.883156
MPI (%)a      87.948        91.165              18.965      –
Slack (g1)b   65            92                  50          55
Slack (g2)b   0.064         −70.733576          0.002152    0.975465
Slack (g3)b   4.348         127.583189          28.803701   24.801882

a MPI (%) = [Rs(PSO-GC) − Rs(other)]/[1 − Rs(other)].
b Slack is the unused resources.

Acknowledgments

This work was supported by the National Council of Scientific and Technologic Development of Brazil (CNPq) under Grant 309646/2006-5/PQ.

References

[1] Wang ZH. Reliability engineering theory and practice. 5th ed. Taipei: Quality Control Society of Republic of China; 1992.
[2] Prasad VR, Kuo W. Reliability optimization of coherent systems. IEEE Trans Reliab 2000;49(3):323–30.
[3] Kuo W, Wan R. Recent advances in optimal reliability allocation. IEEE Trans Syst Man Cybern Part A Syst Hum 2007;37(2):143–56.
[4] Gen M, Yun YS. Soft computing approach for reliability optimization: state-of-the-art survey. Reliab Eng Syst Saf 2006;91(9):1008–26.

4.3. Results and discussion for the overspeed protection system for a gas turbine
The swarm size was set to 15 and the stopping criterion tmax to 150 iterations in all PSO algorithms for the overspeed protection system for a gas turbine; each run therefore used 2250 cost function evaluations.

Statistical analyses of all PSO schemes in 50 runs for the overspeed protection system are summarized in Table 6. In terms of the mean and best (gbest of all runs) f(r, n) results in Table 6, the solutions of PSO-GC are just slightly better than those found by the PSO-CA and PSO-CO approaches for the overspeed protection system.

[5] Martorell S, Carlos S, Villanueva JF, Sanchez AI, Galvan B, Salazar D, et al. Use of multiple objective evolutionary algorithms in optimizing surveillance requirements. Reliab Eng Syst Saf 2006;91(9):1027–38.
[6] Salazar D, Rocco CM. Solving advanced multi-objective robust designs by means of multiple objective evolutionary algorithms (MOEA): a reliability application. Reliab Eng Syst Saf 2007;92(6):697–706.
[7] Moghaddam RT, Safari J, Sassani F. Reliability optimization of series–parallel systems with a choice of redundancy strategies using a genetic algorithm. Reliab Eng Syst Saf 2008;93(4):550–6.
[8] Marseguerra M, Zio E, Martorell S. Basics of genetic algorithms optimization for RAMS applications. Reliab Eng Syst Saf 2006;91(9):977–91.
[9] Tian Z, Zuo MJ. Redundancy allocation for multi-state systems using physical programming and genetic algorithms. Reliab Eng Syst Saf 2006;91(9):1049–56.
[10] Ramirez-Marquez JE, Coit DW, Konak A. Redundancy allocation for series–parallel systems using a max–min approach. IIE Trans 2004;36(9):891–8.
[11] Martorell S, Sanchez A, Carlos S, Serradell V. Alternatives and challenges in optimizing industrial safety using genetic algorithms. Reliab Eng Syst Saf 2004;86(1):25–38.
[12] Giuggioli P, Marseguerra M, Zio E. Multiobjective optimization by genetic algorithms: application to safety systems. Reliab Eng Syst Saf 2001;72(1):59–74.
[13] Rocco CM, Moreno JA, Carrasquero N. A cellular evolution approach applied to reliability optimization of complex systems. In: Proceedings of the annual reliability and maintainability symposium, Los Angeles, USA, 2000. p. 210–5.
[14] Ouzineb M, Nourelfath M, Gendreau M. Tabu search for the redundancy allocation problem of homogenous series–parallel multi-state systems. Reliab Eng Syst Saf 2008;93(8):1257–72.
[15] Konak SK, Smith AE, Coit DW. Efficiently solving the redundancy allocation problem using tabu search. IIE Trans 2003;35(6):515–26.
[16] Liang YC, Smith AE. An ant colony optimization algorithm for the redundancy allocation problem. IEEE Trans Reliab 2004;53(3):417–23.
[17] Nahas N, Nourelfath M. Ant system for reliability optimization of a series system with multiple-choice and budget constraints. Reliab Eng Syst Saf 2005;87(1):1–12.
[18] Samrout M, Yalaoui F, Châtelet E, Chebbo N. New methods to minimize the preventive maintenance cost of series–parallel systems using ant colony optimization. Reliab Eng Syst Saf 2005;89(3):346–54.
[19] Nahas N, Nourelfath M, Kadi DA. Coupling ant colony and the degraded ceiling algorithm for the redundancy allocation problem of series–parallel systems. Reliab Eng Syst Saf 2007;92(2):211–22.
[20] Chen TC. IAs based approach for reliability redundancy allocation problems. Appl Math Comput 2006;182(2):1556–67.
[21] Mahapatra GS, Roy TK. Fuzzy multi-objective mathematical programming on reliability optimization model. Appl Math Comput 2006;174(1):643–59.
[22] Habib A, Alsieidi R, Youssef G. Reliability analysis of a consecutive r-out-of-n: F system based on neural networks. Chaos Solitons Fractals 2007 [accepted for publication].
[23] Eberhart RC, Kennedy JF. A new optimizer using particle swarm theory. In: Proceedings of the sixth international symposium on micro machine and human science, Nagoya, Japan, 1995. p. 39–43.
[24] Kennedy JF, Eberhart RC. Particle swarm optimization. In: Proceedings of the IEEE international conference on neural networks, vol. IV, Perth, Australia, 1995. p. 1942–8.
[25] Krohling RA. Gaussian swarm: a novel particle swarm optimization algorithm. In: Proceedings of the IEEE conference on cybernetics and intelligent systems (CIS), Singapore, 2004. p. 372–6.
[26] Secrest BR, Lamont GB. Visualizing particle swarm optimization–Gaussian particle swarm optimization. In: Proceedings of the IEEE swarm intelligence symposium, Indianapolis, IN, USA, 2003. p. 198–204.
[27] Higashi N, Iba H. Particle swarm optimization with Gaussian mutation. In: Proceedings of the IEEE swarm intelligence symposium, Indianapolis, IN, USA, 2003. p. 72–9.
[28] Coelho LS, Krohling RA. Predictive controller tuning using modified particle swarm optimisation based on Cauchy and Gaussian distributions. In: Soft computing: methodologies and applications, Springer Engineering Series in Advances in Soft Computing; 2005. p. 287–98.
[29] Li C, Liu Y, Zhou A, Kang L, Wang H. A fast particle swarm optimization algorithm with Cauchy mutation and natural selection strategy. In: Lecture Notes in Computer Science, Advances in Computation and Intelligence, vol. 4683. Berlin, Germany: Springer; 2007.
[30] Krohling RA, Coelho LS. PSO-E: particle swarm with exponential distribution. In: Proceedings of the IEEE congress on evolutionary computation (IEEE world congress on computational intelligence), Vancouver, Canada, 2006. p. 5577–82.
[31] Richer TJ, Blackwell TM. The Lévy particle swarm. In: Proceedings of the IEEE congress on evolutionary computation (IEEE world congress on computational intelligence), Vancouver, Canada, 2006. p. 808–15.
[32] Cai J, Ma X, Li L, Haipeng P. Chaotic particle swarm optimization for economic dispatch considering the generator constraints. Energy Convers Manage 2007;48(2):645–53.
[33] Coelho LS, Lee CS. Solving economic load dispatch problems in power systems using chaotic and Gaussian particle swarm optimization approaches. Electr Power Energy Syst 2008;30(5):297–307.
[34] Coelho LS, Mariani VC. A novel chaotic particle swarm optimization approach using Hénon map and implicit filtering local search for economic load dispatch. Chaos Solitons Fractals, doi:10.1016/j.chaos.2007.01.093 [accepted for publication].
[35] Coelho LS. Novel Gaussian quantum-behaved particle swarm optimiser applied to electromagnetics design. IET Sci Meas Technol 2007;1(5):290–4.
[36] Dhingra AK. Optimal apportionment of reliability & redundancy in series systems under multiple objectives. IEEE Trans Reliab 1992;41(4):576–82.
[37] Yokota T, Gen M, Li HH. Genetic algorithm for nonlinear mixed-integer programming problems and its application. Comput Ind Eng 1996;30(4):905–17.
[38] Chen TC. IAs based approach for reliability redundancy allocation problems. Appl Math Comput 2006;182(2):1556–67.
[39] Kulturel-Konak S, Smith AE, Coit DW. Efficiently solving the redundancy allocation problem using tabu search. IIE Trans 2003;35(6):515–26.
[40] Hikita M, Nakagawa H, Harihisa H. Reliability optimization of systems by a surrogate constraints algorithm. IEEE Trans Reliab 1992;41(3):473–80.
[41] Hsieh YC, Chen TC, Bricker DL. Genetic algorithm for reliability design problems. Microelectron Reliab 1998;38:1599–605.
[42] Shi Y, Eberhart RC. Empirical study of particle swarm optimization. In: Proceedings of the IEEE international conference on evolutionary computation, vol. 3, Washington, DC, USA, 1999. p. 101–6.
[43] Clerc M. The swarm and the queen: towards a deterministic and adaptive particle swarm optimization. In: Proceedings of the IEEE congress on evolutionary computation, Washington, DC, USA, 1999. p. 1951–7.
[44] Clerc M, Kennedy JF. The particle swarm: explosion, stability, and convergence in a multidimensional complex space. IEEE Trans Evol Comput 2002;6(1):58–73.
[45] Yao X, Liu Y, Lin G. Evolutionary programming made faster. IEEE Trans Evol Comput 1999;3(2):82–102.
[46] Chellapilla K. Combining mutation operators in evolutionary programming. IEEE Trans Evol Comput 1998;2(3):91–6.
[47] Peitgen HO, Jürgens H, Saupe D. Chaos and fractals: new frontiers of science. 2nd ed. New York, USA: Springer; 2004.
[48] Parker TS, Chua LO. Practical numerical algorithms for chaotic systems. Berlin, Germany: Springer; 1989.
[49] Strogatz SH. Nonlinear dynamics and chaos. Massachusetts: Perseus Publishing; 2000.
[50] Yang D, Li G, Cheng G. On the efficiency of chaos optimization algorithms for global optimization. Chaos Solitons Fractals 2007;34(4):1366–75.
[51] Caponetto R, Fortuna L, Fazzino S, Xibilia MG. Chaotic sequences to improve the performance of evolutionary algorithms. IEEE Trans Evol Comput 2003;7(3):289–304.
[52] Li B, Jiang W. Optimizing complex functions by chaos search. Cybernet Syst 1998;29(4):409–19.
[53] Zilong G, Sun'an W, Jian Z. A novel immune evolutionary algorithm incorporating chaos optimization. Pattern Recognition Lett 2006;27(1):2–8.
[54] Jiang C, Ma Y, Wang C. PID controller parameters optimization of hydro-turbine governing systems using deterministic–chaotic-mutation evolutionary programming (DCMEP). Energy Convers Manage 2006;47(9–10):1222–30.
[55] Hénon M. A two-dimensional mapping with a strange attractor. Commun Math Phys 1976;50(1):69–77.
[56] Michalewicz Z, Schoenauer M. Evolutionary algorithms for constrained parameter optimization problems. Evol Comput 1996;4(1):1–32.
[57] Koziel S, Michalewicz Z. Evolutionary algorithms, homomorphous mapping and constrained parameter optimization. Evol Comput 1999;7(1):19–44.
[58] Runarsson TP, Yao X. Evolutionary search and constraint violations. In: Proceedings of the congress on evolutionary computation, vol. 2, Canberra, Australia, 2003. p. 1414–9.
[59] Coello Coello CA. Theoretical and numerical constraint-handling techniques used with evolutionary algorithms: a survey of the state of the art. Comput Methods Appl Mech Eng 2002;191(11–12):1245–87.
