Application of Evolutionary Strategies to Optimal Design of Multibody Systems

SELIM DATOUSSAID ([email protected]), OLIVIER VERLINDEN ([email protected]) and CALOGERO CONTI ([email protected])
Service de Mécanique Rationnelle, Dynamique et Vibrations, Faculté Polytechnique de Mons, Bd Dolez 31, B-7000 Mons (Belgium)

Abstract. This paper is concerned with the application of evolutionary strategies to the optimization of the kinematic or dynamic behaviour of mechanical systems. Although slower than classical deterministic, gradient-based methods, they represent an interesting alternative: they are global, they should be more robust as they do not rely on continuity and differentiability conditions, and they can use the simulation software as is. They are inspired by natural evolution: the design variables, corresponding to the genes of an individual, mutate from generation to generation, and the individuals that survive are those best fitted to their environment, that is to say, those with the best objective function. The implementation of evolutionary strategies is presented, as well as some guidelines for choosing the most important parameters. Two examples are developed for the sake of illustration: the kinematic optimization of a suspension and the dynamic optimization of the comfort of a railway vehicle. Finally, the performances are compared with respect to deterministic methods and genetic algorithms.

Keywords: multibody systems, kinematics, dynamics, optimization, evolutionary strategy

© 2002 Kluwer Academic Publishers. Printed in the Netherlands.

1. Introduction

Multibody simulation codes are now commonly used for designing mechanical systems like, for example, manipulators or vehicles. Such tools allow engineers to virtually test the influence of several design parameters in order to reach optimal performance while respecting technical constraints. However, if the number of design variables is large, the problem can no longer be managed in an intuitive manner and will rather be treated by optimization tools. Most contributions to the optimization of multibody systems [1–8] are based on classical deterministic, gradient-based methods [22]. As they need the sensitivities to progress, such methods are complex to implement but are generally efficient. However, they rely on continuity and differentiability conditions, so that they could fail for some particular problems. Moreover, they are so-called local methods, as they converge to the closest local optimum. It is then interesting to explore alternative approaches like the ones based on stochastic principles, among which are genetic algorithms and evolutionary strategies. Structural engineers, who combined finite element codes and optimization engines very early, successfully applied genetic algorithms [9–14] and, later, evolutionary strategies [16]. Some authors have recently employed genetic algorithms for the kinematic and dynamic optimization of multibody systems [17, 19, 18]. This paper will focus on the application of evolutionary strategies, with the purpose of showing that they represent an interesting alternative for multibody systems as well. The paper first recalls the general form of an optimization problem, the peculiarities which arise when it is applied to multibody systems, and the main reasons why evolutionary strategies constitute an interesting alternative to the classical deterministic, gradient-based methods. The evolutionary strategies are presented in section 3, as well as some guidelines for choosing the principal parameters. The application is illustrated by two examples: the kinematic optimization of a suspension and the dynamic optimization of the comfort of a railway vehicle. The performances are finally compared with respect to deterministic methods and genetic algorithms.

2. Optimization of Multibody Systems

A constrained optimization problem consists in finding the set of nb design variables bopt leading to the minimal value of a cost function or objective function ψ0(b), expressed in terms of the design variables, while respecting nc inequality constraints on the design variables of the form ψi(b) ≤ 0. When applied to multibody systems, optimization encounters peculiar difficulties, as the objective function is often related to the time evolution of a physical value during a given motion. For example, one may be asked to minimize the maximum acceleration of the passengers during the manoeuvre of a transport vehicle. The objective function is then time-dependent and can be written in the following form

ψ0(b) = max_{t ∈ [0, τ]} f0(b, q, q̇, q̈, t)    (1)
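As an illustration, an objective of the form (1) can be evaluated from sampled simulation output once a run has completed. The following sketch is ours, not the paper's implementation; the function name and the toy trajectory are assumptions:

```python
import numpy as np

def evaluate_objective(f0, b, times, q, qdot, qddot):
    # psi_0(b) = max over t in [0, tau] of f0(b, q, qdot, qddot, t) -- eq. (1),
    # approximated over the sampled instants of a completed simulation run
    return max(f0(b, q[k], qdot[k], qddot[k], times[k]) for k in range(len(times)))

# toy illustration: maximum absolute "acceleration" over a simulated motion
times = np.linspace(0.0, 1.0, 101)
q = np.sin(2.0 * np.pi * times)        # sampled configuration parameter
qdot = np.gradient(q, times)           # first time derivative (numerical)
qddot = np.gradient(qdot, times)       # second time derivative (numerical)
psi0 = evaluate_objective(lambda b, qk, qdk, qddk, t: abs(qddk),
                          None, times, q, qdot, qddot)
```

Note that ψ0 is only as accurate as the time discretization of the underlying simulation; the maximum is taken over the sampled instants.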

with τ the simulation duration, f0 the physical value of interest, and q, q̇ and q̈ the vector of configuration parameters of the multibody system and its first and second time derivatives. The constraints themselves can also be time-dependent. It could be, for example, a distance condition between bodies so as to avoid interferences. Such a constraint can be expressed by

φ(b, q, q̇, q̈, t) ≤ 0,   ∀ t ∈ [0, τ]    (2)

In order to simplify its mathematical treatment, and particularly the determination of derivatives [1, 21], such a constraint is generally transformed into an equivalent time-independent form ψ, according to

ψ = ∫_0^τ ⟨φ(t)⟩ dt    (3)


where the equivalent operator ⟨φ(t)⟩ is defined by

⟨φ(t)⟩ = φ(t) if φ(t) > 0;  0 otherwise    (4)
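Numerically, the transformation (3)–(4) amounts to integrating the positive part of the constraint history: the result is zero exactly when the time-dependent constraint is never violated. A minimal sketch, assuming the constraint has been sampled on the simulation time grid (function name is ours):

```python
import numpy as np

def equivalent_constraint(phi_samples, times):
    # psi = integral from 0 to tau of <phi(t)> dt -- eq. (3), with the operator
    # <phi(t)> = phi(t) if phi(t) > 0, else 0 -- eq. (4)
    violation = np.maximum(phi_samples, 0.0)
    # trapezoidal quadrature over the sampled time grid
    return float(np.sum(0.5 * (violation[1:] + violation[:-1]) * np.diff(times)))

times = np.linspace(0.0, 2.0, 2001)
phi_ok = -1.0 + 0.0 * times            # constraint satisfied at all times
phi_bad = np.sin(np.pi * times)        # violated on (0, 1), satisfied on (1, 2)
psi_ok = equivalent_constraint(phi_ok, times)    # 0: no violation
psi_bad = equivalent_constraint(phi_bad, times)  # ~2/pi: accumulated violation
```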

The design variables can be subjected to simpler constraints, called boundary constraints, expressing for example that a physical characteristic, like a stiffness or damping coefficient, must remain between lower and upper boundaries bL and bU

bL − b ≤ 0,   b − bU ≤ 0    (5)
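For illustration, the boundary constraints (5) can be written in the same ψ ≤ 0 form as the other inequality constraints, one pair per design variable. A small sketch (function name and sample bounds are ours):

```python
def boundary_constraints(b, b_lower, b_upper):
    # eq. (5): the box bL <= b <= bU rewritten as inequality constraints
    # bL - b <= 0 and b - bU <= 0, one pair per design variable
    return [bl - bi for bi, bl in zip(b, b_lower)] + \
           [bi - bu for bi, bu in zip(b, b_upper)]

inside = boundary_constraints([0.15, 0.25], [0.12, 0.18], [0.20, 0.30])
outside = boundary_constraints([0.25, 0.25], [0.12, 0.18], [0.20, 0.30])
# all values of "inside" are <= 0; b1 = 0.25 > 0.20 makes one value of "outside" positive
```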

Deterministic, gradient-based methods have been used by several authors to optimize the kinematics or dynamics of multibody systems [1–8]. Most of the methods proposed by linear or nonlinear programming can be used, as far as they only need the sensitivities. The determination of the latter is however complicated by the presence of time-dependent cost or constraint functions. Alternative approaches therefore have to be considered.

Firstly, even if the exact determination of the sensitivities is possible by direct differentiation or by the technique of adjoint variables, it requires the development of new routines in the multibody simulation code. Finite differences do not seem to be accurate enough in multibody systems and suffer from a lack of comprehensive rules to choose the adequate increment [3, 6]. Extrapolation procedures based on order-n polynomials should be precise enough but require more computer resources [3]. Although less efficient, the so-called direct search methods (polytope, Hooke and Jeeves, ... [22]) could also be used directly. Their progression is indeed dictated by heuristics which only need the evaluation of the cost and constraint functions. It is of interest to note that the evolutionary strategies presented in this paper can be seen as a direct search method whose heuristics involve stochastic rules.

Secondly, gradient-based methods are local methods which converge only to the closest local optimum. On the contrary, evolutionary strategies, by working with a population varying according to stochastic rules, have the potential to explore zones where a deterministic method would never have gone. From that point of view, even if deterministic methods can be improved by using several starting points, they then lose their advantage of efficiency.

Finally, as no optimization method offers the guarantee to yield the global optimum, or even to converge at all, it is preferable to have several tools at one's disposal. Commercial optimization engines are indeed generally endowed with different methods. That is why stochastic methods, and particularly genetic algorithms and evolutionary strategies, have to be considered. Although they inspire some scepticism in the rational spirit of engineers, Schütz and Schwefel [15] have demonstrated that they can be successfully applied to complex engineering problems. Evolutionary strategies are the subject of this paper. Developments about genetic algorithms can be found in other contributions [19, 18].

3. Evolutionary Strategies

The evolutionary strategies are iterative optimization methods which try to find an optimal solution through small stochastic variations of the design variables. Schwefel's publications constitute a basic reference [24], with some more recent developments described by Fogel [25]. Evolutionary strategies are based on the principles of natural selection: the individuals of a species mutate from generation to generation by small variations of their genes, and only those best fitted to their environment will survive and be selected for further reproduction. Although other phenomena occur in natural selection, mutation and selection were recognized by Darwin as the most important ones. The analogy with the optimization problem is quite straightforward: a set of design variables can be considered as the genes of an individual, and the value of the objective function for this set of design variables represents the fitness for survival of the corresponding individual. As we are concerned with minimization of the objective function, the lower the objective function, the higher the fitness for survival. Of course, if the constraints are violated, the individual is supposed to have very low or no fitness for survival and will rarely or never be selected. The basic implementation of the evolutionary strategy is composed of the following steps:

1. An initial population of µ parent individuals is selected at random and uniformly in the feasible range of each design variable. Ideally, each initial parent should satisfy the constraints.

2. A population of λ offspring is created by mutation of the parents. For each offspring, a parent is randomly selected, and each of its design variables bi is mutated by adding a Gaussian random variable with zero mean and preselected standard deviation σi

bi,offspring = bi,parent + N(0, σi),   i = 1, ..., nb    (6)

The standard deviation σi may be chosen for the whole population or may be linked to the individual, in a similar manner as the design variables. A Gaussian distribution ensures that, as in nature, small changes occur more frequently than large ones.

3. The new parents are selected as the µ individuals with the best fitness, that is to say with the lowest cost function. The new parents may be chosen either from the set of parents and offspring (plus strategy, µ + λ) or only from the offspring (comma strategy, (µ, λ)).

4. The process of mutation-selection continues until a satisfactory solution is reached. The convergence criterion will be explained later.

Let us note that the plus strategy with one parent and one offspring (1+1) is referred to as the two-membered evolutionary strategy. In order to help the reader in the use of evolutionary strategies, we describe hereafter some practical rules for the choice of the most important parameters.

3.1. SIZE OF THE POPULATION

There is so far no accurate rule to determine the size of the parent population. A good indication is to have as many, or a few times as many, members as the number of design variables. On the other hand, some theoretical studies have been carried out on (1,λ) strategies [24] to estimate the optimal ratio λ/µ. It has been shown that this optimal ratio depends on the objective function and increases with its complexity. A ratio λ/µ equal to 5 can be considered as a good starting point.

3.2. STEP LENGTH CONTROL

The standard deviation σi plays an important role as it controls the speed of convergence. As it also corresponds to the mean variation of the corresponding parameter, it is often called the step length. As in all optimization procedures, the control of the step length is the most important part of the algorithm after the recursion formula. The theoretical study of the two-membered evolutionary strategy [24] leads to the so-called success rule of Rechenberg, stated as:

After every nb (number of design variables) iterations, check how many successes have occurred over the preceding 10·nb mutations. If this number is less than 2·nb, multiply the step lengths by the factor 0.85; divide them by 0.85 if more than 2·nb successes occurred.

For the initial value of the step lengths, Schwefel proposes the following estimate

σ0,i = ∆bi / √nb    (7)

where ∆bi is the expected distance from the optimum for the corresponding design variable. The accuracy of this initial value is not critical, as the success rule seems to rapidly adapt the step length to a suitable value, at least when it is too large.
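For illustration, the mutation-selection loop of steps 1–4, together with the initial step lengths of (7), can be sketched in a few lines. This is our own minimal sketch, not the implementation used in the paper: the step-length control is a simplified per-generation variant inspired by Rechenberg's rule (the original counts successes over 10·nb mutations), and box bounds are enforced by clipping, one choice among several.

```python
import numpy as np

rng = np.random.default_rng(0)

def evolution_strategy(psi0, b_lower, b_upper, mu=10, lam=50,
                       max_gen=200, eps_r=1e-9, plus=False):
    # Minimal (mu, lambda) / (mu + lambda) evolutionary strategy for minimization.
    b_lower = np.asarray(b_lower, float)
    b_upper = np.asarray(b_upper, float)
    nb = b_lower.size
    sigma = (b_upper - b_lower) / np.sqrt(nb)        # initial step lengths, eq. (7)
    # step 1: initial parents, uniform in the feasible box
    parents = rng.uniform(b_lower, b_upper, size=(mu, nb))
    parent_fit = np.array([psi0(b) for b in parents])
    for _ in range(max_gen):
        # step 2: lambda offspring by Gaussian mutation of random parents, eq. (6)
        idx = rng.integers(0, mu, size=lam)
        offspring = parents[idx] + rng.normal(0.0, sigma, size=(lam, nb))
        offspring = np.clip(offspring, b_lower, b_upper)   # keep the bounds of eq. (5)
        off_fit = np.array([psi0(b) for b in offspring])
        # simplified per-generation step-length control inspired by Rechenberg's
        # success rule: shrink the steps when too few mutations improve on their parent
        success = np.mean(off_fit < parent_fit[idx])
        sigma = sigma * 0.85 if success < 0.2 else sigma / 0.85
        # step 3: select the mu fittest as new parents (comma or plus strategy)
        if plus:
            pool = np.vstack([parents, offspring])
            pool_fit = np.concatenate([parent_fit, off_fit])
        else:
            pool, pool_fit = offspring, off_fit
        order = np.argsort(pool_fit)[:mu]
        parents, parent_fit = pool[order], pool_fit[order]
        # step 4: stop when the parent generation is homogeneous (relative criterion)
        if parent_fit[-1] - parent_fit[0] <= eps_r * abs(parent_fit.mean()):
            break
    return parents[0], parent_fit[0]

# usage on a simple quadratic with minimum at b = (1, 2)
b_best, f_best = evolution_strategy(lambda b: (b[0] - 1.0)**2 + (b[1] - 2.0)**2,
                                    [-5.0, -5.0], [5.0, 5.0])
```

With the seeded generator the run is deterministic; on this smooth quadratic the returned design should land close to (1, 2).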


The success rule itself does not allow the standard deviations relative to each design variable to be changed independently, and consequently does not adapt to the contours of the objective function. One possibility is to consider the standard deviation of each design variable as a supplementary gene, also likely to undergo mutations, for example by multiplying it during the transition from parent to offspring by a random number with a mean value of 1. We have preferred here another alternative, based on the recombination of the parents after mutation. Let us note that in this context, each initial parent has its own initial step lengths, chosen randomly in a given range.

3.3. COMMA OR PLUS STRATEGY

Another option of the evolutionary strategies concerns the choice between the comma and plus strategies. The plus strategy has the advantage of assuring no worsening of the objective function. On the other hand, the comma strategy is recognized to have better adaptation properties with regard to step length. As the worsening probability becomes very low when the ratio λ/µ is large, Schwefel [24] recommends using the comma strategy when this ratio is larger than 5.

3.4. RECOMBINATIONS

The basic evolutionary strategies can be enriched by adding a supplementary step of recombination between the parents before mutation. Pairs of parents are randomly selected and recombined to yield a new set of µ parents for mutation. According to the chosen type of recombination, the design variables or the standard deviations are determined as

- the arithmetic or geometric mean of the corresponding parameters of the two selected parents (arithmetic or geometric recombination);
- the corresponding parameter of one or the other parent, selected at random with equal probability (discrete recombination).
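The recombination types above could be sketched as follows (the function name and argument conventions are ours):

```python
import numpy as np

rng = np.random.default_rng(1)

def recombine(p1, p2, kind="discrete"):
    # recombination of two parent vectors (design variables or step lengths)
    p1 = np.asarray(p1, float)
    p2 = np.asarray(p2, float)
    if kind == "arithmetic":
        return 0.5 * (p1 + p2)              # component-wise arithmetic mean
    if kind == "geometric":
        return np.sqrt(p1 * p2)             # component-wise geometric mean (p1, p2 > 0)
    if kind == "discrete":
        mask = rng.random(p1.size) < 0.5    # each component from either parent, p = 1/2
        return np.where(mask, p1, p2)
    raise ValueError(f"unknown recombination type: {kind}")

child = recombine([1.0, 3.0, 5.0], [3.0, 1.0, 5.0], kind="arithmetic")  # [2., 2., 5.]
```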

Recombinations introduce the principle of sexual propagation, which is expected to be very favourable for evolution, as only a few primitive organisms do without it. As mentioned before, it also offers the possibility to vary the step lengths of each design variable independently.

3.5. CONVERGENCE CRITERIA

In the two-membered strategy, the convergence criterion is based on the evolution of the best value of the objective function along the generations. The optimum is assumed to be reached once the best value has not significantly changed over the last generations.


With the multimembered strategy, the criterion becomes even simpler. The optimum is assumed to be reached as soon as the best individuals of a generation no longer differ much with respect to their objective function values. If we denote by ψ0,min and ψ0,max the minimal and maximal values of the objective function inside a given parent generation, the iterative process will be stopped if

ψ0,max − ψ0,min ≤ ε    (8)

A relative error criterion can also be used:

ψ0,max − ψ0,min ≤ εr ψ̄0    (9)

where ψ̄0 is the mean value of the objective function in the considered generation. To avoid endless processes, it is also safe to impose a maximum number of iterations after which the optimization is automatically stopped, and possibly adapted for further investigations.

3.6. CONSTRAINTS

Ideally, the initial parent population should satisfy the constraints. However, it can be difficult to find a uniformly distributed initial population which respects all the constraints. This condition can then be dropped, the constraints being taken into account through a penalty on the objective function as soon as a constraint is violated. In this way, the selection generally makes the parent population feasible within a few generations. Although simple, this approach wastes computation time in the first generations and introduces the risk of having a population descending from only a small number of feasible initial parents. The problem of finding a feasible initial population is therefore often treated as a particular optimization problem whose cost function ψ0 corresponds to the sum of the violated constraints

ψ0(b) = ∑_{j=1}^{nc} ψj(b) · δ(ψj(b))    (10)

with

δ(x) = 1 if x > 0;  0 otherwise

If an infeasible starting point arises during the construction of the initial population, this auxiliary optimization is performed from that point until all the constraints are satisfied. This initialization step is easy to implement, as it uses only available tools, and yields a well-distributed population.
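The auxiliary cost (10) and the penalty treatment described above can be sketched as follows; the function names and the penalty constant are our assumptions, not values from the paper:

```python
def infeasibility(b, constraints):
    # auxiliary cost of eq. (10): sum of the violated constraint values, where
    # each constraint function psi_j satisfies psi_j(b) <= 0 when feasible
    # (the "if psi(b) > 0" filter plays the role of the delta operator)
    return sum(psi(b) for psi in constraints if psi(b) > 0)

def penalized_objective(b, psi0, constraints, penalty=1.0e6):
    # penalty formulation used when infeasible individuals are tolerated
    # in the initial population
    return psi0(b) + penalty * infeasibility(b, constraints)

# example with a single constraint b[0] - 1 <= 0
cons = [lambda b: b[0] - 1.0]
```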


4. Kinematic Optimization - Suspension

The evolutionary strategies will first be illustrated on the example of the short-long arm front suspension represented in figure 1, taken from the Iltis vehicle described in [27].

Figure 1. Short-long arm suspension (nodes 5, 6, 7 and design variables b1-b6 indicated)

It is composed of the lower and upper control arms and the steering arm connected to the wheel carrier and the body frame. We will consider here the classical problem of minimizing the variation of the toe angle during the vertical motion of the suspension, the steering wheel (node 3) being assumed to be fixed in the central position. Obviously, the wheel should not steer during suspension movements when following a straight road. During curving, a different behaviour may be required to control the oversteer or understeer behaviour of the vehicle [17]. The 6 design variables correspond to coordinates of particular nodes of the suspension. They are detailed in table I with their variation range, the nodes being shown in figure 1. There are no time-dependent constraints.

Table I. Suspension: design variables and their variation range
lateral coordinate of node 6:       0.1207 ≤ b1 ≤ 0.2007 m
vertical coordinate of node 6:      0.1830 ≤ b2 ≤ 0.3030 m
lateral coordinate of node 5:       0.1167 ≤ b3 ≤ 0.1967 m
vertical coordinate of node 5:      0.1167 ≤ b4 ≤ 0.1967 m
lateral coordinate of node 7:       0.0900 ≤ b5 ≤ 0.2100 m
longitudinal coordinate of node 7:  0.2965 ≤ b6 ≤ 0.4165 m

An evolutionary strategy of type (10,20) with recombination has been used, the maximum number of iterations being fixed to 200. The initial step lengths were chosen randomly between 0 and 1 for all parameters. The recombination was discrete for the design variables and arithmetic for the step lengths. These parameters are summarized in table II.

Table II. Suspension: parameters of the evolutionary algorithm
number of parents (µ): 10
number of offspring (λ): 20
number of iterations: 200
type of strategy: (µ, λ)
σmin: 0
σmax: 1
recombination for b: discrete
recombination for σ: arithmetic

According to the distribution of the cost function among the population, the optimum was considered to be reached after 200 generations. The design variables of the best individual are given in table III. The optimization process required 4010 evaluations of the objective function and lasted around 45 minutes on a Pentium 500 MHz.

Table III. Suspension: optimal set of design variables
b1 = 0.193 m    b3 = 0.182 m    b5 = 0.204 m
b2 = 0.302 m    b4 = 0.160 m    b6 = 0.405 m

The process has clearly decreased the extremum of the toe angle, whose evolution is shown in figure 2 before and after optimization. On the other hand, the shape of the curve remains the same, as the chosen design variables do not really allow it to be modified.

Figure 2. Evolution of the toe angle (rad) versus vertical displacement (m), before (initial) and after (optimum) optimization

5. Dynamic Optimization - Railway Vehicle

The evolutionary strategy has been applied to the example of the railway vehicle depicted in figure 3, called the cityrunner and characterized by an intermediate suspended carbody. This particular design allows the vehicle to run on networks with very narrow curves, as in old European cities. As the central carbody, also called the bridge car, is not directly connected to the rail, its motion, and consequently the comfort of the passengers, is driven by the global dynamics of the vehicle, itself dependent upon global characteristics like mass distribution and suspension parameters. The problem is to find a design which satisfies or even optimizes the comfort and stability of the vehicle. The mass distribution being constrained by other considerations, the remaining parameters are the suspension ones, in particular the yaw and lateral stiffness and damping coefficients of the bogie-carbody suspensions and the damping coefficients of the articulations between the cars. It is easy to imagine that the influence of these parameters is highly coupled, so that it is very difficult to manage the process intuitively.

Figure 3. Railway vehicle cityrunner

Table IV. Railway vehicle: design variables and their variation range
yaw secondary stiffness, front bogie:      5 ≤ b1 ≤ 500 kNm/rad
yaw secondary stiffness, rear bogie:       5 ≤ b2 ≤ 500 kNm/rad
yaw secondary damper, front bogie:         5 ≤ b3 ≤ 50 kNms/rad
yaw secondary damper, rear bogie:          5 ≤ b4 ≤ 50 kNms/rad
yaw damper, front car articulation:        5 ≤ b5 ≤ 50 kNms/rad
yaw damper, rear car articulation:         5 ≤ b6 ≤ 50 kNms/rad
lateral secondary stiffness, front bogie:  100 ≤ b7 ≤ 1000 kN/m
lateral secondary stiffness, rear bogie:   100 ≤ b8 ≤ 1000 kN/m
lateral secondary damper, front bogie:     5 ≤ b9 ≤ 30 kNs/m
lateral secondary damper, rear bogie:      5 ≤ b10 ≤ 30 kNs/m


The objective function corresponds to the sum of the lateral accelerations undergone by the centres of gravity of the carbodies while crossing a 90-degree curve of radius 14 m at a speed of 3.6 m/s. The design variables correspond to the suspension parameters. They are detailed in table IV, together with their admissible range. The process is subjected to 2 time-dependent constraints limiting the relative angle of the cars. It is of interest to note that this example constitutes an actual industrial problem which involves more design variables and constraints than previous contributions based on classical deterministic methods. Moreover, the centrifugal forces during curving make the equations significantly nonlinear. The parameters of the chosen evolutionary strategy are similar to those of the previous example and are summarized in table V.

Table V. Railway vehicle: parameters of the evolutionary algorithm
number of parents (µ): 10
number of offspring (λ): 30
number of iterations: 100
type of strategy: (µ, λ)
σmin: 0
σmax: 1000
recombination for b: discrete
recombination for σ: arithmetic

The optimum was reached at the 98th iteration and led to the values given in table VI. The optimization process required 3010 evaluations of the objective function and lasted around 2.5 hours on a Pentium 500 MHz. The optimum objective function is 2.71 m/s².

Table VI. Railway vehicle: optimal solution
b1 = 8.648 kNm/rad     b6 = 29.661 kNms/rad
b2 = 5.431 kNm/rad     b7 = 32.022 kN/m
b3 = 29.070 kNms/rad   b8 = 3.364 kN/m
b4 = 6.350 kNms/rad    b9 = 18.539 kNs/m
b5 = 7.408 kNms/rad    b10 = 29.809 kNs/m

The evolution of the acceleration of the centre of gravity of the bridge car is presented in figure 4 for the initial and optimal cases and shows how the optimization process has smoothed the acceleration peaks at the beginning and end of the curve. Moreover, the new design perfectly damps the lateral oscillations of the vehicle.

Figure 4. Evolution of the lateral acceleration of the bridge car (m/s²) versus track distance from curve entry (m), for the initial and optimum designs

Figure 5 shows the evolution of the objective function of the parent population with respect to the number of generations. At the beginning, the high values are due to the penalty used to take the constraints into account. After less than 10 generations, all the individuals satisfy the constraints and the objective function slowly evolves towards the optimum. The population is quite homogeneous, as there are few variations of the objective function inside a given population.

Figure 5. Evolution of the objective function (m/s²) of the parent population versus generation

Figures 6 and 7 show two typical evolutions of design variables, corresponding respectively to the yaw stiffness of the front bogie and the yaw damping of the rear bogie. In figure 6, the population starts from values distributed between 5000 and 30000, steadily decreases until generation 50, and then stabilizes around a value of 10000, likely forced by one of the constraints. The variation range also rapidly becomes limited. In figure 7, another behaviour arises: while oscillating around the value of 15000, the population progressively switches, between generations 40 and 60, to a mean value between 5000 and 10000. This likely corresponds to a jump to another local optimum, globally better according to the evolution of the other design variables. The variation is also larger, showing that the yaw damping of the rear bogie has less influence on the behaviour of the system.

Figure 6. Evolution of the yaw stiffness of the front bogie (Nm/rad) versus generation

Figure 7. Evolution of the yaw damping of the rear bogie (Nms/rad) versus generation


6. Relative Efficiency of Optimization Methods

Deterministic methods, genetic algorithms and evolutionary strategies have been systematically tested in [19]. We will give here the results obtained for the examples presented in this paper and for two problems classically treated in the optimization of multibody systems: the shock absorber and the 5 degrees of freedom vehicle [2, 4]. Both examples, illustrated in figure 8, are completely described in [2]. The main characteristics of the problems are recalled in table VII.

Table VII. Optimization problems
example            design variables   time-dependent constraints   boundary constraints
shock absorber     2                  1                            4
5 d.o.f. vehicle   6                  2                            12
suspension         6                  0                            12
cityrunner         10                 2                            20

Figure 8. Shock absorber and 5 d.o.f. vehicle (suspension stiffnesses K, K1-K3, Kp, dampers C, C1-C3, Cp, coordinates q1-q7 and speed V indicated)

The efficiency of the optimization processes, in terms of optimal value and CPU time (Pentium 500 MHz), is given in table VIII. The considered deterministic method corresponds to the steepest descent algorithm, the sensitivities being determined by the technique of adjoint variables. The latter has been implemented only for dynamic cases, so that no results are available for the example of the suspension. The case of the cityrunner vehicle is different: the sensitivities could not be determined, due to instabilities in the backward integration related to their determination. The first author has shown [19] that this phenomenon may occur with highly damped vehicles. All methods were able to yield a reasonable optimum. In each case, the optimality zone can be considered to be reached. In some cases, the stochastic methods outperform the deterministic methods, which are unable to switch to another local optimum.


Table VIII. Performances of different optimization methods

                    deterministic          genetic                evolutionary
example             ψ0(bopt)   time (s)    ψ0(bopt)   time (s)    ψ0(bopt)   time (s)
shock absorber      0.5254     12          0.5281     149         0.5206     840
5 d.o.f. vehicle    257.4      90          261        755         254.2      308
suspension          -          -           0.065      2403        0.069      2678
cityrunner          -          -           2.80       9604        2.71       9053

As expected, the deterministic methods are faster, as long as convergence can be assured. This is particularly true for the simple example of the shock absorber, but the gap decreases for the more complicated 5 d.o.f. vehicle. It can be expected that the efficiencies would become even closer if order-0 deterministic methods, needing only evaluations of the cost and constraint functions, were used. It is also of interest to note that the computation times remain reasonable in any case.

7. Conclusion

This paper presented the application of evolutionary strategies to the kinematic and dynamic optimization of multibody systems. The method has been described, and two examples have shown its capabilities. Evolutionary strategies have finally been compared, in terms of efficiency, to a gradient-based method and to genetic algorithms. The initial scepticism of the authors with regard to these strange stochastic methods rapidly turned into real enthusiasm: they do not depend on any continuity or differentiability conditions, they are able to switch to other local optima, and they can use the simulation software as is. Genetic algorithms, which are another kind of stochastic method, were also tested by the authors and led to comparable results. Both methods have been successfully used in other disciplines, like the optimization of structures, with complex cost and constraint functions and large numbers of design variables. Although the stochastic methods are slower than their classical gradient-based counterparts, the computational burden remains reasonable, and the constantly increasing power of computers makes them a credible alternative.

References

1. Haug, E.J. and Arora, J.S., Applied Optimal Design, John Wiley & Sons, Chichester, UK, 1979.
2. Haug, E.J., Design Sensitivity Analysis of Dynamic Systems, NATO Series, Vol. F27, 1987.
3. Bestle, D. and Eberhard, P., 'Analysing and optimizing multibody systems', Mechanics of Structures and Machines, Vol. 20, No. 1, 1992, 67-92.
4. Kim, M.-S. and Choi, D.-H., 'Multibody Dynamic Response Optimization with ALM and Approximate Line Search', Multibody System Dynamics, Vol. 1, No. 1, 1997, 47-64.
5. Schulz, M., Mucke, R. and Walser, H.-P., 'Optimization of Mechanisms with Collisions and Unilateral Constraints', Multibody System Dynamics, Vol. 1, No. 2, 1997, 223-240.
6. Dias, J.M.P. and Pereira, M.S., 'Sensitivity Analysis of Rigid-Flexible Multibody Systems', Multibody System Dynamics, Vol. 1, No. 3, 1997, 303-322.
7. Pedersen, N.L., 'Optimization of the Rigid Body Mechanism in a Wobble Plate Compresor', Multibody System Dynamics, Vol. 1, No. 4, 1997, 433-448.
8. Hansen, M.R., 'An Efficient Method for Synthesis of Planar Multibody Systems Including Shape of Bodies as Design Variables', Multibody System Dynamics, Vol. 2, No. 2, 1998, 115-143.
9. Ghasemi, M.R. and Hinton, E., 'A genetic search based arrangement of load combinations in structural analysis', Advances in Computational Structures Technology, Ed. Topping, B.H.V., Civil-Comp Press, 1996, 85-91.
10. Ghasemi, M.R. and Hinton, E., 'Truss optimization using genetic algorithms', Advances in Structural Optimization, Ed. Topping, B.H.V., Civil-Comp Press, 1996, 59-75.
11. Jenkins, W.M., 'Structural optimization with the genetic algorithm', The Structural Engineer, Vol. 64, 1991, 418-422.
12. Miyamura, A., Kohama, Y. and Takada, T., 'Optimal allocation of shear wall at 3D frame by genetic algorithms', Advances in Structural Optimization for Structural Engineering, Ed. Topping, B.H.V., Civil-Comp Press, 1996, 73-79.
13. Rajeev, S. and Krishnamoorthy, C.S., 'Discrete optimization of structures using genetic algorithms', Journal of Structural Engineering, Proceedings of the ASCE, Vol. 118, 1992, 1233-1250.
14. Salajegheh, K., 'Optimum design of plate and shell structures with discrete design variables', Advances in Structural Optimization, Ed. Topping, B.H.V. and Papadrakakis, M., Civil-Comp Press, 1994, 187-193.
15. Schütz, M. and Schwefel, H.-P., 'Evolutionary approaches to solve three challenging engineering tasks', Computer Methods in Applied Mechanics and Engineering, Vol. 186, Issues 2-4, 2000, 141-170.
16. Papadrakakis, M., Lagaros, N.-D., Tsompanakis, Y. and Plevris, V., 'Large scale structural optimization: Computational methods and optimization algorithms', Archives of Computational Methods in Engineering, Vol. 8, No. 3, 2001, 239-301.
17. Franchi, C.G., Migone, F. and Toso, A., 'Genetic algorithms in multi-link front suspension optimization', 11th ADAMS European Users Conference, Mechanical Dynamics, 1996.
18. Datoussaïd, S., 'Optimal design of multibody systems by using genetic algorithms', Vehicle System Dynamics, Supplement 28, 1998, 704-710.
19. Datoussaïd, S., Optimisation du comportement dynamique et cinématique de systèmes multicorps à structure cinématique complexe, PhD thesis, Faculté Polytechnique de Mons, Belgium, 1999.
20. Datoussaïd, S. and Verlinden, O., 'Application of Evolutionary Strategies to Optimal Design of Multibody Systems', Advances in Computational Multibody Dynamics, Lisbon, 1999, 645-659.
21. Schittkowski, K., 'Numerical optimization - Theory, methods and applications', Numerical Analysis in Automotive Engineering, VDI-Berichte, Vol. 816, 1990, 191-202.
22. Bertsekas, D.P., Nonlinear Programming, Athena Scientific, 1995.


23. Holland, J.H., Adaptation in Natural and Artificial Systems, The University of Michigan Press, 1976.
24. Schwefel, H.-P., Numerical Optimization of Computer Models, John Wiley, Chichester, UK, 1981.
25. Fogel, D.B., 'An introduction to simulated evolutionary optimization', IEEE Transactions on Neural Networks, Vol. 5, No. 1, 1994, 3-14.
26. Goldberg, D.E., Genetic Algorithms in Search, Optimization, and Machine Learning, Addison-Wesley, 1989.
27. Kortum, W., Sharp, R.S. and de Pater, A.D., Application of Multibody Computer Codes to Vehicle System Dynamics, CCG (Society for Engineering and Scientific Education), 1991.
