A hierarchical particle swarm optimizer with Latin sampling based memetic algorithm for numerical optimization
|Article code||Publication year||English article pages||Persian translation|
|10571||2013||14-page PDF||available to order|
Publisher: Elsevier - Science Direct
Journal: Applied Soft Computing, Volume 13, Issue 5, May 2013, Pages 2823–2836
Memetic algorithms, one type of algorithm inspired by nature, have been successfully applied to numerous optimization problems in diverse fields. In this paper, we propose a new memetic computing model using a hierarchical particle swarm optimizer (HPSO) and the Latin hypercube sampling (LHS) method. In the bottom layer of the hierarchical PSO, several swarms evolve in parallel to avoid being trapped in local optima. The learning strategy for each swarm is the well-known comprehensive learning method with a newly designed mutation operator. After the evolution process is completed in the bottom layer, one particle from each swarm is selected as a candidate to construct the swarm in the top layer, which evolves by the same strategy employed in the bottom layer. A local search strategy based on LHS is applied to particles in the top layer every specified number of generations. The new memetic computing model is extensively evaluated on a suite of 16 numerical optimization functions as well as the cylindricity error evaluation problem. Experimental results show that the proposed algorithm compares favorably with conventional PSO and several of its variants.
Optimization has been a research hotspot for several decades. Many real-world optimization problems in engineering are becoming increasingly complicated, so high-performance optimization algorithms are needed. Unconstrained optimization problems can be formulated as D-dimensional optimization problems over a continuous search space. Evolutionary algorithms, inspired by natural evolution, have been widely used as effective tools to solve optimization problems. One class of nature-inspired algorithms is swarm intelligence algorithms. The particle swarm optimizer (PSO) has attracted attention in both the academic and industrial communities. Although PSO shares many similarities with evolutionary algorithms, the original PSO does not use traditional evolution operators such as crossover and mutation. PSO draws on the swarm behavior of flocking birds, which search for food in a collaborative way. Each member of the swarm, called a particle, represents a potential solution to the target problem, and it adapts its search pattern by learning from its own experience and that of other members. The particle is a point in the search space, and it aims at finding the global optimum, which is regarded as the location of food. Each particle has two attributes, position and velocity, and its direction of flight is adjusted according to the experiences of the swarm. The swarm as a whole searches for the global optimum in the D-dimensional feasible space. The PSO algorithm is easy to understand and implement, and it has been shown to perform well on many optimization problems. However, it may easily get trapped in a local optimum for many reasons, such as a lack of diversity among particles and overlearning from the best particle found so far.
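The position-and-velocity update described above can be sketched as follows. This is a minimal global-best PSO with the widely used inertia/acceleration coefficients, not the paper's hierarchical variant; all function and parameter names here are illustrative:

```python
import numpy as np

def pso_minimize(f, dim, bounds, n_particles=30, iters=200,
                 w=0.729, c1=1.494, c2=1.494, seed=0):
    """Minimal global-best PSO sketch: each particle's velocity is pulled
    toward its personal best and the swarm's global best."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))   # positions
    v = np.zeros((n_particles, dim))              # velocities
    pbest = x.copy()                              # personal best positions
    pbest_f = np.apply_along_axis(f, 1, x)        # personal best values
    g = pbest[np.argmin(pbest_f)].copy()          # global best position
    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # inertia + cognitive (own experience) + social (swarm experience)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)                # keep particles feasible
        fx = np.apply_along_axis(f, 1, x)
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, float(pbest_f.min())
```

On a simple unimodal function such as the sphere, this converges quickly; the local-optimum trapping discussed above only shows up on multimodal landscapes.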
To improve PSO's performance on complex numerical optimization problems, we propose a hierarchical PSO framework in which several swarms evolve in parallel toward the global optimum, and we design a new mutation operator to increase the diversity of the swarms. After evolving for a specified number of generations, a Latin hypercube sampling method is used to execute the local search. This paper is organized as follows. Section 2 introduces the original PSO and some of its variants. Section 3 describes the proposed hierarchical PSO with Latin sampling based memetic algorithm in four subsections: the hierarchical PSO framework, the mutation strategy, the Latin hypercube sampling based local search strategy, and the overall framework of the proposed memetic algorithm. Section 4 gives the experimental results, describes the related parameter tuning process, and compares the performance of the proposed algorithm on a suite of test problems to that of other PSO variants. Section 5 gives conclusions and describes future work.
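A Latin hypercube sampling local search of the kind described here can be sketched as follows. The helper below is hypothetical (the paper does not specify this exact form): each dimension of a box around the incumbent best position is divided into equal strata, one sample is drawn per stratum, and the strata are shuffled independently per dimension so the samples spread uniformly:

```python
import numpy as np

def lhs_local_search(f, center, radius, n_samples=20, rng=None):
    """Latin hypercube sampling around `center`: sample the box
    [center - radius, center + radius] with one point per stratum
    in every dimension, and keep the best point found (or the
    incumbent if no sample improves on it)."""
    rng = np.random.default_rng() if rng is None else rng
    dim = len(center)
    # one uniform draw inside each of the n_samples strata, per dimension
    u = (rng.random((n_samples, dim)) + np.arange(n_samples)[:, None]) / n_samples
    for d in range(dim):
        rng.shuffle(u[:, d])          # decouple strata across dimensions
    pts = center - radius + 2.0 * radius * u
    cand = min(pts, key=f)
    return cand if f(cand) < f(center) else center
```

Because every stratum of every coordinate is sampled exactly once, LHS covers the neighborhood of the best solution more evenly than plain uniform sampling with the same budget.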
English conclusion
This paper presents a high-performance memetic algorithm (MA-HPSOL) to deal with complex numerical optimization problems. The proposed framework has three main components: a hierarchical particle swarm optimizer for exploration, a local search method based on Latin hypercube sampling for exploitation, and a mutation operator using differential information. Concretely, the hierarchical PSO is composed of two layers: the bottom layer and the top layer. Particles in each swarm of the bottom layer evolve independently, which means each swarm is a niche with no influence on other swarms. The global best position of each bottom-layer swarm becomes a candidate particle in the top layer, so the global best position in the top-layer swarm steers the particles in each bottom-layer swarm indirectly. The local search strategy, Latin hypercube sampling, aims at exploiting the best solutions found so far in a uniform manner. Together, these exploration and exploitation operators help keep the diversity of the whole population at a higher level, preventing particles from being trapped in local optima. Even if particles in one swarm are trapped in a local optimum, other swarms are still likely to reach the global optimum. Furthermore, a mutation operator that modifies the particles' positions based on differential information is used. According to the experimental results on 16 functions, the proposed memetic algorithm (MA-HPSOL) performs excellently at finding globally optimal solutions. MA-HPSOL is also used to evaluate the cylindricity error, and the experimental results show that it obtains competitive performance there as well. In future work, two aspects will be investigated in depth: quantitatively characterizing the diversity of the whole population, and introducing mutual communication among the swarms in the bottom layer.
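The mutation operator "using differential information" mentioned above can be sketched as a DE/rand/1-style perturbation; this is an assumption about its general form rather than the paper's exact operator, and the names are illustrative:

```python
import numpy as np

def differential_mutation(positions, i, F=0.5, rng=None):
    """Build a mutant for particle i from the scaled difference of two
    other randomly chosen particles added to a third (DE/rand/1 style);
    a generic sketch of differential-information mutation."""
    rng = np.random.default_rng() if rng is None else rng
    others = [j for j in range(len(positions)) if j != i]
    r1, r2, r3 = rng.choice(others, size=3, replace=False)
    # the difference vector injects population-derived diversity
    return positions[r1] + F * (positions[r2] - positions[r3])
```

Because the perturbation scale comes from distances between current particles, it shrinks automatically as the population converges, which fits the diversity-maintenance goal described in the conclusion.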