Potential offspring production strategies: An improved genetic algorithm for global numerical optimization
|Article code||Publication year||English article pages|
|10438||2009||13-page PDF|
Publisher: Elsevier - Science Direct
Journal: Expert Systems with Applications, Volume 36, Issue 8, October 2009, Pages 11088–11098
In this paper, a sharing evolution genetic algorithm (SEGA) is proposed to solve various global numerical optimization problems. The SEGA employs a proposed population manager to preserve superior chromosomes and eliminate inferior ones. The population manager also incorporates additional potential chromosomes to assist solution exploration, controlled by the current solution-searching status. The SEGA further applies the proposed sharing concepts for cross-over and mutation to prevent the population from falling into local minima, allowing the GA to find or approach the global optimum more easily. All three parts of SEGA (the population manager, sharing cross-over, and sharing mutation) can effectively increase new-born offspring's solution-searching ability. Experiments were conducted on the CEC-05 benchmark problems, which include unimodal, multi-modal, expanded, and hybrid composition functions. The results show that SEGA performs better on these benchmark problems than recent variants of the genetic algorithm.
As real-world optimization problems become increasingly complex, the demand for more capable optimization algorithms grows as well. Unconstrained optimization problems can be formulated as the N-dimensional minimization problem $\min_{\mathbf{x} \in \mathbb{R}^N} f(\mathbf{x})$, where $\mathbf{x} = (x_1, x_2, \ldots, x_N)$ and N is the number of parameters to be optimized. The genetic algorithm (GA) moves from one population of chromosomes to a new population by employing a principle similar to Darwin's "natural selection" together with the genetics-inspired operators of selection, cross-over, mutation, and inversion. The basic principles of GA were first introduced by Holland (Holland, 1975 and Booker et al., 1978). Holland's GA was the first evolutionary computation (EC) paradigm developed and applied. Fitter chromosomes are more likely to be selected for reproduction. Cross-over randomly chooses loci and exchanges the subparts of two chromosomes to create offspring. Mutation randomly flips the allele values at some locations in the chromosome, and inversion reverses the order of a contiguous section of the chromosome (Mitchell, 1996). The term chromosome typically refers to a candidate solution to a problem. Through the genetic evolution process, genetic algorithms can search for solutions efficiently without derivative information; an optimal solution is represented by the final winning chromosome in the genetic competition. Many researchers have devoted themselves to developing new operations that enhance the optimization capacity of the original GA. Several cross-over strategies, such as two-point, multi-point, and uniform (Syswerda, 1989), have been proposed to improve the efficiency of binary-coded GA. Extrapolation, interpolation (Michalewicz, Logan, & Swaminathan, 1994), and multi-cross-over (Chang, 2003) were proposed to enhance the performance of real-coded GA.
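To make the selection, cross-over, and mutation cycle above concrete, here is a minimal real-coded GA sketch in Python for the N-dimensional minimization problem. The binary tournament selection, arithmetic (interpolation-style) cross-over, and Gaussian mutation used here are generic textbook operators, not the paper's SEGA operators, and the parameter defaults are arbitrary choices for illustration.

```python
import random

def genetic_algorithm(f, n_dims, pop_size=20, generations=200,
                      bounds=(-5.0, 5.0), cx_rate=0.9, mut_rate=0.1):
    """Minimize f over an N-dimensional box with a simple real-coded GA."""
    lo, hi = bounds
    # Initialize a random population of candidate solutions ("chromosomes").
    pop = [[random.uniform(lo, hi) for _ in range(n_dims)]
           for _ in range(pop_size)]
    best = min(pop, key=f)
    for _ in range(generations):
        # Selection: binary tournament -- the fitter chromosome is more
        # likely to be chosen as a parent.
        def select():
            a, b = random.sample(pop, 2)
            return a if f(a) < f(b) else b
        offspring = []
        while len(offspring) < pop_size:
            p1, p2 = select(), select()
            # Cross-over: arithmetic (interpolation-style) recombination.
            if random.random() < cx_rate:
                w = random.random()
                child = [w * x + (1 - w) * y for x, y in zip(p1, p2)]
            else:
                child = p1[:]
            # Mutation: perturb random genes to keep exploring the space.
            for i in range(n_dims):
                if random.random() < mut_rate:
                    child[i] = min(hi, max(lo, child[i] + random.gauss(0, 0.1)))
            offspring.append(child)
        pop = offspring
        # Track the best chromosome seen so far (the "winning" candidate).
        gen_best = min(pop, key=f)
        if f(gen_best) < f(best):
            best = gen_best
    return best
```

Note that no derivative of `f` is ever evaluated, matching the derivative-free search property described above.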
Mutation is one of the most significant and promising areas of investigation in evolutionary computation, since it is able to prevent the GA from falling into local minima. Hong and Wang proposed a dynamic mutation scheme to tune the rates of different types of mutation (Hong et al., 1996). Zhang et al. proposed a mutation GA with adaptive probability, using k-means clustering and a fuzzy-based system to update that probability (Zhang, Chung, & Hu, 2004). Leung and Wang proposed OGA/Q (Leung & Wang, 2001), which utilizes the characteristics of the orthogonal matrix and quantizes the search space into several subspaces for generating offspring. A hybrid Taguchi genetic algorithm (HTGA) (Tsai, Liu, & Chou, 2004) was proposed by Tsai et al.; the HTGA can efficiently generate better offspring and performed better than OGA/Q. Recently, Liu et al. proposed a fuzzy-based method to generate offspring (Liu, Xu, & Abraham, 2005). Unlike other computational intelligence methods, such as fuzzy theory and artificial neural networks, GA has the ability to avoid purely local search and can increase the probability of finding the global best. It has been successfully applied to machine learning (Booker et al., 1978 and Goldberg, 1989), numerical optimization (Jong, 1980, Muhlenbein et al., 1991 and Tu and Lu, 2004), signal processing (Guo and Mu, 2002, Li and Mao, 2004, Yue et al., 2002 and Zheng et al., 2004), etc. Although numerous variants of GA have been empirically shown to perform well on many optimization problems over the last three decades, premature convergence when solving complex problems is still prominent in GA. In GA, solution exploration depends on new-born chromosomes, which are generated by cross-over and mutation operations and must carry valid information for finding better solutions.
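As a toy illustration of the adaptive-mutation-probability idea surveyed above (not the actual rule from Zhang et al., which relies on k-means and a fuzzy system), a mutation rate can be raised when the search stagnates, to help escape local minima, and lowered when the best fitness is still improving:

```python
def adaptive_mutation_rate(rate, improved, low=0.01, high=0.5, step=1.5):
    """Illustrative heuristic: shrink the mutation rate while the best
    fitness is improving (exploit), grow it when the search has stalled
    (explore), clamped to [low, high]. All constants are assumptions."""
    if improved:
        rate = max(low, rate / step)   # exploit: smaller perturbations
    else:
        rate = min(high, rate * step)  # explore: escape local minima
    return rate
```

A GA loop would call this once per generation, passing whether the best-so-far fitness improved during that generation.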
When the solution search comes to a standstill, superior (potential) chromosomes are required to break free from local minima, explore unsearched areas, and speed up the search. Thus, the quality of the generated offspring directly affects GA's solution exploration ability. If most of the new-born offspring carry poor information and are eliminated by selection, the search may slow down or even halt at a standstill. Enabling parents to generate better offspring is therefore an important means of speeding up the solution-searching process. In order to drive the population efficiently and improve GA's performance on complex multi-modal problems, the sharing evolution genetic algorithm (SEGA) is proposed. The SEGA consists of a population manager and a sharing strategy, which enhance the solution-searching abilities and increase the offspring's survival rate in genetic algorithms. The paper is organized as follows. Section 2 presents an overview of the genetic algorithm. Section 3 describes the sharing evolution genetic algorithm. Section 4 presents the test functions, experimental settings, and results. Section 5 contains the conclusions.
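A rough sketch of what a population manager could look like, based only on the behavior described above (preserve superior chromosomes, eliminate inferior ones, and inject additional potential chromosomes when the search stalls). The stagnation threshold, survivor fraction, and injection count are assumptions for illustration, not the paper's exact rules.

```python
import random

def manage_population(pop, fitness, stagnant_gens, min_size=10, max_size=60,
                      bounds=(-5.0, 5.0), n_new=5):
    """Illustrative population manager: keep the fitter half of the
    population, and when the best fitness has stalled for several
    generations, add fresh random chromosomes to widen exploration."""
    ranked = sorted(pop, key=fitness)
    # Preserve superior chromosomes, eliminate the inferior ones.
    survivors = ranked[:max(min_size, len(ranked) // 2)]
    if stagnant_gens >= 3 and len(survivors) + n_new <= max_size:
        lo, hi = bounds
        n_dims = len(survivors[0])
        # Inject additional potential chromosomes to break the standstill.
        survivors += [[random.uniform(lo, hi) for _ in range(n_dims)]
                      for _ in range(n_new)]
    return survivors
```

The effect is a population whose size grows during stagnation and shrinks back through selection once progress resumes, loosely mirroring the adjust-by-search-state behavior attributed to SEGA's population manager.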
Conclusion
In this paper, the SEGA has been presented to solve global numerical optimization problems. The proposed population manager strategy adjusts the population size according to the current solution-searching state to ensure that better (potential) chromosomes join the evolution of GA. The proposed SEGA can operate with fewer elitist (potential) chromosomes in a population while still finding the optimal solution. Also, the sharing cross-over and sharing mutation significantly increase the generation of potential offspring and improve the chromosomes' searching abilities, aiding the search for the globally optimal solution. They also make GA more robust, prevent chromosomes from falling into local minima, and drive chromosomes more efficiently. All these factors efficiently increase the survival rate of new-born offspring by generating high-quality offspring. Eighteen 30-dimensional test functions selected from the CEC 2005 benchmarks were used in the experiments. The experiments show that the proposed SEGA gets closer to the optimal solutions and is more efficient than HTGA and OGA/Q on the problems studied.