
Abstract
We all optimize our daily lives, steadily and almost magically, often without conscious reasoning. Optimization has become a fundamental part of our existence, cutting across diverse domains. However, optimizing is not as easy as it appears, because of the difficulty of determining the best solution among all feasible solutions. This has led to a great deal of research being conducted in the area of optimization over time. Differential Evolution, a population-based meta-heuristic, is the focus of this study. This study covers the motivation for DE, the DE algorithm, the advantages and disadvantages of DE, the applications of DE and the future of the DE algorithm.

Section One
Introduction
Humanity is intrinsically tied to the desire to excel, consciously or not, making optimization a basic part of our lives. It is noteworthy that optimization takes different forms in different fields of endeavour. In the field of computer science, optimization is seen as the desire to increase the efficiency of a system, such as the design of an efficient erasure code that corrects numerous transmission errors while adding only a little redundancy.

Price et al. (2005) therefore defined optimization as the attempt to maximize a system’s desirable properties while simultaneously minimizing its undesirable characteristics. Optimization, as defined by the Merriam-Webster dictionary, is “an act, process, or methodology of making something (such as a design, system, or decision) as fully perfect, functional, or effective as possible; specifically: the mathematical procedures (such as finding the maximum of a function) involved in this.” Du & Swamy (2016) also defined optimization as the process of searching for the optimal solution.

The difficulty of determining the best solution among all feasible solutions is the reason so much research has been conducted on optimization over the years. As a result, different optimization algorithms were introduced and evolved over the course of time, such as Brute Force Search, Random Walk, Simulated Annealing, Hooke and Jeeves, Nelder and Mead, Controlled Random Search, Evolutionary Strategies, Genetic Algorithms and Differential Evolution. In this work, the focus is on Differential Evolution.

Differential Evolution (DE) is a population-based optimization algorithm (Jeyakumar & Velayutham 2011), whereby a big and naturally complex process of evolution can be captured by a small and simple mathematical model (Feoktistov 2006). It is a dependable and versatile optimizer that is also easy to use (Price et al. 2005). The DE algorithm has proven itself in several contests, notably the First International Contest on Evolutionary Optimization (1st ICEO), held in Nagoya, Japan in May 1996 in conjunction with the 1996 IEEE International Conference on Evolutionary Computation, and in several other contests testing the efficiency of this optimization algorithm (Das & Suganthan 2011).

DE has proven to be an efficient optimization technique, with room for further improvement across a wider domain. This work gives an insight into the DE algorithm, the future of DE, and the advantages, disadvantages and applications of DE.

Section Two
Literature Review
Optimization is the process of increasing a system’s efficiency by maximizing its desirable properties while reducing its undesirable ones. For example, reducing the distortion in a radio station’s signal is an optimization problem. Research has shown that most real-world optimization problems are both fundamentally and practically difficult; hence, there is a need to continually improve optimization algorithms in order to achieve optimal solutions (Arunachalam 2008).

It is essential that a practical optimization technique converges fast, is easy to use by requiring only a minimum number of control parameters, and is able to find the global minimum regardless of the initial system parameters (Storn & Price 1995). As a result of these requirements, Storn & Price (1995) developed an optimization method called Differential Evolution. The fundamental idea behind DE is a new methodology for generating trial vectors: the scaled difference of two randomly selected vectors is added to a third vector, as illustrated in the figure below.

Fig. 2: Illustrating the fundamental idea behind DE (Price et al. 2005).
Before the emergence of DE, several optimization algorithms were already in use, such as Simulated Annealing, Nelder and Mead, Random Walk, Hooke and Jeeves, Controlled Random Search and Brute Force Search; a few of these are reviewed in this section.

2.1 Brief History Of DE
Rainer Storn and Kenneth Price proposed an optimization method called DE in 1995 (Storn & Price 1995, Qing 2010). It grew out of the Chebyshev polynomial fitting problem that Storn asked Price to solve using Price’s Genetic Annealing algorithm (Feoktistov 2006). The problem was eventually solved with Genetic Annealing, but the result was discouraging, as the method did not fulfil the three basic requirements for optimization techniques: fast convergence, ease of use and convergence to the global optimum. The demerits of Genetic Annealing led to the discovery of differential mutation after the algorithm was recast several times. After a while, the annealing mechanism was removed, once it was found that the combination of differential mutation with discrete recombination and pairwise selection does not require an annealing factor; this marked the beginning of the era of DE (Feoktistov 2006).

2.2 Different Optimization Techniques
Many optimization techniques exist in the literature; a few of them are examined in this section:
2.2.1

Section Three
Differential Evolution Algorithm
DE is an evolutionary optimization algorithm that is classified primarily as a numerical optimizer: regardless of a variable’s type, DE treats it as a floating-point value. There are different variants of DE, which differ in how new solutions are generated (Arunachalam 2008). DE/x/y/z is the general convention used to represent the different variants, where:
DE – Differential Evolution
x – string denoting the vector to be perturbed
y – number of difference vectors considered for perturbation of x
z – type of crossover being used (exp: exponential, bin: binomial)
These variants can be adopted based on the problem to be solved (Onwubolu & Davendra 2004). According to Onwubolu & Davendra (2004), Price et al. (2005), Mezura-Montes et al. (2006) and Jeyakumar & Velayutham (2011), the following variants, which follow the standard DE nomenclature, appear in the literature (a sketch of how a few of their mutation strategies differ follows this list):
1. DE/rand/1/bin
2. DE/rand/1/exp
3. DE/best/1/bin
4. DE/best/1/exp
5. DE/rand/2/bin
6. DE/rand/2/exp
7. DE/best/2/bin
8. DE/best/2/exp
9. DE/current-to-rand/1/bin
10. DE/current-to-rand/1/exp
11. DE/current-to-best/1/bin
12. DE/current-to-best/1/exp
13. DE/rand-to-best/1/bin
14. DE/rand-to-best/1/exp
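To make the nomenclature concrete, the sketch below (Python with NumPy; the function name `mutate`, the population array `pop` and the strategy strings are illustrative assumptions, not code from the cited sources) shows how the x and y parts of the name change the mutation step; the z part (bin/exp) only affects the crossover.

```python
import numpy as np

def mutate(pop, i, best_idx, F, strategy, rng=np.random.default_rng()):
    """Return a mutant vector for target index i under a given DE strategy."""
    # Draw distinct random indices, none equal to the target index i
    # (requires a population of at least 6 vectors).
    r0, r1, r2, r3, r4 = rng.choice(
        [r for r in range(len(pop)) if r != i], 5, replace=False)
    if strategy == "rand/1":              # one scaled difference added to a random vector
        return pop[r0] + F * (pop[r1] - pop[r2])
    if strategy == "best/1":              # perturb the best-so-far vector instead
        return pop[best_idx] + F * (pop[r1] - pop[r2])
    if strategy == "rand/2":              # two difference vectors for more diversity
        return pop[r0] + F * (pop[r1] - pop[r2]) + F * (pop[r3] - pop[r4])
    if strategy == "current-to-best/1":   # pull the target toward the best vector
        return pop[i] + F * (pop[best_idx] - pop[i]) + F * (pop[r1] - pop[r2])
    raise ValueError(f"unknown strategy: {strategy}")
```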

The DE algorithm basically involves four steps:
1. Initialization
2. Mutation
3. Recombination
4. Selection

After the new population is installed, the process of mutation, recombination and selection is repeated until:
I. the optimum is located, or
II. a pre-specified termination criterion is satisfied, e.g., the number of generations reaches a predetermined maximum, gmax.

Fig. 3.1. Working of DE Algorithm (Arunachalam 2008).
1. Initialization
DE begins with an initial population bounded by specified lower and upper bounds, denoted $b_L$ and $b_U$. It consists of a population of Np D-dimensional real-valued parameter vectors. Within the prescribed initialization bounds, a value is assigned to each parameter of each vector by a random number generator. Equation 3.1 gives the initial value of the jth parameter of the ith vector, when the generation index g is zero:

$x_{j,i,0} = \text{rand}_j(0,1) \cdot (b_{j,U} - b_{j,L}) + b_{j,L}$   (3.1)

Fig. 3.2. Initialization (Price et al. 2005).
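As a minimal sketch of Equation 3.1 (Python with NumPy; the helper name `initialize` and the example bounds are illustrative assumptions):

```python
import numpy as np

def initialize(Np, D, b_L, b_U, rng=np.random.default_rng()):
    """Create Np random D-dimensional parameter vectors inside the bounds (Eq. 3.1)."""
    b_L, b_U = np.asarray(b_L, float), np.asarray(b_U, float)
    # rand_j(0,1) * (b_U - b_L) + b_L, drawn independently for every
    # parameter of every vector.
    return rng.random((Np, D)) * (b_U - b_L) + b_L

pop = initialize(Np=10, D=2, b_L=[-5.0, -5.0], b_U=[5.0, 5.0])
```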
2. Mutation
After initialization, three distinct parameter vectors are selected at random, and the scaled difference of two of them is added to the third, creating the mutant vector $v_{i,g}$:

$v_{i,g} = x_{r0,g} + F \cdot (x_{r1,g} - x_{r2,g})$   (3.2)

Fig. 3.3. Mutation (Price et al. 2005).
The mutation factor F is a constant chosen from the interval [0, 2]; it controls the rate at which the population evolves.
3. Recombination
At this stage a new vector $u_{i,g}$, the “trial vector”, is generated by mixing the donor (mutant) vector’s parameters with those of the target vector. Each parameter is inherited from the mutant with crossover probability Cr, and one randomly chosen index $j_{\text{rand}}$ is always taken from the mutant so that the trial vector differs from the target:

$u_{j,i,g} = \begin{cases} v_{j,i,g} & \text{if } \text{rand}_j(0,1) \le Cr \text{ or } j = j_{\text{rand}} \\ x_{j,i,g} & \text{otherwise} \end{cases}$   (3.3)
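A minimal sketch of this binomial (“bin”) crossover (Python with NumPy; the helper name `crossover` is an illustrative assumption):

```python
import numpy as np

def crossover(target, mutant, Cr, rng=np.random.default_rng()):
    """Binomial recombination of a target and a mutant vector (Eq. 3.3)."""
    # Each parameter is taken from the mutant with probability Cr ...
    mask = rng.random(target.size) <= Cr
    # ... and index j_rand is always taken from the mutant, so the
    # trial vector is guaranteed to differ from the target.
    mask[rng.integers(target.size)] = True
    return np.where(mask, mutant, target)
```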
4. Selection
Here, the trial vector and the target vector are compared. The target vector $x_{i,g}$ is replaced in the next generation by the trial vector $u_{i,g}$ if the trial vector has an equal or lower objective function value:

$x_{i,g+1} = \begin{cases} u_{i,g} & \text{if } f(u_{i,g}) \le f(x_{i,g}) \\ x_{i,g} & \text{otherwise} \end{cases}$   (3.4)

Fig. 3.4. Selection (Price et al. 2005).
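Selection is thus a simple one-to-one comparison; a sketch (the objective f and the helper name `select` are illustrative assumptions):

```python
def select(target, trial, f):
    """Keep the trial vector only if it is at least as good as the target (Eq. 3.4)."""
    # Using <= rather than < lets equal-cost trials replace targets,
    # which helps the population move across flat regions of the objective.
    return trial if f(trial) <= f(target) else target
```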

DE Algorithm
1. Generate P = (x1, x2, …, xNp)
2. Repeat:
   a. for i = 1 to Np do
      i. compute a mutant vector vi
      ii. create ui by the crossover of vi and xi
      iii. if f(ui) ≤ f(xi) then
              insert ui into Q
           else
              insert xi into Q
           end if
      end for
   b. P ← Q
   until the stopping criterion is satisfied
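Putting the four steps together, here is a compact runnable sketch of the classic DE/rand/1/bin variant in Python with NumPy (the function name, default parameter values and the sphere test function are illustrative assumptions; bound handling of mutant vectors is omitted for brevity):

```python
import numpy as np

def de_rand_1_bin(f, b_L, b_U, Np=20, F=0.8, Cr=0.9, g_max=200, seed=0):
    """Minimize f over the box [b_L, b_U] with DE/rand/1/bin."""
    rng = np.random.default_rng(seed)
    b_L, b_U = np.asarray(b_L, float), np.asarray(b_U, float)
    D = b_L.size
    pop = rng.random((Np, D)) * (b_U - b_L) + b_L        # initialization (Eq. 3.1)
    cost = np.array([f(x) for x in pop])
    for g in range(g_max):
        Q = np.empty_like(pop)                           # next generation
        for i in range(Np):
            r0, r1, r2 = rng.choice(
                [r for r in range(Np) if r != i], 3, replace=False)
            v = pop[r0] + F * (pop[r1] - pop[r2])        # mutation (Eq. 3.2)
            mask = rng.random(D) <= Cr                   # binomial crossover (Eq. 3.3)
            mask[rng.integers(D)] = True
            u = np.where(mask, v, pop[i])
            fu = f(u)
            if fu <= cost[i]:                            # selection (Eq. 3.4)
                Q[i], cost[i] = u, fu
            else:
                Q[i] = pop[i]
        pop = Q                                          # P <- Q
    best = int(np.argmin(cost))
    return pop[best], cost[best]

# Example: minimize the 5-D sphere function f(x) = sum(x_j^2).
x_best, f_best = de_rand_1_bin(lambda x: float(np.sum(x**2)),
                               b_L=[-5.0] * 5, b_U=[5.0] * 5)
```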

Section Four
Advantages, Disadvantages and Applications of DE
4.1 Advantages of DE
Advantages of DE, as noted in Price et al. (2005), Feoktistov (2006) and Tonge & Kulkarni (2012), include:
1. Few parameter settings required – Np, F and Cr (see the sketch after this list)
2. Stochastic (random) search of the solution space
3. High performance
4. Accessibility for practical applications
5. Ease of use
6. Simplicity of the underlying concept
7. Speed in obtaining solutions
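This small parameter set (Np, F, Cr) also surfaces in off-the-shelf implementations; for example, SciPy’s scipy.optimize.differential_evolution exposes them (roughly) as popsize, mutation and recombination. A usage sketch, with an illustrative objective and bounds:

```python
from scipy.optimize import differential_evolution

# Minimize the 2-D sphere function. popsize relates to Np,
# mutation to F, and recombination to Cr.
result = differential_evolution(
    lambda x: x[0] ** 2 + x[1] ** 2,
    bounds=[(-5.0, 5.0), (-5.0, 5.0)],
    strategy="rand1bin",   # DE/rand/1/bin
    popsize=15,            # population size multiplier
    mutation=0.8,          # scale factor F (a (min, max) tuple enables dither)
    recombination=0.9,     # crossover probability Cr
    seed=0,
)
print(result.x, result.fun)
```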

4.2 Disadvantages of DE
DE, however, has its drawbacks (Tonge & Kulkarni 2012), such as:
1. It can easily get trapped in a local (regional) optimum
2. It has unstable convergence.

4.3 Applications of DE

Section Five
The Future of DE Algorithm

Section Six
Conclusion

Section Seven
References
Price K.V., Storn R.M. & Lampinen J.A. (2005) Differential evolution – a practical approach to global optimization (Rozenberg G., Back T.H., Eiben A.E., Kok J.N. & Spaink H.P., eds), Natural Computing Series, Springer, Berlin, pp. 18 – 538.

Storn R.M. & Price K.V. (1995) Differential evolution – a simple and efficient adaptive scheme for global optimization over continuous spaces. Technical Report TR-95-012, International Computer Science Institute, Berkeley, 12pp.

Du K. & Swamy M.N.S. (2016) Search and optimization by meta-heuristics: techniques and algorithms inspired by nature. Springer International Publishing, Switzerland, pp. 1 – 23, 93 – 104.

Onwubolu G. & Davendra D. (2004) Scheduling flow shops using differential evolution algorithm. European Journal of Operational Research 171 (2006), 674 – 692.

Jeyakumar G. & Velayutham C.S. (2011) Convergence analysis of differential evolution variants on unconstrained global optimization functions. International Journal of Artificial Intelligence and Applications 2(2), 12pp.

Mezura-Montes E., Velazquez-Reyes J. & Coello C.A. (2006) A comparative study of differential evolution variants for global optimization. In: Proceedings of the Genetic and Evolutionary Computation Conference (GECCO 2006), Seattle, Washington, U.S.A., 8pp.

Arunachalam V. (2008) Optimization using differential evolution. The University of Western Ontario, London, Ontario, Canada, 42pp.

Feoktistov V. (2006) Differential evolution – in search of solutions (Pardalos P.M. & Du D., eds), Springer, United States of America.

Das S. & Suganthan P.N. (2011) Differential evolution – a survey of the state-of-the-art. IEEE Transactions on Evolutionary Computation 15(1), 4 – 31.

Tonge V.G. & Kulkarni P.S. (2012) Performance improvement of differential evolution – a survey. International Journal of Current Engineering and Technology 2(4), 5pp.
