Abstract:
By embedding an improved Hooke-Jeeves operator into the genetic algorithm, this paper obtains a hybrid method that combines the advantages of both algorithms: like the Hooke-Jeeves algorithm it does not require derivatives of the objective function, and it strengthens the genetic algorithm's ability to perform a fine local search, giving faster convergence and a higher probability of finding the global solution. Numerical results show that the hybrid method is distinctly superior to both of the algorithms above.
Keywords: computational intelligence, non-differentiable function, genetic algorithm, Hooke-Jeeves algorithm
1. Introduction
There are many methods for solving optimization problems with nonlinear, non-differentiable objective functions; traditional approaches include the Hooke-Jeeves algorithm, the Rosenbrock algorithm, and so on. These methods, however, only search for local extrema and cannot find the global extremum.
In recent years the genetic algorithm, which imitates biological evolution, has attracted researchers from many domains and has been applied extensively in fields such as function optimization, pattern recognition, image processing, artificial intelligence, and so on. The genetic algorithm needs only the value of the objective function, so for optimization problems with non-differentiable or even discontinuous functions it has a high probability of reaching the global optimum; its computing time is comparatively small, and it possesses good stability, adaptability, parallelism, and other desirable characteristics. In reference [2] the author proposes a method for continuous, differentiable functions in which a Newton operator is embedded into the genetic algorithm to solve the optimization problem; the method works well but cannot be applied to non-differentiable functions.
This paper proposes a hybrid algorithm in which a Hooke-Jeeves operator is embedded into the genetic algorithm in order to improve the convergence speed and the probability of finding the solution for non-differentiable functions. The Hooke-Jeeves algorithm, also called the step-acceleration or pattern-search method, consists of an exploratory move and a pattern move. The former reveals how the objective function changes and follows the favorable direction (approximately the gradient direction) to seek a better point, so the method can be regarded approximately as a steepest-descent procedure, and as a direct optimization method for non-differentiable functions it is very effective. The hybrid algorithm combines the advantages of both the genetic algorithm and the Hooke-Jeeves algorithm: it retains a high probability of finding the global optimum while also being able to carry out a fine local search.
2. Problem description
The optimization of a complicated nonlinear, non-differentiable objective function can be written as

min f(x)
s.t.  a_i ≤ x_i ≤ b_i,  i = 1, 2, ..., n        (1)

where a_i and b_i are the lower and upper bounds of the variable x_i, n is the dimension of the variable x, and f(x) is the nonlinear, non-differentiable objective function. In most practical situations the objective behaves as a black box: given a value of the independent variable we can obtain the corresponding function value, but we cannot tell whether the function is continuous or differentiable.
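As a concrete illustration, the following minimal Python sketch shows how problem (1) can be represented when the objective is available only as a black box: a callable f together with bound vectors a and b. The particular objective used here is a hypothetical non-differentiable example, not the test function of Section 4.

```python
import numpy as np

# Hypothetical black-box objective: only function values are available,
# no derivative information is assumed.
def f(x):
    return np.sum(np.abs(x)) + np.sum(np.round(5.0 * x) ** 2)

def in_bounds(x, a, b):
    """Box constraint of problem (1): a_i <= x_i <= b_i for every component."""
    return bool(np.all(x >= a) and np.all(x <= b))

a = np.full(4, -1.0)   # lower bounds a_i
b = np.full(4,  1.0)   # upper bounds b_i
x = np.array([0.2, -0.4, 0.1, 0.0])
print(in_bounds(x, a, b), f(x))
```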
2.1 Hooke-Jeeves algorithm
The Hooke-Jeeves algorithm is a direct search method; it is simple and concise and does not require derivatives of the objective function.
Algorithm:
Step 1 (initialization): choose an initial point x(1) ∈ E^n, the n coordinate directions e_1, e_2, ..., e_n, an initial step size δ, an acceleration factor a ≥ 1, a reduction rate β ∈ (0,1), and a tolerance ε > 0; set y(1) = x(1), k = 1, j = 1.
Step 2: if f(y(j) + δe_j) < f(y(j)), set y(j+1) = y(j) + δe_j and go to Step 4; otherwise go to Step 3.
Step 3: if f(y(j) - δe_j) < f(y(j)), set y(j+1) = y(j) - δe_j and go to Step 4; otherwise set y(j+1) = y(j) and go to Step 4.
Step 4: if j < n, set j = j + 1 and return to Step 2; otherwise go to Step 5.
Step 5: if f(y(n+1)) < f(x(k)), go to Step 6; otherwise go to Step 7.
Step 6: set x(k+1) = y(n+1), y(1) = x(k+1) + a(x(k+1) - x(k)), k = k + 1, j = 1, and return to Step 2.
Step 7: if δ ≤ ε, stop the iteration and take x(k) as the solution; otherwise set δ = βδ, y(1) = x(k), x(k+1) = x(k), k = k + 1, j = 1, and return to Step 2.
The Hooke-Jeeves algorithm is a local direct search method that can be regarded approximately as a steepest-descent procedure. Since it does not need the derivatives of the objective function, it is well suited to the optimization of non-differentiable functions.
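The steps above translate directly into code. The following Python sketch is one possible implementation of the exploratory and pattern moves; it assumes the coordinate directions are the unit vectors and uses default parameters (δ = 0.005, a = 1, β = 0.5, ε = 10^-4, at most 500 iterations) matching the settings of Section 4. It is an illustrative sketch, not the authors' original code.

```python
import numpy as np

def explore(f, y, delta):
    """Exploratory move: probe +delta and -delta along each coordinate direction."""
    y = np.array(y, dtype=float)
    for j in range(len(y)):
        for step in (delta, -delta):
            trial = y.copy()
            trial[j] += step
            if f(trial) < f(y):   # accept the first improving probe (Steps 2-3)
                y = trial
                break
    return y

def hooke_jeeves(f, x0, delta=0.005, a=1.0, beta=0.5, eps=1e-4, max_iter=500):
    """Direct-search Hooke-Jeeves method; uses only function values."""
    x = np.array(x0, dtype=float)   # current base point x(k)
    y = x.copy()                    # starting point of the exploratory move
    for _ in range(max_iter):
        y_new = explore(f, y, delta)            # Steps 2-4
        if f(y_new) < f(x):                     # Steps 5-6: successful iteration
            x, x_old = y_new, x
            y = x + a * (x - x_old)             # pattern (acceleration) move
        else:                                   # Step 7: shrink the step size
            if delta <= eps:
                break
            delta *= beta
            y = x.copy()
    return x

# Illustrative call on a simple non-differentiable function:
print(hooke_jeeves(lambda x: np.sum(np.abs(x)), [0.3, -0.2, 0.7, 0.1]))
```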
2.2 Genetic algorithms
The genetic algorithm is a simulation of evolution in which the rule of survival of the fittest is applied to a population of individuals. It encodes the decision variables of the objective function as gene strings and uses a fitness function as the optimization target; through recombination, crossover, and mutation of the genes, the fittest individuals of a population tend to reproduce and survive to the next generation, thus improving successive generations, although inferior individuals can, by chance, also survive and reproduce. In this way better and better gene combinations are obtained generation by generation until the final optimum is reached. Numerical experience shows that the method can escape local extrema and drive the population toward the global optimum; this property stems from the internal gene diversity of the genetic algorithm, which makes the search proceed in all directions.
The basic genetic algorithm is as follows:
step 1. Create an initial population (usually a randomly generated string) .
step 2. Evaluate all of the individuals (apply some function or formula to the individuals).
step 3. Select a new population from the old population based on the fitness of the individuals as given by the evaluation function.
step 4. Apply some genetic operators (mutation & crossover) to members of the population to create new solutions.
step 5. Evaluate these newly created individuals.
step 6. Repeat steps 3-5 (one generation) until the termination criterion is satisfied (usually a fixed number of generations is performed).
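For concreteness, the following Python sketch spells out steps 1-6 of the basic genetic algorithm. It is an assumption-laden illustration: it uses real-valued individuals, binary tournament selection, arithmetic crossover, and uniform mutation, whereas the paper's own implementation uses binary coding together with the hybrid data structure described below.

```python
import numpy as np

rng = np.random.default_rng(0)

def genetic_algorithm(f, lo, hi, pop_size=30, pc=0.8, pm=0.05, generations=200):
    """Minimal real-coded GA sketch for minimising f over the box [lo, hi]."""
    dim = len(lo)
    pop = rng.uniform(lo, hi, size=(pop_size, dim))           # step 1: initial population
    for _ in range(generations):                              # step 6: generation loop
        fit = np.array([f(ind) for ind in pop])               # steps 2/5: evaluation
        # step 3: binary tournament selection (smaller f wins)
        idx = rng.integers(pop_size, size=(pop_size, 2))
        parents = pop[np.where(fit[idx[:, 0]] < fit[idx[:, 1]], idx[:, 0], idx[:, 1])]
        # step 4: arithmetic crossover followed by uniform mutation
        children = parents.copy()
        for i in range(0, pop_size - 1, 2):
            if rng.random() < pc:
                w = rng.random()
                children[i]     = w * parents[i] + (1 - w) * parents[i + 1]
                children[i + 1] = w * parents[i + 1] + (1 - w) * parents[i]
        mask = rng.random(children.shape) < pm
        children[mask] = rng.uniform(np.broadcast_to(lo, children.shape)[mask],
                                     np.broadcast_to(hi, children.shape)[mask])
        pop = children
    fit = np.array([f(ind) for ind in pop])
    return pop[np.argmin(fit)]

best = genetic_algorithm(lambda x: np.sum(np.abs(x)), np.full(4, -1.0), np.full(4, 1.0))
print(best)
```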
3. Hybrid algorithm
For optimization problems the genetic algorithm has inherent drawbacks. It usually uses binary coding, which creates a conflict between computing precision and the length of the character string, and its operators act entirely at random according to probabilities, so its local search ability is poor. The Hooke-Jeeves algorithm, on the other hand, possesses a strong local search ability and can greatly improve the speed of convergence. The hybrid algorithm combines the advantages of the two: it makes full use of the fact that the Hooke-Jeeves algorithm does not require derivatives of the objective function, and it strengthens the genetic algorithm's ability to carry out a fine local search. In this paper we embed an improved Hooke-Jeeves operator into step 3 of the genetic algorithm in order to improve the convergence speed and the probability of finding the solution. The hybrid algorithm is as follows:
step 1. Create an initial population (usually a randomly generated string) .
step 2. Evaluate all of the individuals (apply some function or formula to the individuals).
step 3. Add the parent generation and the offspring together to form the new generation; with probability Pch, carry out the Hooke-Jeeves search on the individuals; then select a new population from the old population based on the fitness of the individuals as given by the evaluation function.
step 4. Apply some genetic operators (mutation & crossover) to members of the population to create new solutions.
step 5. Evaluate these newly created individuals.
step 6. Repeat steps 3-5 (one generation) until the termination criterion is satisfied (usually a fixed number of generations is performed).
The Hooke-Jeeves research operator is as follows:
Do {
    For individual i of the population, generate a random value uniformly between 0 and 1;
    If the random value is less than the prescribed Hooke-Jeeves search probability Pch,
    Then carry out the following operation:
        Do {
            Carry out the exploratory move and the pattern move
        } while (neither the required precision nor the iteration limit has been reached)
        The generated offspring replaces its parent and joins the offspring generation.
} while (the entire population has not yet been processed)
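A possible coding of this operator is sketched below. It assumes a local_search callable with the same interface as the hooke_jeeves() sketch of Section 2.1 (taking the objective and a starting point and returning an improved point); the names hj_operator and local_search are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def hj_operator(f, population, pch, local_search, **hj_kwargs):
    """Apply the Hooke-Jeeves research operator to every individual of a population.

    With probability pch an individual is refined by the direct local search
    (exploratory + pattern moves); the refined offspring replaces its parent.
    """
    new_pop = np.array(population, dtype=float)
    for i, ind in enumerate(new_pop):
        if rng.random() < pch:                   # carry out HJ search with probability Pch
            new_pop[i] = local_search(f, ind, **hj_kwargs)
    return new_pop

# Illustrative use inside step 3 of the hybrid algorithm, e.g. with the
# hooke_jeeves() sketch from Section 2.1:
#   pop = hj_operator(f, pop, pch=0.025, local_search=hooke_jeeves,
#                     delta=0.001, eps=2e-4, max_iter=500)
```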
In the algorithm the adaptation (fitness) function is defined as g(x) = fmax - f(x) + k*(fmax - fmin), where fmax and fmin are respectively the maximal and minimal function values of the current population and k is a parameter that controls the ratio of the maximal to the minimal adaptation value of the current population. Because fmax and fmin vary with the population from generation to generation, this adaptation function makes the hybrid algorithm adaptive and robust with respect to the determination of fitness and of selection probability.
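As a small illustration, the adaptation function above can be computed per generation as follows; the value of k shown is only a placeholder, since the paper does not state the value it used.

```python
import numpy as np

def adaptation(f_values, k=0.1):
    """g(x) = fmax - f(x) + k*(fmax - fmin), recomputed for the current population.

    f_values: objective values of the current population.
    k: scaling parameter (0.1 is an assumed placeholder value).
    """
    f_values = np.asarray(f_values, dtype=float)
    fmax, fmin = f_values.max(), f_values.min()
    return fmax - f_values + k * (fmax - fmin)

print(adaptation([3.0, 1.5, 0.2]))   # larger fitness for smaller objective values
```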
The paper adopts a hybrid data structure: when an individual is chosen to undergo mutation and crossover it is encoded, and the offspring produced by these operations are decoded afterwards. The advantage of the hybrid data structure is that it avoids the loss of precision caused by a limited word length and makes full use of the efficient local search ability of the Hooke-Jeeves algorithm. In the above algorithm the Hooke-Jeeves research operator added to step 3 performs a direct numerical Hooke-Jeeves search on the non-differentiable function; for every reproduced offspring it is decided, according to the probability Pch, whether the Hooke-Jeeves search is carried out or not. Because the Hooke-Jeeves algorithm is a good local optimization method whose iterates decrease the function value monotonically, the new offspring that have been refined by it inherit and develop the good qualities of the parent generation, so the new offspring generation can directly replace the parent generation as the next population.
4. Numerical example and calculation
In this paper the following optimization problem with a non-differentiable objective is considered, where each x_i belongs to [-1, 1] and int(·) denotes rounding to the nearest integer. The rounding operation is introduced into the equation precisely to make the objective non-differentiable. When the rounding operator is removed and k is set to zero, the equation reduces to the well-known Powell function of global optimization; its global minimum point x* is (0, 0, 0, 0), where the function value f(x*) is zero and the matrix of second-order partial derivatives is singular. When k is not zero, the equation adds dense local extrema to the Powell function, which makes the problem more difficult to solve.
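For reference, the k = 0 base case mentioned above is Powell's singular function, sketched below; its minimum is f = 0 at x = (0, 0, 0, 0), where the Hessian is singular. The exact form of the rounding term that the paper adds for k ≠ 0 is given by the original equation and is not reproduced here.

```python
import numpy as np

def powell_singular(x):
    """Powell's singular test function (the k = 0 case of the paper's example)."""
    x1, x2, x3, x4 = x
    return ((x1 + 10.0 * x2) ** 2 + 5.0 * (x3 - x4) ** 2
            + (x2 - 2.0 * x3) ** 4 + 10.0 * (x1 - x4) ** 4)

print(powell_singular(np.zeros(4)))   # 0.0 at the global minimum
```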
A number of numerical experiments were carried out on the above problem. First 3000 four-dimensional points were generated; every 30 of them were taken as an initial population, giving 100 runs in total. The problem was then solved separately with the pure Hooke-Jeeves algorithm, the genetic algorithm, and the hybrid algorithm, as follows.
4.1 Result of the Hooke-Jeeves algorithm
With the initial step δ = 0.005, reduction rate β = 0.5, permissible error 0.0001 and a maximum of 500 iterations:
when k is zero the function is approximately monotone, and the Hooke-Jeeves algorithm finds the extremum point in 100% of the runs, with x* = (0, 0, 0, 0) and f(x*) = 0;
when k is eight the equation adds many dense local extrema to the Powell function, and the global solution is found with only 7% probability.
These numerical results show that the Hooke-Jeeves algorithm, as a direct local search method, possesses a strong local search ability but a poor global search ability.
4.2 Result of the genetic algorithm
With the crossover probability set to 0.8, the mutation probability set to 0.05 and a maximum of 10000 generations:
when k is zero, the global solution is found with 100% probability;
when k is eight, the global solution is found with only 90% probability.
This example shows that the genetic algorithm is a global optimization method with a fairly high probability of finding the global solution; under the conditions of this example, however, the solution is not found in every run. In principle the above parameters can be adjusted so that the global solution is always found, but this usually requires an extremely long iteration process and a great deal of computing time, so the solving efficiency is very low. Since the local search ability of the genetic algorithm is weak, we can exploit the fine local search ability of the Hooke-Jeeves algorithm and improve step 3 of the genetic algorithm by adding the Hooke-Jeeves operator to it, so that both the solving probability and the solving efficiency are greatly increased.
4.3 Result of the hybrid algorithm
Because the extremum points cluster densely in the interval [-1, 1], in order to examine the local search ability of the Hooke-Jeeves operator inside the hybrid algorithm we used three settings: initial step δ = 0.005, reduction rate β = 0.5, permissible error 0.0004; initial step δ = 0.0005, reduction rate β = 0.5, permissible error 0.0004; and initial step δ = 0.001, reduction rate β = 0.5, permissible error 0.0002. The maximum number of iterations was 500, the crossover probability Pc was 0.8, and the mutation probability Pm was 0.05.
Table 1  Result of the calculation (k = 8; number of successful runs out of 100 and solving probability)

|                          | Pch = 0.0 | 0.005 | 0.01 | 0.015 | 0.02 | 0.025 | 0.03 | 0.035 |
| δ = 0.005: runs solved   | 50        | 75    | 75   | 92    | 96   | 97    | 99   | 100   |
| probability to solve     | 50%       | 75%   | 75%  | 92%   | 96%  | 97%   | 99%  | 100%  |
| δ = 0.0005: runs solved  | 50        | 78    | 78   | 94    | 96   | 98    | 100  | 100   |
| probability to solve     | 50%       | 78%   | 78%  | 94%   | 96%  | 98%   | 100% | 100%  |
| δ = 0.001: runs solved   | 50        | 72    | 87   | 95    | 98   | 100   | 100  | 100   |
| probability to solve     | 50%       | 72%   | 87%  | 95%   | 98%  | 100%  | 100% | 100%  |
Comments:
1) In the calculations above a run is counted as having found the global solution if f(x) ≤ 10^-5.
2) From Table 1, when Pch is zero the hybrid algorithm reduces to the ordinary genetic algorithm; when Pch is not zero the solving probability of the hybrid algorithm is clearly higher than that of the ordinary genetic algorithm. By properly adjusting the initial step δ and the permissible error the solving probability can be improved further; when Pch is no less than 0.035 and the number of generations is only 500, the global solution is found in every run.
3) If the initial step δ is 0.001 and the permissible error is 0.0002, then already for Pch = 0.025 the global solution is found in every run through the combined action of the Hooke-Jeeves operator and the genetic algorithm.
5. Conclusion
For the global optimization of nonlinear, non-differentiable functions this paper puts forward a hybrid genetic algorithm based on the Hooke-Jeeves algorithm. The algorithm synthesizes the global optimization ability of the genetic algorithm and the fine local search ability of the Hooke-Jeeves algorithm, and it does not need any derivative information about the objective function. By improving step 3 of the genetic algorithm and adding the Hooke-Jeeves operator to it, both the solving probability and the solving efficiency are greatly increased. The simulation results show that the hybrid algorithm is clearly better than both the genetic algorithm and the Hooke-Jeeves algorithm.