Draft:Gradient-based repair differential evolution


Figure: G-DEmi applied to mixed-integer nonlinear programming problems, showing its capacity to efficiently explore the discontinuous feasible regions (red lines).

G-DEmi, an acronym for Gradient-based Differential Evolution for Mixed-Integer nonlinear programming, is a variant of differential evolution designed to solve mixed-integer nonlinear programming (MINLP) problems.[1] The presence of both continuous and discrete variables, along with constraints, leads to discontinuous feasible parts of varying sizes in the search space. Traditional evolutionary algorithms face difficulties with these problems because their constraint-handling mechanisms are insensitive to this structure, which leads to the generation of many infeasible solutions. G-DEmi addresses this limitation by integrating a gradient-based repair method within the differential evolution framework. The aim of the repair method is to fix promising infeasible solutions in different subproblems using the gradient information of the constraint set.

G-DEmi

G-DEmi iteratively improves a population of candidate solutions through a cycle of generation, evaluation, and selection of trial vectors. In each iteration, new trial vectors are generated by combining existing solutions; they are then evaluated on their objective function value and constraint violation and, when necessary, repaired to satisfy the constraints.

Initial Population

The initial population is generated by taking random values: random real values for the real variables and random integer values for the integer variables of each solution vector. Subsequently, the objective function value and the degree of constraint violation of every individual are evaluated.
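
The following Python sketch (using NumPy) shows one way such a mixed-variable population can be initialized. The bounds, population size, and the split between real and integer variables are illustrative assumptions and are not taken from the source.

    import numpy as np

    rng = np.random.default_rng(seed=0)

    def init_population(pop_size, real_bounds, int_bounds):
        """Sample a mixed population: real variables uniformly within their bounds,
        integer variables uniformly over their (inclusive) integer ranges."""
        real_part = rng.uniform(real_bounds[:, 0], real_bounds[:, 1],
                                size=(pop_size, len(real_bounds)))
        int_part = rng.integers(int_bounds[:, 0], int_bounds[:, 1] + 1,
                                size=(pop_size, len(int_bounds)))
        return np.hstack([real_part, int_part.astype(float)])

    # Illustrative bounds: two real variables in [0, 10] and one integer variable in {0, ..., 5}.
    P0 = init_population(20, np.array([[0.0, 10.0], [0.0, 10.0]]), np.array([[0, 5]]))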

Mutation and Crossover

For each target vector, a trial vector is generated using mutation and binomial crossover. The integer variables of the trial vector are rounded before it is evaluated in the objective function and the constraints.
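
A minimal Python sketch of this step is shown below. The DE/rand/1 mutation strategy and the control-parameter values (F, CR) are common defaults assumed here for illustration; the draft does not specify them.

    import numpy as np

    rng = np.random.default_rng(seed=1)

    def trial_vector(P, i, F=0.7, CR=0.9, n_int=1):
        """Build a trial vector for target P[i] with DE/rand/1 mutation and binomial
        crossover; the last n_int components are integer variables and get rounded."""
        NP, D = P.shape
        r1, r2, r3 = rng.choice([j for j in range(NP) if j != i], size=3, replace=False)
        mutant = P[r1] + F * (P[r2] - P[r3])
        cross = rng.random(D) < CR
        cross[rng.integers(D)] = True         # ensure at least one component comes from the mutant
        u = np.where(cross, mutant, P[i])
        u[-n_int:] = np.round(u[-n_int:])     # round the integer variables before evaluation
        return u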

Evaluation and Selection

The trial vector is compared with its corresponding target vector, and the better one is selected according to the following feasibility rules:[2]

  1. Between two infeasible solutions, the one with lower constraint violation is preferred.
  2. If one solution is infeasible and the other one is feasible, the feasible solution is preferred.
  3. Between two feasible solutions, the one with better objective function value is preferred.

However, if the trial vector fails to improve on its target but still has a lower objective function value, and no other vector of the same subproblem has been repaired in the current generation, the trial vector is repaired.
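
These feasibility rules can be encoded in a short comparison routine, as in the following Python sketch, where phi denotes the degree of constraint violation (zero meaning feasible).

    def better(f_a, phi_a, f_b, phi_b):
        """Feasibility rules: return True if solution a is preferred over solution b.
        phi is the total degree of constraint violation (0 means feasible)."""
        feasible_a, feasible_b = phi_a == 0, phi_b == 0
        if feasible_a and feasible_b:
            return f_a < f_b          # rule 3: both feasible, better objective wins
        if feasible_a != feasible_b:
            return feasible_a         # rule 2: the feasible solution wins
        return phi_a < phi_b          # rule 1: both infeasible, lower violation wins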

Repair and Improvement

The better solution between the repaired vector and its target vector is passed to the population of the next generation. Through these steps, G-DEmi produces a new population in each generation.

The following pseudocode illustrates the algorithm:

   algorithm G-DEmi framework
   input: the population size, the scale factor, the crossover rate, the maximum
          number of generations, the equality tolerance and the repair settings
   output: the best solution found so far
   initialize the population at random
   evaluate the objective function and the constraint violation of each individual
   set the generation counter to zero
   while the maximum number of generations has not been reached do
       mark every subproblem as not repaired in this generation
       for each target vector in the population do
           generate a trial vector by mutation and binomial crossover
           round the integer variables of the trial vector
           evaluate its objective function and constraint violation
           if the trial vector is better than its target (feasibility rules) then
               store the trial vector in the next population
           elseif the trial vector has a lower objective function value and its
                  subproblem has not been repaired in this generation then
               repair the trial vector with the gradient-based repair method
               evaluate its objective function and constraint violation
               if the repaired vector is better than its target then
                   store the repaired vector in the next population
               else
                   store the target vector in the next population
               end if
               mark the subproblem as repaired
           else
               store the target vector in the next population
           end if
           update the best solution found so far
       end for
       increment the generation counter
   end while

Gradient-based Repair Method

Figure: Solution repair in G-DEmi.

The gradient-based repair method is a central component of G-DEmi, designed to address infeasibility in the trial vectors generated by the differential evolution operators. The method focuses on independently exploring the subproblems defined by the integer variables: to repair a vector with mixed variables, only the real variables are modified, while the integer variables remain fixed.

The method repairs only those trial vectors that satisfy two conditions: (i) they lost the tournament against their target vectors but have a better objective function value, and (ii) they belong to a subproblem in which no solution has been repaired in the current generation. These two conditions promote the repair of trial vectors with higher potential and ensure that each subproblem is explored independently, avoiding repeated repairs of similar solutions.
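
One way to implement these two conditions is to key each subproblem by the integer part of the vector and to record which subproblems have already triggered a repair in the current generation. The following Python sketch is an illustrative assumption of such bookkeeping and reuses the better() comparison from the earlier sketch.

    def should_repair(f_trial, phi_trial, f_target, phi_target,
                      trial_int_part, repaired_this_generation):
        """Check the two repair conditions for a trial vector, reusing better() from
        the sketch above; repaired_this_generation is the set of integer-variable
        tuples (one per subproblem) already repaired in the current generation."""
        lost_tournament = not better(f_trial, phi_trial, f_target, phi_target)
        lower_objective = f_trial < f_target
        subproblem = tuple(int(v) for v in trial_int_part)
        unexplored = subproblem not in repaired_this_generation
        return lost_tournament and lower_objective and unexplored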

Definition of Constraint Violation

The constraint violation C(x) is defined as a vector that contains the violation degree of each inequality constraint g_i(x) ≤ 0 and each equality constraint h_j(x) = 0 of a given problem, for a particular solution vector x. Parameters p and q denote the number of inequality and equality constraints, respectively, and ε specifies the tolerance for the equality constraints. The sign function sgn(·) preserves the sign of the equality violation.

C(x) = \begin{bmatrix} \max(0,\, g_1(x)) \\ \vdots \\ \max(0,\, g_p(x)) \\ \operatorname{sgn}(h_1(x))\,\max(0,\, |h_1(x)| - \epsilon) \\ \vdots \\ \operatorname{sgn}(h_q(x))\,\max(0,\, |h_q(x)| - \epsilon) \end{bmatrix}
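
A direct Python translation of this definition is sketched below; the tolerance value is an illustrative default.

    import numpy as np

    def violation_vector(g_values, h_values, eps=1e-4):
        """Violation degrees: max(0, g_i) for inequalities g_i <= 0 and
        sgn(h_j) * max(0, |h_j| - eps) for equalities h_j = 0 with tolerance eps."""
        g = np.asarray(g_values, dtype=float)
        h = np.asarray(h_values, dtype=float)
        return np.concatenate([np.maximum(0.0, g),
                               np.sign(h) * np.maximum(0.0, np.abs(h) - eps)])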

The Gradient Matrix of Constraints

The gradient matrix of these constraints with respect to the n components of the solution vector x, denoted ∇C(x), is defined as:

\nabla C(x) = \begin{bmatrix} \dfrac{\partial C_1(x)}{\partial x_1} & \cdots & \dfrac{\partial C_1(x)}{\partial x_n} \\ \vdots & \ddots & \vdots \\ \dfrac{\partial C_{p+q}(x)}{\partial x_1} & \cdots & \dfrac{\partial C_{p+q}(x)}{\partial x_n} \end{bmatrix}

Finite Difference Approximation

The forward finite difference approximation provides an estimate of these derivatives, defined as:

\dfrac{\partial C_i(x)}{\partial x_k} \approx \dfrac{C_i(x + \delta e_k) - C_i(x)}{\delta}

where δ represents the step size and e_k is a unit vector of the same dimension as x, with a value of 1 in the k-th component and 0 in the rest.
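
The following Python sketch approximates this gradient matrix by forward finite differences with respect to the real variables only, as needed for the mixed-variable repair; the step size is an illustrative default, and constraints is assumed to be a user-supplied function returning the violation vector.

    import numpy as np

    def violation_jacobian(constraints, x_real, x_int, delta=1e-6):
        """Forward finite-difference estimate of the gradient matrix of the violation
        vector with respect to the real variables only (the integers stay fixed).
        constraints(x_real, x_int) must return the violation vector as an array."""
        c0 = np.asarray(constraints(x_real, x_int), dtype=float)
        jac = np.zeros((c0.size, x_real.size))
        for k in range(x_real.size):
            step = np.zeros_like(x_real)
            step[k] = delta                                  # perturb the k-th real variable
            jac[:, k] = (constraints(x_real + step, x_int) - c0) / delta
        return jac, c0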

Repair Procedure

This repair method aims to transform an infeasible solution into a feasible one, which involves driving the elements of the violation vector C(x) to zero. Iteratively, a repaired vector can be obtained using the Newton-Raphson method through the following equation, which represents a linear approximation of C(x) in the direction of the origin:

x^{k+1} = x^{k} - \left[\nabla C(x^{k})\right]^{-1} C(x^{k})

However, the number of variables commonly differs from the number of constraints. In that case, the matrix ∇C(x) is not invertible and the Moore-Penrose pseudoinverse must be used:

x^{k+1} = x^{k} - \left[\nabla C(x^{k})\right]^{+} C(x^{k})

where [∇C(x)]^+ represents the pseudoinverse of the gradient matrix ∇C(x). A computationally efficient way of obtaining it is through singular value decomposition.
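
In Python, a single iteration of this update can be written with NumPy's pseudoinverse routine, which internally relies on singular value decomposition; this is a sketch of the update formula rather than the authors' implementation.

    import numpy as np

    def newton_repair_step(x_real, jac, c):
        """One Newton-Raphson step toward zero violation: the Moore-Penrose
        pseudoinverse (computed by NumPy through SVD) replaces the inverse."""
        return x_real - np.linalg.pinv(jac) @ c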

Mixed Variables Repair

A mixed trial vector u = (u_c, u_d) is considered, where u_c contains the real variables and u_d the integer variables, and only the u_c component is updated during the iterative repair process. As a result, the constraint violation vector and the gradient matrix are evaluated with the integer part held fixed:

C(u) = C(u_c \mid u_d)

\nabla C(u) = \nabla_{u_c} C(u_c \mid u_d)

The repair method follows these steps:

  Algorithm: Gradient-based repair method
  Input: an infeasible trial vector, the step size, the equality tolerance and the
         maximum number of repair iterations
  Output: the repaired trial vector
  Initialize the iteration counter k to zero.
  While none of the stopping criteria is fulfilled:
    Calculate the constraint violation vector of the current vector.
    Calculate the gradient matrix of the violations with respect to the real variables.
    Remove the zero elements of the violation vector and their corresponding rows in the gradient matrix.
    Calculate the pseudoinverse of the reduced gradient matrix.
    Calculate the Newton-Raphson update of the real variables.
    Update the real variables of the trial vector.
    Update the constraint violation vector.
    Increment k.
  End While
  Stopping criteria:
   SC1: The maximum number of iterations is reached.
   SC2: All elements of the constraint violation vector are equal to zero.
   SC3: The maximum absolute difference between consecutive iterates is equal to or lower than a given tolerance.
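
A compact Python sketch of this loop is given below. It reuses the violation_jacobian() helper from the finite-difference sketch above, and the default iteration limit and tolerance are illustrative assumptions rather than values from the source.

    import numpy as np

    def gradient_repair(x_real, x_int, constraints, delta=1e-6, k_max=3, tol=1e-8):
        """Repair the real part of a mixed trial vector while the integer part stays
        fixed, following the steps of the pseudocode above; constraints(x_real, x_int)
        must return the violation vector."""
        x = np.asarray(x_real, dtype=float).copy()
        for _ in range(k_max):                            # SC1: iteration limit
            jac, c = violation_jacobian(constraints, x, x_int, delta)
            if np.all(c == 0):                            # SC2: all violations are zero
                break
            active = c != 0                               # drop satisfied constraints and their rows
            x_new = x - np.linalg.pinv(jac[active]) @ c[active]
            if np.max(np.abs(x_new - x)) <= tol:          # SC3: negligible change between iterates
                x = x_new
                break
            x = x_new
        return x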

Example of Mixed Variables Repair

This repair procedure can be illustrated with a problem that has one inequality constraint and one equality constraint. Starting from an infeasible trial vector and a given equality tolerance, the first iteration computes the constraint violation vector and the gradient matrix with respect to the real variables. Any constraint that is already satisfied (zero violation) is removed from the violation vector, together with its corresponding row of the gradient matrix. The pseudoinverse of the reduced gradient matrix is then computed, and the real variables are updated by the Newton-Raphson step while the integer variables remain fixed. Once the updated values satisfy all the constraints, the trial vector has been successfully repaired; otherwise, the process is repeated until one of the stopping criteria is met.
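
Because the specific constraints and numerical values of the original example are not reproduced in this draft, the following Python snippet illustrates the same procedure on a hypothetical problem with one inequality constraint and one equality constraint, reusing the gradient_repair() and violation_jacobian() sketches above.

    import numpy as np

    # Hypothetical constraints (not the example from the source):
    #   g(x1, x2, y) = x1 + x2 - 6 <= 0          (inequality)
    #   h(x1, x2, y) = x1 * x2 - y  = 0          (equality with tolerance eps)
    def example_constraints(x_real, x_int, eps=1e-4):
        x1, x2 = x_real
        y = x_int[0]
        g = max(0.0, x1 + x2 - 6.0)
        h = x1 * x2 - y
        return np.array([g, np.sign(h) * max(0.0, abs(h) - eps)])

    x_real = np.array([1.0, 3.0])    # continuous part of an infeasible trial vector
    x_int = np.array([4])            # integer part, kept fixed during the repair
    x_repaired = gradient_repair(x_real, x_int, example_constraints)
    print(x_repaired, example_constraints(x_repaired, x_int))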

References

  1. ^ Molina-Pérez, Daniel; Portilla-Flores, Edgar Alfredo; Mezura-Montes, Efrén; Vega-Alvarado, Eduardo; Calva-Yañez, María Bárbara (2024-05-31). "Efficiently handling constraints in mixed-integer nonlinear programming problems using gradient-based repair differential evolution". PeerJ Computer Science. 10: e2095. doi:10.7717/peerj-cs.2095. ISSN 2376-5992. PMC 11157599. PMID 38855217.
  2. ^ Deb, Kalyanmoy (2000-06-09). "An efficient constraint handling method for genetic algorithms". Computer Methods in Applied Mechanics and Engineering. 186 (2): 311–338. Bibcode:2000CMAME.186..311D. doi:10.1016/S0045-7825(99)00389-8. ISSN 0045-7825. Retrieved 2024-06-12.

Category:Evolutionary algorithms Category:Evolutionary computation Category:Metaheuristics Category:Optimization algorithms and methods Category:Bioinspiration