
Evolutionary many-objective optimization: when evolutionary algorithms face optimization problems with more than three objectives

Xingyi Zhang (张兴义)
School of Computer Science and Technology, Anhui University, Hefei, 230601, China
xyzhanghust@gmail.com

Tuesday, April 16, 2019

Group of Bio-Inspired Intelligence and Mining Knowledge (BIMK)

• Teachers: Xingyi Zhang (Prof), Fan Cheng (A/Prof), Yansen Su (A/Prof), Ye Tian (A/Prof), Lei Zhang (A/Prof), Jianfeng Qiu (Lect), Chao Wang (Lect)

• Students: 2 Ph.D. students and more than 30 master students

Research Interests

• Multi-objective evolutionary algorithms and their applications

• Complex networks clustering

• Unconventional computing models

• Evolutionary machine learning

OUTLINE

I. Many-objective evolutionary optimization
II. KnEA: many-objective optimization
III. LMEA: large-scale optimization
IV. AR-MOEA: better versatility

Section I: Many-objective evolutionary optimization

Optimization problems in the real world

• Robotics alone offers a large number of optimization problems in which multiple objectives must be optimized simultaneously.

Path-planning optimization in autonomous robots [1]

Design optimization of robot grippers [2]

[1] Castillo O, et al. Multiple objective genetic algorithms for path-planning optimization in autonomous mobile robots. Soft Computing, 2007, 11(3): 269-279.
[2] Saravanan R, et al. Evolutionary multi criteria design optimization of robot grippers. Applied Soft Computing, 2009, 9(1): 159-172.

Optimization problems in the real world

Self-organizing multi-robot pattern formation [3]

Robot soccer system [4]

[3] Oh H, et al. Bio-inspired self-organising multi-robot pattern formation: A review. Robotics and Autonomous Systems, 2017, 91: 83-100.
[4] Kim J H, et al. Evolutionary multi-objective optimization in robot soccer system for education. IEEE Computational Intelligence Magazine, 2009, 4(1): 31-41.

Multi-objective optimization problems (MOPs)

• Which one is the best?

• All the choices are the best

• A set of trade-off solutions is obtained, instead of a single solution
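For reference, the standard formulation behind these slides can be stated as follows (a textbook-style definition of an MOP and of Pareto dominance, not taken from the slides themselves):

    \begin{align*}
    \min_{x \in \Omega}\; F(x) &= \big(f_1(x), f_2(x), \ldots, f_m(x)\big), \qquad \Omega \subseteq \mathbb{R}^n,\\
    x \prec y \;\Longleftrightarrow\;& f_i(x) \le f_i(y)\ \text{for all } i \in \{1,\ldots,m\}
    \ \text{and}\ f_j(x) < f_j(y)\ \text{for at least one } j.
    \end{align*}

The solutions of Ω that are not dominated by any other solution form the Pareto set, and their objective vectors form the Pareto front; the set of trade-off solutions mentioned above is an approximation of this front.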

Evolutionary algorithms

• Evolutionary algorithms (EAs) have been widely used for solving MOPs, giving rise to multi-objective evolutionary algorithms (MOEAs)

Why?

• Reason 1: EAs can evolve solutions towards the global optimum

• Reason 2: EAs maintain a set of solutions (called a population), so a single run can return multiple trade-offs

• Reason 3: EAs can handle objective functions that are non-differentiable or discrete
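These three reasons all rest on the same generate-and-select loop. A minimal sketch of that loop is given here; the `evaluate` function, the mutation, and the selection rule are illustrative placeholders, not the operators of KnEA, LMEA, or AR-MOEA, which each plug in their own mating and environmental selection at the marked points.

    import numpy as np

    def nondominated_score(obj):
        """Number of solutions that dominate each solution (0 means non-dominated)."""
        dom = ((obj[:, None, :] <= obj[None, :, :]).all(-1)
               & (obj[:, None, :] < obj[None, :, :]).any(-1))
        return dom.sum(axis=0)

    def moea(evaluate, n_var, pop_size=100, n_gen=200, lower=0.0, upper=1.0, seed=0):
        """Generic (mu + lambda) multi-objective EA loop with placeholder operators.

        evaluate maps a (k, n_var) decision array to a (k, n_obj) objective array
        (minimization assumed).
        """
        rng = np.random.default_rng(seed)
        pop = rng.uniform(lower, upper, (pop_size, n_var))
        obj = evaluate(pop)
        for _ in range(n_gen):
            # Variation: mutate randomly chosen parents (placeholder operator).
            parents = pop[rng.integers(pop_size, size=pop_size)]
            off = np.clip(parents + 0.1 * rng.normal(size=parents.shape), lower, upper)
            off_obj = evaluate(off)
            # Environmental selection: keep the least-dominated half (placeholder rule).
            merged, merged_obj = np.vstack([pop, off]), np.vstack([obj, off_obj])
            keep = np.argsort(nondominated_score(merged_obj), kind="stable")[:pop_size]
            pop, obj = merged[keep], merged_obj[keep]
        return pop, obj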

Challenges in solving many-objective optimization problems (MaOPs), i.e., MOPs with more than three objectives

• Enhance the performance of MOEAs in addressing different types of MaOPs (algorithm design)

• Evaluate the quality of the solution sets obtained by different MOEAs (performance metrics)

• Visualize the solution sets obtained by different MOEAs (visualization)

• ......

Section II: Knee point driven evolutionary algorithm (KnEA) for many-objective optimization problems

Zhang X, Tian Y, Jin Y. A knee point-driven evolutionary algorithm for many-objective optimization. IEEE Transactions on Evolutionary Computation, 2015, 19(6): 761-776.

• Existing Pareto-based MOEAs designed for solving MOPs do not work well on MaOPs

[Figure: number of non-dominated solutions versus the number of objectives (2 to 10); y-axis: Number of Solutions (0 to 200), x-axis: Number of Objectives]

As the number of objectives increases, more and more solutions become non-dominated.
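This effect is easy to check numerically: for uniformly random objective vectors, the share of mutually non-dominated points grows quickly with the number of objectives. A small sanity-check sketch (not the experiment behind the figure above):

    import numpy as np

    def nondominated_fraction(n_points=200, n_obj=3, seed=0):
        """Fraction of random objective vectors that no other vector dominates."""
        rng = np.random.default_rng(seed)
        obj = rng.random((n_points, n_obj))
        # dominates[i, j] is True if vector i Pareto-dominates vector j.
        dominates = ((obj[:, None, :] <= obj[None, :, :]).all(-1)
                     & (obj[:, None, :] < obj[None, :, :]).any(-1))
        return float((~dominates.any(axis=0)).mean())

    for m in range(2, 11):
        print(m, round(nondominated_fraction(n_obj=m), 2))
    # The fraction approaches 1.0 as m grows, which is why Pareto dominance
    # alone provides almost no selection pressure on many-objective problems.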

Knee point driven evolutionary algorithm (KnEA)

To solve this problem

• We proposed a new MOEA called KnEA, which obtains better results than other popular algorithms on most test problems

The motivation of KnEA

• We choose the knee points among all the non-dominated individuals, since we believe they have better convergence than other non-dominated solutions.

Knee point

• The knee points are a subset of Pareto optimal individuals for which an improvement in one objective results in a severe degradation in at least one other objective

Analysis

Identifying knee points

• We first find the extreme point of each objective and compute the hyperplane spanned by these extreme points.

• Then we compute the distance between each individual and the hyperplane; the individual with the largest distance is regarded as a knee point, and its neighbors are ignored.
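A rough sketch of this two-step procedure, assuming minimization and a fixed neighbourhood radius (KnEA itself adapts the neighbourhood size over the generations, which is omitted here):

    import numpy as np

    def identify_knee_points(obj, radius=0.1):
        """Mark knee points among a set of non-dominated objective vectors.

        obj: (n, m) array of objective values (minimization assumed).
        Step 1: take the extreme point of each objective and form the hyperplane
                w . x = 1 passing through all of them.
        Step 2: pick points by decreasing distance below the hyperplane, marking
                each pick as a knee point and ignoring its neighbours.
        The fixed radius stands in for KnEA's adaptive neighbourhood.
        """
        n, m = obj.shape
        extremes = obj[np.argmax(obj, axis=0)]        # one extreme point per objective
        w = np.linalg.solve(extremes, np.ones(m))     # hyperplane through the extremes
        dist = (1.0 - obj @ w) / np.linalg.norm(w)    # distance below the hyperplane
        is_knee = np.zeros(n, dtype=bool)
        ignored = np.zeros(n, dtype=bool)
        for i in np.argsort(-dist):                   # most "knee-like" points first
            if not ignored[i]:
                is_knee[i] = True
                ignored |= np.linalg.norm(obj - obj[i], axis=1) < radius
        return is_knee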

The framework of KnEA

Experimental results I

Experimental results II

Section III: Decision variable clustering based evolutionary algorithm (LMEA) for many-objective large-scale optimization problems

Zhang X, Tian Y, Cheng R, et al. A decision variable clustering-based evolutionary algorithm for large-scale many-objective optimization. IEEE Transactions on Evolutionary Computation, 2018, 22(1): 97-112.

Motivation of this work

• It is often ineffective to directly search the whole space when the number of decision variables is large.

• Inspired by the idea of divide-and-conquer, it is desirable to optimize several groups of decision variables separately.

• Few works have been reported on MOPs with a large number of decision variables.

1. X. Ma, F. Liu, et al. A multiobjective evolutionary algorithm based on decision variable analyses for multi-objective optimization problems with large scale variables. IEEE Transactions on Evolutionary Computation, 2016, 20(2): 275-298.

Basic idea of LMEA

• Divide the decision variables into several groups, and use different strategies to optimize each group.

[Diagram: the decision variables are first divided by variable clustering into convergence-related and diversity-related variables; interaction analysis then splits the convergence-related variables into subgroups, each of which undergoes convergence optimization, while the diversity-related variables undergo diversity optimization together.]

Decision variable clustering

1. Perturb each variable of some sampled solutions several times.
2. Characterize each variable by several angles (features).
3. Use k-means to divide all variables into two clusters.
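A compact sketch of these three steps, assuming minimization and decision variables scaled to [0, 1]; the angle feature below (principal direction of the perturbed objective vectors against the all-ones direction) is a simplified reading of LMEA's variable characterization, and `evaluate` is a hypothetical objective function:

    import numpy as np
    from sklearn.cluster import KMeans

    def cluster_variables(evaluate, n_var, n_obj, n_solutions=5, n_perturb=10, seed=0):
        """Split decision variables into convergence- and diversity-related groups.

        evaluate maps a (k, n_var) array in [0, 1]^n_var to a (k, n_obj) array.
        For every variable, perturb it in a few random solutions, fit the principal
        direction of the resulting (normalized) objective vectors, and record its
        angle to the all-ones direction. Small angles suggest the variable mainly
        moves solutions towards/away from the front (convergence-related); large
        angles suggest it moves solutions along the front (diversity-related).
        """
        rng = np.random.default_rng(seed)
        base = rng.random((n_solutions, n_var))
        ones = np.ones(n_obj) / np.sqrt(n_obj)
        angles = np.empty((n_var, n_solutions))
        for v in range(n_var):
            for s in range(n_solutions):
                probes = np.tile(base[s], (n_perturb, 1))
                probes[:, v] = rng.random(n_perturb)          # perturb variable v only
                objs = evaluate(probes)
                objs = (objs - objs.min(0)) / (objs.max(0) - objs.min(0) + 1e-12)
                direction = np.linalg.svd(objs - objs.mean(0))[2][0]
                cosine = np.clip(abs(direction @ ones), 0.0, 1.0)
                angles[v, s] = np.degrees(np.arccos(cosine))
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(angles)
        # The cluster with the smaller mean angle is taken as convergence-related.
        low = int(np.argmin([angles[labels == c].mean() for c in (0, 1)]))
        return np.where(labels == low)[0], np.where(labels != low)[0]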

Superiority of decision variable clustering

Interaction analysis

Convergence & diversity optimization

• Convergence optimization: evolve the convergence-related variables in each subgroup separately, and evaluate solutions based on their distance to the ideal point.

• Diversity optimization: evolve all the diversity-related variables together, and evaluate solutions based on their angles to each other.
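A minimal sketch of these two evaluation criteria, assuming minimization and objective vectors translated so that the ideal point is at the origin (a simplified reading of the slide, not LMEA's exact implementation):

    import numpy as np

    def convergence_fitness(obj, ideal=None):
        """Distance of each objective vector to the ideal point (smaller is better).
        Used when evolving one subgroup of convergence-related variables."""
        ideal = np.zeros(obj.shape[1]) if ideal is None else ideal
        return np.linalg.norm(obj - ideal, axis=1)

    def diversity_fitness(obj):
        """Angle of each objective vector to its nearest neighbour (larger is better).
        Used when evolving all diversity-related variables together."""
        unit = obj / (np.linalg.norm(obj, axis=1, keepdims=True) + 1e-12)
        cos = np.clip(unit @ unit.T, -1.0, 1.0)
        np.fill_diagonal(cos, -1.0)            # exclude the angle of a vector to itself
        return np.degrees(np.arccos(cos.max(axis=1)))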

Experimental results I

Experimental results II

The performance of LMEA is very stable when the number of decision variables varies from 100 to 5000.

Section IV: An indicator based multi-objective evolutionary algorithm with reference point adaptation (AR-MOEA) for better versatility

Tian Y, Cheng R, Zhang X, et al. An indicator based multi-objective evolutionary algorithm with reference point adaptation for better versatility. IEEE Transactions on Evolutionary Computation, 2018, 22(4): 609-622.

Motivation of this work

• Most existing reference point based MOEAs work well on MOPs with regular Pareto fronts, but they encounter difficulties in solving MOPs with irregular Pareto fronts

[Figure: examples of regular and irregular Pareto fronts]

Motivation of this work

• Reference point based MOEAs are ineffective for irregular Pareto fronts due to the inconsistency between the distribution of the reference points and the shape of the Pareto front

[Figure: reference points on a regular versus an irregular Pareto front]

Basic idea of AR-MOEA

• We adapt the reference points for IGD-NS calculation to the shape of the Pareto front
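The slide does not define IGD-NS, so the sketch below reflects one reading of the indicator: the usual inverted-generational-distance sum over the (adapted) reference points, plus a penalty for "noncontributing" solutions that are not the nearest solution to any reference point. Treat the exact form as an assumption rather than the paper's definition:

    import numpy as np

    def igd_ns(obj, ref_points):
        """Hedged sketch of an IGD-NS-style indicator (smaller is better).

        obj:        (n, m) objective vectors of the candidate set
        ref_points: (r, m) reference points, ideally adapted to the front shape
        """
        dist = np.linalg.norm(ref_points[:, None, :] - obj[None, :, :], axis=2)  # (r, n)
        contributing = np.zeros(len(obj), dtype=bool)
        contributing[dist.argmin(axis=1)] = True   # nearest solution of each reference point
        igd_term = dist[:, contributing].min(axis=1).sum()
        # Penalty for solutions that serve no reference point at all.
        penalty = dist[:, ~contributing].min(axis=0).sum() if (~contributing).any() else 0.0
        return igd_term + penalty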

Reference point adaptation

Experimental results I

Experimental results II

Conclusions and some further work

Three MOEAs have been reported for different types of many-objective optimization problems:

• Knee point driven evolutionary algorithm (KnEA)
• Decision variable clustering based evolutionary algorithm (LMEA)
• Indicator based multi-objective evolutionary algorithm with reference point adaptation (AR-MOEA)

Further work:

• Tian Y, Zhang X, Cheng R, et al. Guiding evolutionary multi-objective optimization with generic front modeling. IEEE Transactions on Cybernetics, 2018, in press.
• Tian Y, Cheng R, Zhang X, et al. A strengthened dominance relation considering convergence and diversity for evolutionary many-objective optimization. IEEE Transactions on Evolutionary Computation, 2018, in press.

PlatEMO

1. http://bimk.ahu.edu.cn/bimk/index.php?s=/Index/Software/index.html
2. https://github.com/BIMK/PlatEMO

Acknowledgements

• Thanks to the contributors of these works:
• Prof. Yaochu Jin (University of Surrey)
• Dr. Ran Cheng (Southern University of Science and Technology)
• Dr. Ye Tian (Anhui University)
• Dr. Fan Cheng (Anhui University)
• Dr. Yansen Su (Anhui University)

• The codes of these algorithms can be found on our webpage, BIMK: http://bimk.ahu.edu.cn/index.php?s=/Index/Software/index.html


Xingyi Zhang
