A Sensitivity Analysis of Contribution-Based Cooperative Co-evolutionary Algorithms
Borhan Kazimipour, Mohammad Nabi Omidvar, Xiaodong Li, A. K. Qin


Page 1: A sensitivity analysis of contribution-based cooperative co-evolutionary algorithms (short)

A Sensitivity Analysis of

Contribution-Based Cooperative Co-evolutionary Algorithms

Borhan Kazimipour, Mohammad Nabi Omidvar, Xiaodong Li, A. K. Qin

Page 2

Background: Large-Scale Black-Box Optimization

• Optimization:

– x* = argmin_{x ∈ X} f(x)

• Black-Box Optimization:

– f is unknown.

– Most mathematical models cannot be applied (as they make assumptions about f)

• Large-Scale Problems:

– The dimensionality n is very large (e.g., 1,000) and the computational budget is limited

– Can be interpreted as a form of the curse of dimensionality, which causes dramatic performance drops
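The black-box setting above can be made concrete with the simplest possible optimizer, which queries f only point-wise. This is an illustrative sketch, not a method from the talk; the bounds, budget, and sphere test function are assumptions:

```python
import random

def random_search(f, dim, budget, low=-5.0, high=5.0, seed=0):
    """Toy black-box optimizer: f is queried only point-wise, with no
    gradients or structural knowledge assumed. Bounds are illustrative."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(budget):
        x = [rng.uniform(low, high) for _ in range(dim)]
        fx = f(x)  # one function evaluation consumed from the budget
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f
```

At dim = 1,000 such undirected sampling wastes the limited budget, which is exactly why the divide-and-conquer methods on the next slides are needed.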

CEC 2015, Sendai, Japan. Sensitivity Analysis of CBCC Algorithms.

Page 3

Background: Cooperative Co-evolution

• Cooperative Co-evolutionary (CC) EAs:

– Follow the well-known divide-and-conquer approach

– f(x) = Σ_i f_i(x_i)

• Procedure:

– Decomposition: Divide f into a set of smaller problems (f_i).

– Credit Assignment: Divide the allocated computational budget equally among all components.

– Optimization: Optimize each component almost separately.

– Merge: Merge all sub-solutions to form a solution for f.
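As a sketch, the four-step procedure above might look like the following loop. The component, optimizer, and merge interfaces are hypothetical placeholders, not the authors' implementation:

```python
def cc_optimize(components, optimize_step, merge, cycles):
    """Round-robin Cooperative Co-evolution skeleton.

    components    : sub-problems produced by the decomposition step
    optimize_step : hypothetical callable; runs one optimization cycle on a
                    component and returns its current best sub-solution
    merge         : combines all sub-solutions into a full solution for f
    """
    sub_solutions = [None] * len(components)
    for _ in range(cycles):
        # Credit assignment: every component receives the same budget
        # (one cycle per round), regardless of its actual contribution.
        for i, comp in enumerate(components):
            sub_solutions[i] = optimize_step(comp)
    # Merge: assemble the sub-solutions into one candidate solution.
    return merge(sub_solutions)
```

The equal per-cycle budget in the inner loop is precisely what the contribution-based variants on the later slides replace.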



Do all components contribute equally to the objective function?


Page 6

Background: Imbalanced Problems

• Imbalanced Problems:

– Different components/sub-components may have different levels of contribution to the objective function.

• Sources of Imbalance:

– Different landscapes (basis functions)

– Different dimensionalities

– For example: n_i ≠ n_j for i ≠ j

– Different weights

– For example: f(x) = Σ_i w_i f_i(x_i), where w_i ≠ w_j for i ≠ j

• Challenge:

– For the best result, the most contributing components should receive the largest share of the computational budget.


Page 7

Background: Contribution-Based CC

• Contribution-Based Cooperative Co-evolutionary (CBCC)[1]:

– Goals:

– Estimate the contribution of each component to the improvement of the objective value.

– Assign a proper portion of the computational budget to each component according to its contribution.

– Assumptions:

– The budget is limited

– The problem is partially separable

– Challenges:

– Contribution of components may be unknown to the practitioners/algorithms

– Contributions may change during the optimization

[1] M. N. Omidvar, X. Li, and X. Yao, "Smart use of computational resources based on contribution for cooperative co-evolutionary algorithms," in Proceedings of GECCO '11.


Page 8

Background: Contribution-Based CC

• CBCC1 [1]

1. Exploration: Optimize all components for one cycle each and measure the improvements in fitness values (Δf_i)

2. Exploitation: Optimize the most contributing component (i* = arg max_{i ∈ {1..m}} Δf_i) for only one extra cycle.

3. Repeat the above steps until the stopping criterion is met.

• CBCC2 [1]

1. Exploration: Optimize all components for one cycle each and measure the improvements in fitness values (Δf_i)

2. Exploitation: Optimize the most contributing component (i* = arg max_{i ∈ {1..m}} Δf_i) over and over until its improvement becomes negligible.

3. Repeat the above steps until the stopping criterion is met.



Page 9

Contributions

• Previous works:

– CBCC1 and CBCC2 were only studied under perfect conditions:

– Ideal decomposition (assuming the structure of the problem is known)

– High levels of imbalance

• This work:

– Studies CBCC1 and CBCC2 under more realistic conditions:

– Noisy decompositions

– Low and medium levels of imbalance


Page 10

Research Questions

1. To what extent is CBCC sensitive to the accuracy of decomposition techniques?

2. In the presence of decomposition errors, is it still beneficial to employ CBCC instead of traditional CC?

3. To what extent does the imbalance level influence the performance of CBCC?

4. Is it still worthwhile to choose CBCC over traditional CC when the level of imbalance is unknown (or known but not very significant)?


Page 11

Questions 1 and 2 address the decomposition accuracy.

Questions 3 and 4 address the imbalance level.

Page 12

Experiments

• Part A: Decomposition Accuracy

– Design: Randomly select a percentage of variables from all components and aggregate them into a new group (unlabelled group).

– Outcome: The resulting component contains variables from all other groups (strong interactions with all other components)

– Error levels: 0% (ideal, noise-free decomposition), 5%, 10%, 20%, 30%, and 50%

• Part B: Imbalance Level

– Design: In f(x) = Σ_{i=1}^{m} w_i · f_i(x_i), set w_i = 10^{α·N(0,1)}

– Outcome: Problems with varying levels of imbalance

– Imbalance levels: α ∈ {0, 1, 2, 3}
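The weighting scheme can be sketched as follows. The form w_i = 10^{α·N(0,1)} is a reconstruction from the garbled slide (the CEC 2013 LSGO suite uses weights of this general shape), so treat the exact base and distribution as assumptions:

```python
import random

def imbalance_weights(m, alpha, seed=None):
    """Generate m component weights with imbalance level alpha.

    alpha = 0 gives w_i = 1 for every component (a balanced problem);
    larger alpha spreads the weights over more orders of magnitude,
    so a few components dominate the objective value.
    """
    rng = random.Random(seed)
    return [10.0 ** (alpha * rng.gauss(0.0, 1.0)) for _ in range(m)]
```

With alpha = 3, weights routinely span several orders of magnitude, which is the high-imbalance regime studied in the earlier CBCC work.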


Page 13

Experiment Setup

• Benchmark: CEC 2013 LSGO benchmark suite

– Dimensions: 1,000

– Budget: 3,000,000 function evaluations

– Categories

1. Fully separable functions (f1–f3)

2. Partially separable functions with a separable subcomponent (f4–f7)

3. Partially separable functions with no separable subcomponent (f8–f11)

4. Overlapping functions (f12–f14)

5. Fully non-separable function (f15)

• Statistical Tests:

– Kruskal-Wallis rank sum test to obtain p-values.

– Wilcoxon rank sum test for pair-wise comparisons.


Page 14

Results, Part A: Win/Draw/Loss (WDL) & Discussion

• As the decomposition noise level increases, the performance of CBCCs drops.

• Overall, CBCC1 performs better than CBCC2.

• Except under very poor decompositions (50% error level), CBCC1 either outperforms or is statistically similar to traditional CC.

• It is beneficial to employ CBCC1 instead of traditional CC even when the decomposition is not very accurate.


Page 15

Results, Part B: Win/Draw/Loss (WDL) & Discussion

• As the imbalance level increases, the performance of CBCCs improves.

• Overall, CBCC1 performs better than CBCC2.

• In all cases, CBCC1 either outperforms or is statistically similar to traditional CC.

• It is beneficial to employ CBCC1 instead of traditional CC even when the imbalance level is not very high.


Page 16

Results: Measures

• Self-improvement:

– Reflects how much the performance of an algorithm varies when the noise or imbalance level changes.

– S_self(a, n_j) = (score(a, n_1) − score(a, n_j)) / score(a, n_1) × 100, where n_1 is the reference setting

• Relative improvement:

– Shows how well or poorly an algorithm performs in comparison with the baseline (i.e., DECC).

– S_relative(a, n_j) = (score(b, n_j) − score(a, n_j)) / score(b, n_j) × 100, where b is the baseline (DECC)
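The two measures above, as reconstructed from the garbled slide formulas, can be computed as follows (score(a, n_j) denotes algorithm a's score at setting n_j; lower scores are assumed better since these are minimization problems):

```python
def self_improvement(score_ref, score_new):
    """Percentage change of an algorithm's own score when the noise or
    imbalance level moves from the reference setting to a new one."""
    return (score_ref - score_new) / score_ref * 100.0

def relative_improvement(score_baseline, score_alg):
    """Percentage improvement of an algorithm over the baseline (DECC)
    at the same setting; positive means the algorithm beats the
    baseline when lower scores are better."""
    return (score_baseline - score_alg) / score_baseline * 100.0
```

Self-improvement compares an algorithm against itself across settings, while relative improvement compares two algorithms at the same setting; the two are deliberately complementary.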


Page 17

• Major trend: Increasing the noise level reduces the performance of CC and CBCCs in most cases because it:

– increases the interaction between components (less separability)

– creates very large components (up to 500 variables)

Results, Part A: Self-improvement


Page 18

• Minor trend: Increasing the noise level improves the performance of CC and CBCC1 in some cases (e.g., f6) because:

– The noise may spread out the variables of the most contributing component, which results in a less imbalanced problem.

Results, Part A: Self-improvement


Page 19

• Main trend: Increasing the noise level decreases the relative improvements of both CBCCs.

• Minor trend: CBCC2 is more sensitive to the decomposition accuracy than CBCC1.

Results, Part A: Relative improvement


Page 20

• Main trend: Increasing the imbalance level improves the relative improvements of both CBCCs.

• Minor trend: CBCC2 is very sensitive to the imbalance level, while CBCC1 performs relatively well in almost all situations.

Results, Part B: Relative improvement


Page 21

Conclusion and Future Work

Conclusion:

1. The CBCC framework is still effective even when the decomposition accuracy is poor, or the imbalance is marginal.

2. CBCC1 is more effective in realistic settings than CBCC2.

Future Work:

1. Studying the sensitivity of CBCC to other types of decomposition errors:

– Breaking a non-separable component into several smaller components

– Merging several separable components into one larger component

2. Investigating the sensitivity of CC (in general) and CBCC (in particular) to the cycle length of the sub-problem optimizer:

– Accuracy vs. risk of concept drift

3. Improving existing CBCC variants:

– A better exploration-exploitation balance will result in a more efficient algorithm.


Page 22

Thank you ☺

Any questions or comments?
