Computational Intelligence in Expensive Optimization Problems (eBook)
800 pages
Springer Berlin (publisher)
978-3-642-10701-6 (ISBN)
In modern science and engineering, laboratory experiments are increasingly replaced by high-fidelity, computationally expensive simulations. Using such simulations reduces costs and shortens development times, but it also introduces new challenges to the design optimization process. Examples of such challenges include a limited computational budget for simulation runs and a complicated response surface relating the simulation inputs to its outputs.
Under such difficulties, classical optimization and analysis methods may perform poorly. This motivates the application of computational intelligence methods such as evolutionary algorithms, neural networks and fuzzy logic, which often perform well in such settings. This is the first book to introduce the emerging field of computational intelligence in expensive optimization problems. Topics covered include: dedicated implementations of evolutionary algorithms, neural networks and fuzzy logic; reduction of expensive evaluations (modelling, variable fidelity, fitness inheritance); frameworks for optimization (model management, complexity control, model selection); parallelization of algorithms (implementation issues on clusters, grids and parallel machines); incorporation of expert systems and human-system interfaces; single- and multi-objective algorithms; data mining and statistical analysis; and analysis of real-world cases (such as multidisciplinary design optimization).
This edited book provides both theoretical treatments and real-world insights gained from experience, all contributed by leading researchers in the respective fields. As such, it is a comprehensive reference for researchers, practitioners, and advanced-level students interested in both the theory and practice of using computational intelligence for expensive optimization problems.
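The description above names fitness approximation, surrogate models and evolution control among the book's core themes. The toy sketch below (Python, not taken from the book) shows how these ideas typically interact: each generation of offspring is pre-screened with a cheap similarity-based surrogate, and only the most promising candidates receive the expensive true evaluation. The objective function, population size and control schedule are illustrative assumptions.

```python
# Minimal sketch of a surrogate-assisted evolutionary loop.
# Assumptions (not from the book): the sphere function stands in for the
# expensive simulation; population size, mutation step and the
# "evaluate the top quarter exactly" schedule are arbitrary choices.
import random

random.seed(0)
DIM, POP, GENS = 5, 20, 30

archive = []  # (point, true fitness) pairs gathered from expensive evaluations


def expensive_objective(x):
    """Stand-in for a costly simulation run (here: the sphere function)."""
    return sum(xi * xi for xi in x)


def evaluate_exactly(x):
    """Run the expensive objective and store the result for the surrogate."""
    f = expensive_objective(x)
    archive.append((x, f))
    return f


def surrogate(x, k=3):
    """Similarity-based fitness approximation: average of the k nearest
    archived true evaluations (falls back to an exact run while the
    archive is still too small)."""
    if len(archive) < k:
        return evaluate_exactly(x)
    nearest = sorted(
        archive,
        key=lambda pf: sum((a - b) ** 2 for a, b in zip(pf[0], x)),
    )[:k]
    return sum(f for _, f in nearest) / k


population = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(POP)]
for gen in range(GENS):
    # Create offspring by Gaussian mutation of randomly chosen parents.
    offspring = [[xi + random.gauss(0, 0.5) for xi in random.choice(population)]
                 for _ in range(POP)]
    # Pre-screen all offspring with the cheap surrogate ...
    offspring.sort(key=surrogate)
    # ... and spend true (expensive) evaluations only on the most promising
    # quarter -- a simple form of evolution control / model management.
    for x in offspring[: POP // 4]:
        evaluate_exactly(x)
    population = offspring  # surrogate-ranked offspring become the next parents

best_point, best_fitness = min(archive, key=lambda pf: pf[1])
print(f"best true fitness {best_fitness:.4f} after {len(archive)} expensive evaluations")
```

The point of this pattern is that the number of expensive simulation runs scales with the fraction of candidates sent for exact evaluation, not with the total number of candidates the evolutionary search considers.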
Preface 7
Contents 13
Part I Techniques for Resource-Intensive Problems 28
A Survey of Fitness Approximation Methods Applied in Evolutionary Algorithms 29
Introduction 29
Fitness Approximation Methods 31
Instance-Based Learning Methods 31
Machine Learning Methods 33
Statistical Learning Methods 35
Existing Research in Multi-surrogate Assisted EAs 37
Comparative Studies for Different Approximate Models 40
The Working Styles of Fitness Approximation 43
Direct Fitness Replacement Methods 43
Indirect Fitness Approximation Methods 43
The Management of Fitness Approximation 44
Evolution Control 44
Offline Model Training 45
Online Model Updating 45
Hierarchical Approximate Models and Model Migration 46
Case Studies: Two Surrogate-Assisted EA Real-World Applications 46
The Welded Beam Design Domain 46
Supersonic Aircraft Design Domain 47
Final Remarks 49
References 50
A Review of Techniques for Handling Expensive Functions in Evolutionary Multi-Objective Optimization 55
Introduction 55
Basic Concepts 56
Pareto Dominance 57
Pareto Optimality 57
Pareto Front 58
Knowledge Incorporation 58
Surrogates 59
Polynomials: Response Surface Methods (RSM) 60
Gaussian Process or Kriging 61
Radial Basis Functions 62
Artificial Neural Networks 63
Support Vector Machines 64
Clustering 66
Fitness Inheritance 67
Real-World Applications 68
Use of Problem Approximation 69
Use of RSM by Polynomial Approximation 71
Use of Artificial Neural Networks 72
Use of a Gaussian Process or Kriging 74
Use of Clustering 78
Use of Radial Basis Functions 78
Conclusions and Future Research Paths 79
References 80
Multilevel Optimization Algorithms Based on Metamodel- and Fitness Inheritance-Assisted Evolutionary Algorithms 86
Introduction 87
Metamodel–Assisted EAs and Distributed MAEAs 89
Surrogate Evaluation Models for MAEAs 90
Fitness Inheritance 90
Radial Basis Function (RBF) Networks 91
Assessment of MAEA and DMAEA 92
Multilevel Search Algorithms and the Underlying Hierarchy 93
The Three Multilevel Modes – Defining a HDMAEA 93
Distributed Hierarchical Search – DHMAEA vs. HDMAEA 95
Assessment of Multilevel–Hierarchical Optimization 96
Optimization of an Annular Cascade 100
Conclusions 104
References 105
Knowledge-Based Variable-Fidelity Optimization of Expensive Objective Functions through Space Mapping 110
Introduction 110
Space Mapping Optimization 113
Formulation of the Space Mapping Algorithm 113
Space Mapping Surrogate Models 115
Characterization of Space Mapping 116
Practical Issues and Open Problems 117
Space Mapping Illustration 117
Space Mapping Efficiency 120
Example 1: Microstrip Bandpass Filter 120
Example 2: Ring Antenna 124
Discussion 126
Exploiting Extra Knowledge: Tuning Space Mapping 126
Tuning Space Mapping Formulation 127
TSM Optimization of Chebyshev Bandpass Filter 128
Summary 131
Conclusions 131
References 132
Reducing Function Evaluations Using Adaptively Controlled Differential Evolution with Rough Approximation Model 135
Introduction 135
Optimization and Approximation Models 137
Optimization Problems 137
Evolutionary Algorithms Using Approximation Models 137
Estimated Comparison Method 138
Rough Approximation Model 139
Potential Model 139
Estimated Comparison 140
Adaptive Control 141
Differential Evolution with the Estimated Comparison Method 142
Differential Evolution 142
Adaptive DE with the Estimated Comparison Method 143
Numerical Experiments 144
Test Problems 144
Conditions of Experiments 146
Experimental Results 146
Discussion 150
Conclusions 152
References 152
Kriging Is Well-Suited to Parallelize Optimization 154
Introduction 155
Motivations: Efficient Optimization Algorithms for Expensive Computer Experiments 155
Where Computational Intelligence and Kriging Meet 155
Towards Kriging-Based Parallel Optimization: Summary of Obtained Results and Outline of the Chapter 157
Background in Kriging for Sequential Optimization 158
The Ordinary Kriging Metamodel and Its Gaussian Process Interpretation 158
Kriging-Based Optimization Criteria 160
The Multi-points Expected Improvement (q-EI) Criterion 165
Analytical Calculation of 2-EI 166
q-EI Computation by Monte Carlo Simulations 170
Approximated q-EI Maximization 172
A First Greedy Strategy to Build a q-Points Design with the 1-Point EI 173
The Kriging Believer (KB) and Constant Liar (CL) Strategies 173
Empirical Comparisons with the Branin-Hoo Function 174
Towards Kriging-Based Parallel Optimization: Conclusion and Perspectives 178
Appendix 180
Gaussian Processes for Machine Learning 180
Conditioning Gaussian Vectors 180
Simple Kriging Equations 181
Ordinary Kriging Equations 182
References 183
Analysis of Approximation-Based Memetic Algorithms for Engineering Optimization 186
Memetic Algorithms and Computer-Aided Design 187
Approximation-Based Memetic Algorithms 191
Varying Accuracy in Black-Box Functions 193
Approximation-Based Local Search 194
Analysis of Memetic Algorithms 198
Convergence Analysis 198
Computational Cost 203
Numerical Results 205
Analytical Problems 205
Electromagnetic Benchmark Problem 208
Final Remarks 211
References 212
Opportunities for Expensive Optimization with Estimation of Distribution Algorithms 215
Introduction 215
Estimation of Distribution Algorithms 217
Boltzmann Estimation of Distribution Algorithms 217
Three Opportunities for Enhancing the Efficiency of EDAs 219
Right-Sized Selection 219
Probabilistic Elitism 220
Maximum Entropy 221
Evolutionary Backtracking from Log-Probability Landscapes 222
Selecting a Bayesian EDA Algorithm 223
Partial Evaluation in Boltzmann EDAs 224
A Simple Algorithm for Partial Evaluation with Backtracking 227
Regularization in Estimation of Distribution Algorithms 230
Entropic Mutation 230
Shrinkage Estimation of Distribution Algorithms 232
Summary and Conclusions 235
Appendix 236
References 238
On Similarity-Based Surrogate Models for Expensive Single- and Multi-objective Evolutionary Optimization 241
Introduction 241
The Optimization Problem 243
Surrogate-Assisted Evolutionary Optimization 243
Similarity-Based Surrogate Models (SBSMs) 244
Surrogate-Assisted Framework 246
The Surrogate-Assisted Evolutionary Algorithm 247
Computational Experiments 249
Single-Objective Optimization 251
Multi-objective Optimization 257
Concluding Remarks 266
References 267
Multi-objective Model Predictive Control Using Computational Intelligence 271
Introduction 272
Meta-modeling Using Computational Intelligence 272
Aspiration Level Approach to Interactive Multi-objective Optimization 277
Multi-objective Model Predictive Control 279
Concluding Remarks 285
References 285
Improving Local Convergence in Particle Swarms by Fitness Approximation Using Regression 287
Introduction 287
Background 289
Fitness Approximation 289
Particle Swarms 290
Speciated Particle Swarms 291
Guaranteed Convergence PSO 292
mQSO 292
Using Regression to Locate Optima 293
Experimental Setup 296
Static Functions 297
Moving Peaks 299
Results 303
Static Functions 303
Moving Peaks 305
Conclusion 313
References 313
Part II Techniques for High-Dimensional Problems 316
Differential Evolution with Scale Factor Local Search for Large Scale Problems 317
Introduction 317
Differential Evolution for Large Scale Problems 319
Differential Evolution 319
Self-Adaptive Control Parameter Update 321
Population Size Reduction 321
Scale Factor Local Search 322
Numerical Results 325
Results in 100 Dimensions 327
Results in 500 Dimensions 330
Results in 1000 Dimensions 333
Discussion about the Algorithmic Components 336
Conclusion 340
References 341
Large-Scale Network Optimization with Evolutionary Hybrid Algorithms: Ten Years' Experience with the Electric Power Distribution Industry 344
Introduction 344
Optimization of Electric Power Distribution Networks 346
Evolutionary Approach 349
Working within the Feasibility Domain 350
Lamarckian Hybridization 354
Application Examples and Illustration 356
Summary 361
References 361
A Parallel Hybrid Implementation Using Genetic Algorithms, GRASP and Reinforcement Learning for the Traveling Salesman Problem 363
Introduction 363
Theoretical Foundation 365
GRASP Metaheuristic 365
Genetic Algorithm 365
Reinforcement Learning: Q-Learning Algorithm 366
Hybrid Methods Using Metaheuristic and Reinforcement Learning 367
GRASP-Learning 368
Genetic-Learning 369
Parallel Hybrid Implementation Proposed 370
Methodology 370
Experimental Results 372
The Traveling Salesman Problem 373
Computational Test 374
Conclusions 386
References 386
An Evolutionary Approach for the TSP and the TSP with Backhauls 388
The Traveling Salesman Problem 389
Conventional TSP Heuristics 390
Metaheuristics for the TSP 391
Evolutionary TSP Algorithms 391
The TSP with Backhauls 392
Outline 393
The First Evolutionary Algorithm for the TSP 393
Generating Offspring from the Union Graph 394
Nearest Neighbor Crossover (NNX) 394
Greedy Crossover (GX) 396
Combined Use of the Crossover Operators 397
Proposed Mutation Operators 397
Other Settings of the First Evolutionary Algorithm 398
Computational Results for the TSP 400
The Second Evolutionary Algorithm for the TSP and the TSPB 404
More Than Two Parents and Multiple Offspring 404
Improved Mutation Operators 406
Computational Results for the TSP 407
Computational Results for the TSPB 408
Conclusion 411
References 412
Towards Efficient Multi-objective Genetic Takagi-Sugeno Fuzzy Systems for High Dimensional Problems 414
Introduction 414
The Takagi-Sugeno Fuzzy Systems 416
Time Complexity 417
Fast Identification of Consequent Parameters 419
Reuse of Computed Parameters 420
Reuse in the Application of Mating Operators 420
Speeding Up the Calculus of the Output through Reuse 422
Speeding Up the Calculus of the Activation Degrees through Reuse 423
The Used MOEA to Learn TS Rules 423
Experimental Results 424
Regression Problem 426
Time Series Forecasting Problem 430
Conclusions 437
References 438
Evolutionary Algorithms for the Multi Criterion Minimum Spanning Tree Problem 440
Introduction 440
Problem Definition 441
Solution Approaches 442
Knowledge-Based Evolutionary Algorithm (KEA) 446
Phases of KEA 447
Experimental Results 449
Benchmark Data Sets 449
Benchmark Algorithms 449
Performance Indicators 450
Numerical Results 451
Experimental Complexity of KEA and NSGA2 454
Scalability of KEA 457
Tri-Criterion Instances 458
Alternative Algorithms Based on KEA 459
Discussions and Analysis 464
Conclusions 465
References 466
Loss-Based Estimation with Evolutionary Algorithms and Cross-Validation 470
Introduction 470
Loss-Based Estimation with Cross-Validation 472
The Estimation Road Map 472
Estimator Selection Procedure 473
The Parameter Space for Polynomial Regression 474
Parametrization 474
Constraints 475
Size of the Parameter Space 476
Evolutionary Algorithms as Risk Optimization Procedures 479
Proposed EA 480
Simulation Studies 485
Simulation Study Design 485
Simulation Study Results 487
Data Analysis for a Diabetes Study 493
Conclusion 498
Appendix 499
References 500
Part III Real-World Applications 502
Particle Swarm Optimisation Aided MIMO Transceiver Designs 503
Introduction 503
Particle Swarm Optimisation 505
PSO Algorithm 506
Complexity of PSO Algorithm 508
Choice of PSO Algorithmic Parameters 508
PSO Aided Semi-blind Joint ML Estimation 509
MIMO System Model 509
Semi-blind Joint ML Channel Estimation and Data Detection 510
PSO Aided Semi-blind Joint ML Scheme 512
Simulation Study 513
PSO Based MBER Multiuser Transmitter Design 515
Downlink of SDMA Induced MIMO System 516
MBER MUT Design 517
PSO Aided MBER MUT Design 518
Simulation Study 519
Conclusions 524
References 524
Optimal Design of a Common Rail Diesel Engine Piston 528
Presentation 528
Introduction 529
Automatic Definition of the Computational Mesh 531
Single-objective and Multi-objective Approaches 537
Optimization Algorithms 539
Implementations on Computation Platform and Grids 540
Test Case 541
Problem Specifications 542
Engine Simulation Code 543
HIPEGEO 544
Required Computational Time 549
Analysis of the Results 549
Conclusions 552
References 554
Robust Preliminary Space Mission Design under Uncertainty 557
Introduction 557
Uncertainties in Space Mission Design 559
Modelling Uncertainties through Evidence Theory 560
Frame of Discernment U, Power Set 2^U and Basic Probability Assignment 560
Belief and Plausibility Functions 562
Cumulative Functions: CBF, CCBF, CPF, CCPF 563
Solution of the OUU Problem 565
Problem Formulation 565
Direct Solution through a Population-Based Genetic Algorithm 567
Indirect Solution Approach 569
A Case Study: The BepiColombo Mission 573
Spacecraft Mass Model 573
The BPA Structure 576
Results and Comparisons 577
Direct Solution Simulations 578
Indirect Solution Simulations 580
Conclusions 582
References 582
Progressive Design Methodology for Design of Engineering Systems 585
Introduction 585
Progressive Design Methodology 587
Synthesis Phase of PDM 589
System Requirements Analysis 590
Definition of System Boundaries 591
Determination of Performance Criterion/Criteria 591
Selection of Variables and Sensitivity Analysis 592
Development of System Model 593
Deciding on the Optimization Strategy 594
Intermediate Analysis Phase of PDM 596
Identification of New Set of Criteria 598
Linguistic Term Set 598
Semantic of Linguistic Term Set 599
Aggregation Operator for Linguistic Weighted Information 599
Final Analysis Phase of PDM 603
Model Suitable for PDM 603
Synthesis Phase of PDM for Design of a BLDC Motor Drive 604
Requirement Analysis 604
Definition of System Boundaries 605
Determination of Performance Criteria 605
Development of System Model 606
Optimisation Strategy 608
Results of Multiobjective Optimisation 608
Intermediate Analysis Phase of PDM for Design of a BLDC Motor Drive 609
Identification of New Set of Objectives 611
Linguistic Term Set 612
The Semantic of Linguistic Term Set 612
Aggregation Operator for Linguistic Weighted Information 613
The Screening Process 613
Final Analysis Phase of PDM for Design of a BLDC Motor Drive 614
Detailed Simulation Model 614
Independent Design Variables and Objectives 614
Set of Solutions 616
Conclusions 618
References 619
Reliable Network Design Using Hybrid Genetic Algorithm Based on Multi-Ring Encoding 622
Reliable Network Design 622
Problem Formulation 624
Network Reliability 625
Reliability Metrics 625
Reliability Estimation 626
Previous Works 628
Genetic Algorithm 628
Ant Colony Optimization 628
Hybrid Heuristics 629
Solution Representation 630
Multi-Ring Encoding 630
Contraction Model 631
Hybrid Genetic Algorithm 633
Representation and Initialization 634
Repair 635
Fitness Evaluation 635
Parent Selection and Offspring Generation 636
Mutation 637
Local Search Ant Colony System 638
Selection for New Generation 640
Numerical Results 640
References 645
Isolated Word Analysis Using Biologically-Based Neural Networks 649
Neural Engineering Speech Recognition Using the Genetic Algorithm 649
Input Description 654
Synapse Connectivity: The Dynamic Synapse Neural Network 656
DSNN Architecture 658
Synapse Functionality 659
Synapse Optimization 662
Biased Selection: Word Micro-environment 663
Objective Function: Fitness Evaluation 664
Score Function Rationale 666
Genetic Algorithm: Mutation, Elitism, and Micro-environment 669
Output Description and Analysis 670
Integrated Responses 670
Dynamic Model Emergence Via Cost-Weighted Classification 673
System Output Visualization 676
Discussion 677
Sound Processing Via the Dorsal Cochlear Nucleus 679
References 680
A Distributed Evolutionary Approach to Subtraction Radiography 683
Introduction 683
Background 685
The Image Registration Problem 685
High Performance Computing 688
A Grid Computing Framework for Medical Imaging 690
Case Study: Automatic Subtraction Radiography Using Distributed Evolutionary Algorithms 692
Problem Statement 693
Parametric Transformations 693
Similarity Measure 694
Optimization Problem 696
Interpolation Approach 696
Search Strategy 696
Algorithms Distribution 698
Algorithms Validation 699
The Subtraction Service 704
Discussion and Conclusions 706
Appendix 707
References 710
Speeding-Up Expensive Evaluations in High-Level Synthesis Using Solution Modeling and Fitness Inheritance 713
Introduction 713
Related Work 715
Design Space Exploration with Multi-Objective Evolutionary Computation 716
Design Space Exploration Core 717
Genetic Algorithm Design 718
Performance Measure 719
Solution Evaluation 719
Building Cost Models 721
Cost Models 722
Experimental Evaluation 723
Accuracy of Models 724
Performance of the Methodology 725
Fitness Inheritance 726
Inheritance Model 727
Experimental Evaluation 728
Weighting Functions 729
Parameter Analysis 732
Conclusions 733
References 733
Index 736
| Publication date (per publisher) | March 10, 2010 |
|---|---|
| Series | Adaptation, Learning, and Optimization |
| Additional information | 800 p. 270 illus. |
| Place of publication | Berlin |
| Language | English |
| Subject area | Computer Science ► Theory / Studies ► Artificial Intelligence / Robotics |
| | Mathematics / Computer Science ► Mathematics |
| | Technology |
| Keywords | algorithm • algorithms • Computational Intelligence • Control • Data Mining • Engineering Design Optimization • Evolution • evolutionary algorithm • evolutionary algorithms • Expensive Problems • fuzzy • Intelligence • Model • Modeling • neural network • Neural networks • Optimization • Simulation |
| ISBN-10 | 3-642-10701-X / 364210701X |
| ISBN-13 | 978-3-642-10701-6 / 9783642107016 |
Size: 19.2 MB
DRM: digital watermark
This eBook contains a digital watermark and is therefore personalized for you. If the eBook is passed on to third parties without authorization, it can be traced back to its source.
File format: PDF (Portable Document Format)
With its fixed page layout, PDF is particularly well suited to specialist books with columns, tables and figures. A PDF can be displayed on almost any device, but is only of limited use on small displays (smartphone, eReader).
System requirements:
PC/Mac: You can read this eBook on a PC or Mac. You will need a PDF viewer, e.g. Adobe Reader or Adobe Digital Editions.
eReader: This eBook can be read with (almost) all eBook readers. However, it is not compatible with the Amazon Kindle.
Smartphone/Tablet: Whether Apple or Android, you can read this eBook. You will need a PDF viewer, e.g. the free Adobe Digital Editions app.
Additional feature: online reading
In addition to downloading it, you can also read this eBook online in your web browser.
Buying eBooks from abroad
For tax law reasons, we can only sell eBooks within Germany and Switzerland. Regrettably, we cannot fulfill eBook orders from other countries.