Paper notes: Data-Driven Evolutionary Optimization: An Overview and Case Studies (Part 1) — the data-driven concept, paper structure, and data classification

Disclaimer: these are only my personal notes from reading the paper; corrections are welcome wherever my understanding is wrong.

Paper download link:

https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=8456559

Concept: what does "data-driven" mean?

Solving evolutionary optimization problems driven by data collected in simulations, physical experiments, production processes, or daily life are termed data-driven evolutionary optimization.

Observation:

Most existing research on EAs is based on an implicit assumption that evaluating the objectives and constraints of candidate solutions is easy and cheap. However, such cheap functions do not exist for many real-world optimization problems.

The popularity of EAs rests on the assumption that evaluating the objectives and constraints of candidate solutions is simple and cheap. The problem is that such a cheap evaluation function is usually only an approximation of reality, and in real applications each evaluation can still be quite expensive.

Data-related challenges in data-driven optimization


Purpose of the paper:

This paper aims to provide an overview of recent advances in the emerging research area of data-driven evolutionary optimization 

Overall structure of the paper:


Main components of data-driven evolutionary algorithms:


Classification of data:


Problems with off-line data-driven optimization and the corresponding handling methods


For on-line data-driven optimization methodologies, the key points and difficulties are:

How should new data be sampled? The newly sampled data are mainly used to refine the surrogate model step by step; if the sampling is done poorly, the structure of the constructed model may be distorted.


On-line data-driven methodologies: sampling strategies

Promising samples are located around the optimum of the surrogate model, and the accuracy of the surrogate model in the promising area is enhanced once the promising solutions are sampled [8], [14].

Uncertain samples are located in the search space where the surrogate model is likely to have a large approximation error and has not been fully explored by the EA.
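The two sampling criteria above can be illustrated with a minimal sketch. This is my own simplified toy example, not the paper's algorithm: the "expensive" objective is a 1-D quadratic, the surrogate is a polynomial fit, the promising sample is taken at the surrogate's optimum, and model uncertainty is approximated by the distance to already-evaluated points (a cheap stand-in for, e.g., kriging variance).

```python
import numpy as np

def true_f(x):
    # Stand-in for an expensive objective (illustration only)
    return (x - 0.3) ** 2

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, 5)        # initial expensive evaluations
y = true_f(X)

grid = np.linspace(0, 1, 201)   # candidate points the EA would explore

for _ in range(5):
    # Surrogate model: degree-2 polynomial fitted to the evaluated data
    coef = np.polyfit(X, y, 2)
    y_hat = np.polyval(coef, grid)

    # Promising sample: located around the optimum of the surrogate
    x_prom = grid[np.argmin(y_hat)]

    # Uncertain sample: point farthest from all evaluated points,
    # i.e. where the surrogate is least supported by data
    dist = np.min(np.abs(grid[:, None] - X[None, :]), axis=1)
    x_unc = grid[np.argmax(dist)]

    # Evaluate both with the expensive function and update the data,
    # so the surrogate is refined around the promising region while
    # the unexplored region is also covered
    for x_new in (x_prom, x_unc):
        X = np.append(X, x_new)
        y = np.append(y, true_f(x_new))

print(X[np.argmin(y)])  # best solution found so far
```

Sampling only promising points risks premature convergence to a wrong surrogate optimum; sampling only uncertain points wastes expensive evaluations on exploration, which is why the two criteria are usually combined.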

[8] Y. Jin, M. Olhofer, and B. Sendhoff, “On evolutionary optimization with approximate fitness functions,” in Proceedings of the Genetic and Evolutionary Computation Conference. Morgan Kaufmann Publishers Inc., 2000, pp. 786–793.

[14] Y. Jin, M. Olhofer, and B. Sendhoff, “A framework for evolutionary optimization with approximate fitness functions,” IEEE Transactions on Evolutionary Computation, vol. 6, no. 5, pp. 481–494, 2002.