
Publication

Adaptive Direct Search for Single- and Multi-Objective Black-Box Optimization

Book - Dissertation

With the growth of computational power in recent decades, the penetration of numerical methods into engineering workflows has drastically increased. Advanced design and analysis programs have become standard tools in engineering practice. Concurrently, the state of the art in numerical optimization has also been steadily advancing. Given the close interconnection between engineering and optimization, it is no surprise that the trend towards combining powerful numerical modeling and optimization tools has already started. However, there is still a mismatch between the two disciplines. For one, common assumptions of differentiability and transparency often do not hold for problems based on numerical simulations. Furthermore, it is often impossible to cast a design problem as an optimization problem with just a single objective. This dissertation aims to address these challenges by proposing novel methods for black-box optimization with one or multiple objective functions. The developed methods are fully non-intrusive, require no derivative information, and are robust to irregular phenomena such as hidden constraints and numerical noise. They are designed for efficiency in terms of black-box evaluations and generate feasible iterates at all times, so that restrictions on the computational budget can be easily taken into account. The multi-objective methods proposed in this work approximate the full Pareto front, giving as much information as possible on design trade-offs. On a conceptual level, the algorithms are designed to be understood, implemented, and used with minimal expert intervention.

Concerning single-objective optimization, the primary contribution is the gradient-informed generating set search (GIGS) method. GIGS is a directional direct search method for continuous, bound-constrained optimization. The method does not make explicit use of function gradients, but approximates the function topography by forming simplex gradients over the iterations. This information is used to redirect the search, resulting in an adaptive method that offers accelerated convergence without sacrificing robustness. In the present work, the algorithm is first clearly defined and its convergence is analyzed. It is then shown that GIGS compares favorably with state-of-the-art direct search methods, both on academic benchmark problems and on realistic engineering use cases.

Concerning multi-objective optimization, this dissertation introduces steepest-descent direct multisearch (SD-DMS), an adaptive directional direct search method for continuous optimization with multiple objectives. Like GIGS, SD-DMS exploits the particular form of directional methods to compute simplex gradients without additional function evaluations. Based on the information present in the simplex gradient, the search directions are chosen adaptively so as to attempt to improve all objectives simultaneously. In the final part of this dissertation, the SD-DMS algorithms are outlined and several strategies for reducing the overall computational load are provided. In addition, several numerical test cases are reported, in which the performance of SD-DMS is compared to that of other direct search methods representing the state of the art.
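The directional direct-search scheme that GIGS builds on can be illustrated with a minimal sketch. The following is not the dissertation's algorithm: it is a plain coordinate search over the generating set {±e_i} with step contraction on failed polls, plus a least-squares simplex-gradient estimate formed from the poll evaluations (the quantity a gradient-informed variant would use to reorder the search directions). All function names and parameter choices below are illustrative assumptions.

```python
import numpy as np

def coordinate_search(f, x0, step=1.0, tol=1e-6, max_evals=500):
    """Basic generating-set search: poll along +/- coordinate directions,
    move to the best improving point, contract the step when no poll
    improves. A simplex gradient is estimated from the poll values."""
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    n_evals = 1
    dim = len(x)
    dirs = np.vstack([np.eye(dim), -np.eye(dim)])  # generating set D = {+e_i, -e_i}
    while step > tol and n_evals + len(dirs) <= max_evals:
        vals = np.array([f(x + step * d) for d in dirs])
        n_evals += len(dirs)
        # Simplex gradient from the forward poll points (least-squares fit);
        # computed here only to show it costs no extra evaluations.
        g = np.linalg.lstsq(step * dirs[:dim], vals[:dim] - fx, rcond=None)[0]
        best = int(np.argmin(vals))
        if vals[best] < fx:          # successful poll: accept the move
            x = x + step * dirs[best]
            fx = vals[best]
        else:                        # unsuccessful poll: contract the step
            step *= 0.5
        # A gradient-informed method would reorder/redirect dirs using g here.
    return x, fx

x_min, f_min = coordinate_search(lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2,
                                 [0.0, 0.0])
```

Because every iterate is produced by evaluating f at explicit trial points, the scheme is derivative-free and non-intrusive in the sense described above; the simplex gradient reuses evaluations the poll step performs anyway.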
Year of publication: 2020
Accessibility: Open