Derivative-Free Optimization (DFO)
Disciplines
Computer Sciences (60%); Mathematics (40%)
Keywords
- Black box optimization
- Noisy optimization
- Benchmarking and tuning
- Constraints
- Solver selection
- Test environment
To improve an existing product or procedure, one must pick the best values for all design variables that influence its quality. Choosing these decision variables also requires ensuring that the necessary constraints are satisfied. For example, some variables can take any value between a lowest and a highest value; others can take only a few values, or even just two (e.g., on and off); often the sum of some values must lie below or above some threshold, etc. In many applications in science and engineering, the quantitative assessment of quality is based on expensive measurements or complex simulations. Since no mathematical formula is available for this, the search for the best design decision is like the search for the deepest point of a dark lake, where the only information that can be gathered consists of local depth measurements taken from a boat on the surface. One can use such measurements to predict the shape of the lake bottom well enough to guide the boat towards the deepest spot. Good optimization methods keep the number of measurements needed to arrive at the deepest spot small. General optimization problems are solved analogously. In place of the depth meter, a complex black box, e.g., a piece of measurement or simulation software, returns, upon the input of a choice of decision variables (the position of the boat), a value (the depth measurement) that encodes the closeness to the goal. From the previous outputs of the black box, a black box solver calculates new values of the decision variables as input for the black box, in such a way that one ultimately arrives at the optimal decision (the boat above the deepest spot). Our group has long-term expertise in the design of optimization algorithms. Our newest black box solvers represent the state of the art in the most important case, where the decision variables are not subject to constraints.
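The measure-and-steer loop described above can be sketched in a few lines as a minimal coordinate pattern search. This is a generic textbook scheme for illustration only, not the group's actual solver, and the quadratic `black_box` below is a hypothetical placeholder for an expensive measurement or simulation:

```python
def black_box(x):
    # Hypothetical stand-in for an expensive measurement or simulation.
    return (x[0] - 1.2) ** 2 + (x[1] + 0.7) ** 2

def pattern_search(f, x, step=1.0, tol=1e-6, max_evals=1000):
    """Minimal derivative-free pattern search: probe each coordinate
    direction; accept the first improving probe, otherwise halve the step
    ("take a depth measurement, steer the boat, refine the search")."""
    fx, evals = f(x), 1
    while step > tol and evals < max_evals:
        improved = False
        for i in range(len(x)):
            for s in (step, -step):
                y = list(x)
                y[i] += s
                fy = f(y)
                evals += 1
                if fy < fx:
                    x, fx, improved = y, fy, True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5        # no direction helped: refine the mesh
    return x, fx, evals

x_best, f_best, n_evals = pattern_search(black_box, [0.0, 0.0])
```

The evaluation counter `n_evals` plays the role of the number of depth measurements: a good method reaches the "deepest spot" (here, the minimizer (1.2, -0.7)) with as few black-box calls as possible.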
In our project we want to extend our solvers so that they can handle arbitrary constraints. We aim at robust (rarely failing) and efficient (succeeding with few calls of the black box) software that copes with a large number of decision variables and with expensive, often inaccurate values of the quality measure. A final, important part of the project is the tuning of the software to be developed and its comparison with existing black box optimization packages. Tuning refers to the process of choosing the parameters of a solver design (which affect the details of its behavior) in such a way that its performance is optimized. This is itself a black box optimization problem. Thus we shall do the tuning by applying our black box solver, in Münchhausen-like fashion, to tune itself.
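The Münchhausen-style tuning idea can be illustrated as follows: a solver parameter becomes the decision variable, and the solver's cost (number of black-box calls) on a training problem becomes the objective of an outer black-box optimization. The toy solver and all names below are hypothetical sketches, not the project's software; the outer search here is a coarse candidate scan where a real tuner would apply the derivative-free solver itself:

```python
def solver(f, x, shrink, tol=1e-5, max_evals=10_000):
    """Toy 1-D derivative-free solver (for illustration only): probe left
    and right of x; when neither probe improves, multiply the step by
    `shrink`. Returns the number of black-box calls used."""
    fx, step, evals = f(x), 1.0, 1
    while step > tol and evals < max_evals:
        for cand in (x + step, x - step):
            fc = f(cand)
            evals += 1
            if fc < fx:
                x, fx = cand, fc
                break
        else:               # neither probe improved: shrink the step
            step *= shrink
    return evals

def tuning_objective(shrink):
    # Performance of the solver on a training problem, seen as a black box.
    return solver(lambda t: (t - 0.3) ** 2, 5.0, shrink)

# Tune the solver parameter by a coarse black-box search over candidates;
# the Münchhausen idea replaces this scan by the solver tuning itself.
best_shrink = min((s / 10 for s in range(1, 10)), key=tuning_objective)
```

Note that `tuning_objective` treats the solver exactly like any other black box: only its output (the evaluation count) is observed, never a formula or derivative.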
This two-year project led to a software suite for efficiently solving optimization problems in which the objective to be optimized is available only through a black box providing a score for any given scenario. Optimization is the basic technique for improving existing products or procedures to better match a desired objective under restrictions encoded in the constraints. A growing number of applications in science and engineering have objectives without a well-defined functional description, e.g., when function values arise from expensive measurements or complex simulations. This has fueled a resurgence of interest in derivative-free optimization algorithms, which work specifically with functions and constraints for which no derivative information is available. From the previous outputs of the black box, a black box solver calculates new values of the decision variables as input scenarios for the black box, in such a way that one ultimately arrives at the optimal decision. Our group has long-term expertise in derivative-free optimization, contributing both to deterministic and stochastic algorithms with theoretical convergence guarantees and to easily parallelizable stochastic algorithms based on the heuristic adaptation of populations of good points. In this project, we extended the methods implemented in our earlier unconstrained black box solvers to the case of bound constraints (lower and upper bounds on the decision variables), integer constraints (where certain decision variables can take only integral values), and linear inequality and equality constraints (where certain linear expressions must match or exceed a given bound). We aimed at a robust and efficient derivative-free solver suite that copes with expensive, often noisy function values and with high dimensions, the bottleneck of current techniques.
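One common ingredient for handling the bound and integer constraints mentioned above is projecting trial points onto the feasible set before they are passed to the black box. The sketch below is a simple generic scheme (clip to the bounds, then round the integer-constrained variables), given for illustration only and not necessarily the scheme used in the released solvers:

```python
import math

def project(x, lo, hi, integer_mask):
    """Map a trial point onto the feasible set of bound constraints
    (lo <= x <= hi) and integer constraints, via a simple scheme:
    clip each variable to its bounds, then round integer-constrained
    variables and re-clip them to the integers inside the bounds."""
    y = []
    for xi, l, h, is_int in zip(x, lo, hi, integer_mask):
        v = min(max(xi, l), h)                # enforce lower/upper bounds
        if is_int:
            v = round(v)                      # enforce integrality
            v = min(max(v, math.ceil(l)), math.floor(h))
        y.append(v)
    return y

# x0 must be integral in [0, 5]; x1 is continuous in [-2, 2]; x2 in [0, 1].
feasible = project([2.7, -3.2, 0.4], [0, -2, 0], [5, 2, 1],
                   [True, False, False])      # -> [3, -2, 0.4]
```

Linear equality and inequality constraints require more than componentwise clipping (e.g., projection onto a polyhedron), which is one reason the constrained extension of the solvers is a substantial step beyond the unconstrained case.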
We exploited in particular local effective low-dimensionality by employing appropriate subspace techniques (which concentrate the search by focusing on adaptively constructed subproblems with only a few auxiliary variables). We integrated into our solvers various known and some new techniques for handling the bound constraints, integer constraints, and linear constraints, selecting the best from the many variants tried. To facilitate the testing of our algorithms and their comparison with those of others, we extended our recent optimization test environment to the constrained case. The released software includes the test environment, the new solver suite, and some of the most competitive algorithms by other authors. The associated publications demonstrate the theoretical efficiency and the practical state-of-the-art performance compared with other solvers. We also implemented a preliminary (and not yet released) self-tuning environment that applies the new solvers to optimize their own parameters and available choices.
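The subspace idea can be sketched as follows: instead of searching all dimensions at once, each cycle restricts the search to the span of a few random directions, so only a handful of auxiliary variables are optimized at a time. This is a simplified random-subspace search for illustration, not the released solvers, and `f_highdim` is a hypothetical placeholder for an expensive objective:

```python
import random

def f_highdim(x):
    # Hypothetical expensive objective in 20 dimensions (placeholder only).
    return sum((xi - 1.0) ** 2 for xi in x)

def subspace_cycle(f, x, k=2, trials=50, sigma=0.5):
    """One cycle of a simplified random-subspace search: draw k random
    directions, sample trial points only in their span (k auxiliary
    variables instead of len(x) full ones), and keep the best point."""
    n = len(x)
    D = [[random.gauss(0.0, 1.0) for _ in range(n)] for _ in range(k)]
    best, fbest = x, f(x)
    for _ in range(trials):
        a = [random.gauss(0.0, sigma) for _ in range(k)]  # auxiliary vars
        y = [x[i] + sum(a[j] * D[j][i] for j in range(k)) for i in range(n)]
        fy = f(y)
        if fy < fbest:
            best, fbest = y, fy
    return best, fbest

random.seed(0)
x = [0.0] * 20          # start far from the optimum at (1, ..., 1)
for _ in range(40):     # a few subspace cycles, monotonically improving
    x, fx = subspace_cycle(f_highdim, x)
```

Each cycle evaluates the black box on points parameterized by only `k` auxiliary variables, which is the source of the savings in high dimensions; adaptive (rather than purely random) subspace construction is what distinguishes the techniques described above from this sketch.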
- Universität Wien - 100%
Research Output
- 6 Publications
2025
- Kimiaei M: MATRS: heuristic methods for noisy derivative-free bound-constrained mixed-integer optimization. Journal article, Mathematical Programming Computation. DOI: 10.1007/s12532-025-00281-3
2023
- Hassan Ibrahim A: A subspace inertial method for derivative-free nonlinear monotone equations. Journal article, Optimization. DOI: 10.1080/02331934.2023.2252849
2024
- Azmi B: An active set method for bound-constrained optimization. Journal article, Optimization Methods and Software. DOI: 10.1080/10556788.2024.2339215
- Kimiaei M: An improvement of the Goldstein line search. Journal article, Optimization Letters. DOI: 10.1007/s11590-024-02110-3
- Kimiaei M: Globally linearly convergent nonlinear conjugate gradients without Wolfe line search. Journal article, Numerical Algorithms. DOI: 10.1007/s11075-024-01764-5
- Kimiaei M: Effective matrix adaptation strategy for noisy derivative-free optimization. Journal article, Mathematical Programming Computation. DOI: 10.1007/s12532-024-00261-z