Aula Beltrami, Dipartimento di Matematica - Tuesday, December 10, 2019 h.16:00
Abstract. In adaptive precision optimization, the objective function can only be computed with a stochastic error whose standard deviation is controllable. A common strategy is to monotonically decrease this standard deviation during the optimization process, driving it asymptotically to zero so that the noise vanishes and convergence of the algorithm is guaranteed. The MpMads algorithm presented in this work follows this approach. A second algorithm, DpMads, is then introduced to explore an alternative strategy that does not force the standard deviation to decrease monotonically. Although the two strategies are proved to be theoretically equivalent, tests reveal practical differences. A derivative-free optimization framework is used (no assumption is made about the differentiability of the objective function), as MpMads and DpMads generalise the deterministic Mads algorithm designed for such a framework.
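To illustrate the monotone strategy in a derivative-free setting, the sketch below implements a toy coordinate poll in which the noise standard deviation sigma is halved together with the mesh size after every failed poll, so that sigma tends to zero as the iterates converge. This is only a minimal illustration of the idea, not the MpMads algorithm itself; the oracle noisy_f, the halving schedule, and all parameter names are assumptions made for the example.

    import numpy as np

    def noisy_f(x, sigma, rng):
        # Hypothetical noisy oracle: a placeholder smooth objective plus
        # Gaussian noise with controllable standard deviation sigma.
        return float(np.sum(x ** 2)) + rng.normal(0.0, sigma)

    def monotone_precision_search(x0, sigma0=1.0, delta0=1.0,
                                  max_iter=200, seed=0):
        # Toy coordinate poll with a monotonically decreasing noise level:
        # after every failed poll, both the mesh size delta and the noise
        # standard deviation sigma are halved, so sigma -> 0 asymptotically.
        rng = np.random.default_rng(seed)
        x = np.asarray(x0, dtype=float)
        sigma, delta = sigma0, delta0
        fx = noisy_f(x, sigma, rng)
        n = len(x)
        for _ in range(max_iter):
            improved = False
            for d in np.vstack([np.eye(n), -np.eye(n)]):  # 2n poll directions
                trial = x + delta * d
                ft = noisy_f(trial, sigma, rng)
                if ft < fx:
                    x, fx, improved = trial, ft, True
                    break
            if not improved:
                delta *= 0.5   # refine the mesh after a failed poll
                sigma *= 0.5   # tighten the evaluation precision with it
        return x, sigma, delta

    # Example: the iterates approach the minimiser of the placeholder objective.
    x, sigma, delta = monotone_precision_search([2.0, -1.5])

The non-monotone strategy explored by DpMads would differ precisely in the last step: sigma would not be forced to shrink at every refinement.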
This is joint work with Prof. Charles Audet and Sébastien Le Digabel from Polytechnique Montréal, and Stéphane Alarie from the Hydro-Québec research centre.
Università degli Studi di Pavia - Via Ferrata, 5 - 27100 Pavia
Tel +39.0382.985600 - Fax +39.0382.985602