
Romain Couderc

I came to Polytechnique Montréal for the first time during my master's thesis at Grenoble-INP Ensimag in June 2019, in the context of a joint supervision between Jean Bigeon and Charles Audet, my current Ph.D. supervisors. We then decided to continue the collaboration with a Ph.D. thesis on a subject proposed by Michael Kokkolaras: the robust optimization of multidisciplinary design problems.

Supervisor in Canada: Charles Audet from Polytechnique Montréal

Supervisor in France: Jean Bigeon from Grenoble INP

Co-supervisor: Michael Kokkolaras from McGill University

Contact: [email protected]

Abstract: The Mesh Adaptive Direct Search (MADS) algorithm is widely used to solve constrained optimization problems. It is a derivative-free, blackbox optimization algorithm. The idea of this thesis is to integrate the Cross Entropy (CE) method into this algorithm as a new search method for the NOMAD software. On the one hand, this allows a more global search for solutions. On the other hand, the CE method inherits the entire MADS convergence analysis. Numerical experiments show that this combination of the two algorithms appears promising.
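To fix ideas, here is a heavily simplified sketch of where a search step such as CE sampling fits inside a MADS-style iteration. It is not NOMAD's implementation: the function name mads_like, the mesh update rule and all parameter values are illustrative assumptions only.

```python
import numpy as np

def mads_like(f, x0, delta=1.0, n_iters=100, seed=0):
    """Heavily simplified MADS-style loop for unconstrained minimization:
    an optional global SEARCH step (here plain random mesh sampling, the
    place where CE sampling could be plugged in) followed by a local POLL
    step; the mesh is refined after failed iterations."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    n = x.size
    for _ in range(n_iters):
        # SEARCH step: any finite set of mesh points (CE sampling fits here)
        candidates = [x + delta * rng.integers(-3, 4, n) for _ in range(5)]
        # POLL step: evaluate a positive spanning set of directions around x
        candidates += [x + delta * d for d in np.vstack([np.eye(n), -np.eye(n)])]
        best = min(candidates, key=f)
        if f(best) < fx:
            x, fx = best, f(best)   # success: move and coarsen the mesh
            delta *= 2.0
        else:
            delta *= 0.5            # failure: refine the mesh
    return x, fx

# Usage on a simple smooth function (minimum at (3, 3))
print(mads_like(lambda z: float(np.sum((z - 3.0) ** 2)), [0.0, 0.0]))
```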

An example of the Cross Entropy method is given in the following figures. Suppose that you want to maximize the following function:

Figure: test function

The Cross Entropy method, by iteratively sampling points and computing the mean and standard deviation of the best ones, converges to the global maximum.

Figure: CE method
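As an illustration, here is a minimal sketch of the CE idea on a one-dimensional maximization problem, assuming Gaussian sampling; the function cross_entropy_maximize and the test function are illustrative, not the code used in the thesis.

```python
import numpy as np

def cross_entropy_maximize(f, mu=0.0, sigma=5.0, n_samples=100,
                           n_elite=10, n_iters=50, tol=1e-6):
    """Maximize f by iteratively sampling from a Gaussian and refitting
    its mean and standard deviation to the best (elite) samples."""
    rng = np.random.default_rng(0)
    for _ in range(n_iters):
        x = rng.normal(mu, sigma, n_samples)       # sample candidate points
        elite = x[np.argsort(f(x))[-n_elite:]]     # keep the best points
        mu, sigma = elite.mean(), elite.std()      # refit mean / std deviation
        if sigma < tol:                            # sampling has collapsed
            break
    return mu

# Example: a multimodal test function with its global maximum near x = 2
f = lambda x: np.exp(-(x - 2.0) ** 2) + 0.8 * np.exp(-(x + 2.0) ** 2)
print(cross_entropy_maximize(f))
```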

This work should be submitted soon; some numerical tests remain to be done.

What is robust optimization? There is no common answer to this question. In our case, we define it as follows:

  • The variables of the optimization problem are uncertain.
  • The solution found must be a trade-off between being a good solution (i.e., as small as possible in the context of a minimization problem) and a solution that is not very sensitive to small variations.

Unlike stochastic optimization, where the objective function is noisy, in robust optimization the objective function is completely deterministic and the uncertainties come only from the uncertain data. Thus, given a solution, it is very easy to verify whether this solution is robust or not. For instance, on the following figure, we can “see” that the most robust minimum is attained at x=7.

Figure: test function
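For example, with a hypothetical test function that has a deep but narrow minimum near x = 2 and a flatter minimum near x = 7 (mimicking the figure above), a simple Monte Carlo check of this kind shows that the flat minimum is the more robust one. The function, the perturbation size and the helper name are assumptions made only for illustration.

```python
import numpy as np

# Hypothetical test function: deep narrow minimum near x = 2,
# shallower but flatter (more robust) minimum near x = 7.
def f(x):
    return 1.0 - 1.2 * np.exp(-8.0 * (x - 2.0) ** 2) \
               - 1.0 * np.exp(-0.5 * (x - 7.0) ** 2)

def mean_under_perturbation(f, x, delta=0.5, n=2000, seed=0):
    """Average objective value when the variable is perturbed uniformly
    in [x - delta, x + delta]; a low value indicates a robust solution."""
    rng = np.random.default_rng(seed)
    return f(x + rng.uniform(-delta, delta, n)).mean()

for x0 in (2.0, 7.0):
    print(f"x = {x0}: f(x) = {f(x0):.3f}, "
          f"mean under perturbation = {mean_under_perturbation(f, x0):.3f}")
```

In this sketch, the nominal value at x = 2 is smaller, but the average value under perturbation is better around x = 7, which is the behaviour the figure illustrates.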

The difficulty in robust optimization is rather how to formulate the problem so that we can find the robust solution. There are several ways to answer this question:

  • The simplest approach is to take the mean of the objective function with respect to the uncertain data. This approach is called risk neutral because the deviation of the objective function around its mean is not taken into account.
  • Another approach is to take the mean plus the standard deviation of the objective function. This accounts for the deviation around the solution but may introduce new maxima in the objective function.
  • The formulation that we have retained comes from financial engineering and is called Average Value-at-Risk or Expected Shortfall. It takes the deviation of the objective function into account without creating new extrema. If we apply it to the previous objective function, we obtain the following curves in green and orange (they differ only by the number of sample points); a small numerical sketch of this estimator is given after the figure below.

Figure: test function with the Average Value-at-Risk curves (green and orange)
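As a rough illustration of this formulation, here is a Monte Carlo sketch of the Average Value-at-Risk (Expected Shortfall) of the objective under perturbed variables, reusing the hypothetical test function from the previous sketch. The level alpha, the perturbation size and the helper avar are assumptions for illustration, not the exact formulation used in the thesis.

```python
import numpy as np

# Same hypothetical test function as in the previous sketch.
def f(x):
    return 1.0 - 1.2 * np.exp(-8.0 * (x - 2.0) ** 2) \
               - 1.0 * np.exp(-0.5 * (x - 7.0) ** 2)

def avar(f, x, alpha=0.9, delta=0.5, n=2000, seed=0):
    """Monte Carlo estimate of the Average Value-at-Risk (Expected
    Shortfall) at level alpha: the mean of the worst (largest, for a
    minimization problem) (1 - alpha) fraction of objective values
    sampled under uniform perturbations of the variable."""
    rng = np.random.default_rng(seed)
    values = np.sort(f(x + rng.uniform(-delta, delta, n)))
    return values[int(np.ceil(alpha * n)):].mean()

# Compare the risk-neutral mean with the AVaR at the two candidate minima
rng = np.random.default_rng(1)
for x0 in (2.0, 7.0):
    mean_val = f(x0 + rng.uniform(-0.5, 0.5, 2000)).mean()
    print(f"x = {x0}: mean = {mean_val:.3f}, AVaR = {avar(f, x0):.3f}")
```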

Finally, our current work is to study the “convexification” phenomenon that appears when we draw the curves with the Average Value-at-Risk formulation.
