Optimization: beyond the Fermat Optimality Rules

As part of a project funded by the Programme Gaspard Monge pour l'optimisation, la recherche opérationnelle et leurs interactions avec les sciences des données (PGMO) of the Jacques Hadamard Mathematical Foundation, Sorin-Mihai Grad, a teacher-researcher in ENSTA Paris's Applied Mathematics Unit, has just published an article in the SIAM Journal on Optimization, one of the most prestigious journals in the field of optimization.

In this article, written with his two co-authors Malek Abbasi and Michel Théra, Sorin-Mihai Grad proposes new tools for extending optimization techniques beyond differentiable functions. Indeed, a step usually required to solve an optimization problem is to find the points at which the derivative of the objective function equals zero (the classical Fermat optimality condition): such points have zero slope and are therefore candidates for being a maximum or a minimum, and for convex functions they are guaranteed to be global minima.
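As a textbook illustration (not taken from the article itself): for a differentiable function of one real variable, the Fermat rule states that any local extremum $\bar x$ satisfies

\[
f'(\bar x) = 0, \qquad \text{e.g.}\quad f(x) = x^2 \;\Rightarrow\; f'(x) = 2x = 0 \;\Leftrightarrow\; \bar x = 0,
\]

and since $x \mapsto x^2$ is convex, the stationary point $\bar x = 0$ is in fact its global minimum.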

Of course, things get more complicated for non-differentiable functions. When a function is both non-differentiable and non-convex, theoretical results exist, but there are no general algorithms for finding its minimizers. The literature contains several notions of subdifferential, mathematical objects that mimic the role of the derivative for non-differentiable functions, but these are usually hard to compute iteratively and, consequently, virtually impossible to employ in practical algorithms.
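For background (this is standard convex analysis, not specific to the article): the best-known such object, the convex subdifferential of a function $f\colon \mathbb{R}^n \to \mathbb{R} \cup \{+\infty\}$ at a point $\bar x$, is the set

\[
\partial f(\bar x) = \{\, v \in \mathbb{R}^n : f(x) \ge f(\bar x) + \langle v, x - \bar x \rangle \ \text{for all } x \in \mathbb{R}^n \,\},
\]

and the Fermat rule generalizes to: $\bar x$ minimizes the convex function $f$ if and only if $0 \in \partial f(\bar x)$. For example, $f(x) = |x|$ is not differentiable at $0$, yet $\partial f(0) = [-1, 1]$ contains $0$, which certifies that $0$ is a global minimizer.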

The solution described by Sorin-Mihai Grad and his co-authors consists in introducing a notion of partial subdifferential, which opens the way to developing algorithms for minimizing non-differentiable, non-convex functions of several variables. This is the first work in which partial subdifferentials are obtained by breaking a subdifferential down into one-dimensional components through basis elements, where they play the same role as the partial derivatives in the Fermat optimality rules.
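Schematically, and without reproducing the paper's precise definitions: given a basis $e_1, \dots, e_n$ of the underlying space, a partial subdifferential of $f$ at $\bar x$ along $e_i$ may be pictured as a subdifferential of the one-dimensional restriction $t \mapsto f(\bar x + t e_i)$, so that the resulting optimality rule mirrors the classical condition $\partial f/\partial x_i(\bar x) = 0$ for every $i$:

\[
0 \in \partial_i f(\bar x) \quad \text{for } i = 1, \dots, n,
\]

where $\partial_i f(\bar x)$ is merely illustrative notation for the partial subdifferential along $e_i$.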

Sorin-Mihai Grad, teacher-researcher in the Applied Mathematics Unit of ENSTA Paris

Because functions of a one-dimensional variable are easier to handle, the approach proposed in this work is expected to lead to new practical algorithms for minimizing non-convex, non-differentiable functions, a class of functions that appears often when modeling applications, for instance in Machine Learning and Artificial Intelligence.
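As a toy sketch of why one-dimensional reductions are algorithmically convenient (this is plain coordinate descent, not the algorithm of the paper), one can minimize a nonsmooth convex function by repeatedly solving easy one-dimensional subproblems:

```python
# Toy sketch: coordinate-wise minimization of a nonsmooth convex function.
# Illustrative only -- classical coordinate descent, not the paper's method.
import numpy as np
from scipy.optimize import minimize_scalar

def f(x):
    # Nonsmooth in x[0] (kink at x[0] = 1), smooth in x[1]; minimizer: (1, -2).
    return abs(x[0] - 1.0) + (x[1] + 2.0) ** 2

x = np.zeros(2)                      # starting point
for _ in range(20):                  # a few full sweeps over the coordinates
    for i in range(x.size):
        def along_coordinate(t, i=i):
            y = x.copy()
            y[i] = t                 # vary only the i-th coordinate
            return f(y)
        # Each subproblem is one-dimensional, hence easy to solve
        # numerically even without derivatives.
        x[i] = minimize_scalar(along_coordinate).x

print(x)  # approaches [1., -2.]
```

Plain coordinate descent is known to stall on some non-separable nonsmooth functions, which is precisely where a rigorous calculus of partial subdifferentials becomes valuable for justifying coordinate-wise optimality conditions.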

Sorin-Mihai Grad has also presented these results at several scientific events, such as last year's SMAI-MODE Conference in Lyon and the Workshop on Nonsmooth Optimization and Applications (NOPTA 2024) in Antwerp, where he delivered a semi-plenary talk.