Global Solutions to Nonconvex Problems by Evolution of Hamilton-Jacobi PDEs
Howard Heaton, Samy Wu Fung, Stanley Osher
Many computing tasks may be posed as optimization problems. The objective functions arising in real-world scenarios are often nonconvex and/or nondifferentiable, and state-of-the-art methods for solving such problems typically guarantee convergence only to local minima. This work presents Hamilton-Jacobi-based Moreau adaptive descent (HJ-MAD), a zero-order algorithm with guaranteed convergence to global minima, assuming continuity of the objective function. The core idea is to compute gradients of the Moreau envelope of the objective (which is "piecewise convex") with adaptive smoothing parameters. Gradients of the Moreau envelope (equivalently, proximal operators) are approximated via the Hopf-Lax formula for the viscous Hamilton-Jacobi equation. Our numerical examples illustrate global convergence.
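For orientation, the standard relations the abstract alludes to can be sketched as follows. This is a sketch in generic notation (t denotes the smoothing parameter and δ the viscosity; the symbols may differ from the paper's own):

```latex
% Moreau envelope of f with smoothing parameter t > 0:
\[
  u(x,t) \;=\; \min_{y}\Big\{ f(y) + \tfrac{1}{2t}\,\|x-y\|^2 \Big\},
\]
% which is the Hopf-Lax solution of u_t + (1/2)||grad u||^2 = 0 with
% u(x,0) = f(x), and whose gradient encodes the proximal operator:
\[
  \nabla_x u(x,t) \;=\; \frac{x - \operatorname{prox}_{tf}(x)}{t}.
\]
% Adding viscosity delta > 0 gives u_t + (1/2)||grad u||^2 = (delta/2) Laplacian(u),
% solved in closed form by the Cole-Hopf transform, using only evaluations of f:
\[
  u^{\delta}(x,t) \;=\; -\delta \ln\!\Big( (2\pi\delta t)^{-n/2}
  \int_{\mathbb{R}^n} \exp\Big( -\tfrac{1}{\delta}\Big( f(y)
  + \tfrac{\|x-y\|^2}{2t} \Big) \Big)\, dy \Big).
\]
```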
Keywords: Global optimization · Moreau envelope · Hamilton-Jacobi · Hopf-Lax · Cole-Hopf · Proximals · Zero-order optimization
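As a rough illustration of the zero-order idea (not the paper's implementation), the Cole-Hopf integral above can be estimated by Monte Carlo: sampling y ~ N(x, δtI) absorbs the quadratic term of the integrand, leaving importance weights exp(-f(y)/δ). Below is a minimal sketch under those assumptions; the parameter names (t, delta, n_samples) and the toy objective are illustrative, and HJ-MAD's adaptive updates of the smoothing parameters are omitted.

```python
import numpy as np

def moreau_grad_estimate(f, x, t=1.0, delta=0.5, n_samples=1000, rng=None):
    """Zero-order estimate of the gradient of the viscous Moreau envelope.

    Samples y ~ N(x, delta*t*I) so the Gaussian factor of the Cole-Hopf
    integrand is absorbed into the proposal; the remaining importance
    weights are exp(-f(y)/delta), stabilized by subtracting the minimum.
    """
    rng = np.random.default_rng() if rng is None else rng
    y = x + np.sqrt(delta * t) * rng.standard_normal((n_samples, x.size))
    fy = np.array([f(yi) for yi in y])
    w = np.exp(-(fy - fy.min()) / delta)            # stabilized weights
    y_bar = (w[:, None] * y).sum(axis=0) / w.sum()  # softmax-weighted mean
    return (x - y_bar) / t                          # approx. grad of u^delta(x, t)

def smoothed_descent(f, x0, step=0.5, iters=200, **kw):
    """Plain gradient descent on the smoothed envelope (illustrative only;
    HJ-MAD adapts t and delta, which this sketch does not)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * moreau_grad_estimate(f, x, **kw)
    return x

# Toy 1D nonconvex objective with global minimum at x = 0 and local minima elsewhere.
f = lambda x: float(np.abs(x).sum() + np.sin(3 * np.abs(x).sum()))
print(smoothed_descent(f, x0=np.array([2.0]), t=1.0, delta=0.5))
```

Here larger t and δ smooth over more local minima at the cost of bias near the global minimizer, which is why shrinking them adaptively, as the abstract describes, matters in practice.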