A Derivative-Free Optimization Algorithm Combining Line-Search and Trust-Region Techniques

Pengcheng Xie , Ya-xiang Yuan

Chinese Annals of Mathematics, Series B ›› 2023, Vol. 44 ›› Issue (5) : 719 -734.


A Derivative-Free Optimization Algorithm Combining Line-Search and Trust-Region Techniques


Abstract

The speeding-up and slowing-down (SUSD) direction is a novel search direction that provably converges to the gradient descent direction under certain conditions. The authors propose SUSD-TR, a derivative-free optimization algorithm that combines the SUSD direction, computed from the covariance matrix of the interpolation points, with the solution of the trust-region subproblem of the interpolation model function at the current iterate. They analyze the optimization dynamics and convergence of SUSD-TR, and give details of the trial step and the structure step. Numerical results demonstrate the algorithm's efficiency, and a comparison shows that SUSD-TR greatly improves performance over the method that moves only along the SUSD direction. The algorithm is competitive with state-of-the-art derivative-free optimization algorithms.
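The abstract's covariance-based direction can be illustrated with a heavily hedged sketch. Following the SUSD idea of Al-Abri et al. (2020), this assumes the direction is taken as the principal axis of the sample covariance matrix associated with its smallest eigenvalue, oriented downhill via a crude value-weighted proxy; the function name `susd_direction` and the orientation rule are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def susd_direction(points, fvals):
    """Sketch of a SUSD-type direction from interpolation data.

    points: (m, n) array of sample points; fvals: (m,) function values.
    Returns a unit vector along the smallest-eigenvalue axis of the
    sample covariance, oriented toward decreasing function values.
    """
    centered = points - points.mean(axis=0)
    cov = centered.T @ centered / len(points)      # sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)         # eigenvalues in ascending order
    d = eigvecs[:, 0]                              # axis of least spread
    # Orient d downhill using a value-weighted offset as a gradient proxy
    # (an assumption made here purely for the demo).
    g_proxy = centered.T @ (fvals - fvals.mean())
    if d @ g_proxy > 0:
        d = -d
    return d / np.linalg.norm(d)

# Tiny demo on f(x) = ||x||^2: points clustered near (2, 0), spread mostly
# along the y-axis, so the least-spread axis is roughly the x-axis and the
# oriented direction points back toward the origin.
rng = np.random.default_rng(0)
pts = rng.normal(loc=[2.0, 0.0], scale=[0.1, 1.0], size=(20, 2))
f = np.sum(pts**2, axis=1)
d = susd_direction(pts, f)
```

In this toy setup `d` comes out close to (-1, 0), i.e., roughly the negative gradient direction of f at the cluster center, consistent with the convergence property the abstract attributes to the SUSD direction.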

Keywords

Nonlinear optimization / Derivative-free / Quadratic model / Line-search / Trust-region

Cite this article

Pengcheng Xie, Ya-xiang Yuan. A Derivative-Free Optimization Algorithm Combining Line-Search and Trust-Region Techniques. Chinese Annals of Mathematics, Series B, 2023, 44(5): 719-734. DOI: 10.1007/s11401-023-0040-y


