

Ma Yumin, Cai Xingju. AN ADAPTIVE INDEFINITE LINEARIZED AUGMENTED LAGRANGIAN METHOD FOR CONVEX OPTIMIZATION WITH LINEAR CONSTRAINTS[J]. Mathematica Numerica Sinica, 2022, 44(2): 272-288.

AN ADAPTIVE INDEFINITE LINEARIZED AUGMENTED LAGRANGIAN METHOD FOR CONVEX OPTIMIZATION WITH LINEAR CONSTRAINTS

Ma Yumin, Cai Xingju   

  1. School of Mathematical Sciences, Nanjing Normal University, Nanjing 210023, China
  • Received: 2021-08-30  Online: 2022-05-14  Published: 2022-05-06
  • Funding: Supported by the National Natural Science Foundation of China (11871279).
The augmented Lagrangian method is an effective algorithm for solving convex optimization problems with linear constraints. Since the primal subproblem generally has to be solved iteratively, a common technique is to linearize the quadratic penalty term of the augmented Lagrangian function and add a proximal regularization term (equivalently, to add a proximal term to the objective function of the subproblem), so that the subproblem becomes easy to solve; this is the linearized augmented Lagrangian method. The proper choice of the regularization parameter is crucial to both the convergence and the convergence rate of the algorithm: a larger parameter guarantees convergence but may force small stepsizes, whereas a smaller parameter permits larger stepsizes but may lead to divergence. In this paper, we consider convex optimization problems with linear equality or inequality constraints and design a class of adaptive indefinite linearized augmented Lagrangian methods. That is, we use information from the current iterate to adaptively select an appropriate regularization parameter, making the admissible range of the primal stepsize as large as possible while still guaranteeing convergence, thereby improving the convergence rate of the algorithm. We prove the global convergence of the algorithm and illustrate its effectiveness with numerical experiments.
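
For orientation, the iteration described in the abstract can be sketched as follows. This is a minimal illustration for the equality-constrained case min f(x) s.t. Ax = b (the paper also treats inequality constraints), with multiplier λ, penalty parameter β > 0, and proximal coefficient r_k standing in for the adaptively chosen regularization parameter; the concrete adaptive rule for r_k is the paper's contribution and is not reproduced here:

    % One iteration of a linearized augmented Lagrangian method:
    % the quadratic penalty (beta/2)||Ax - b||^2 of
    % L_beta(x, lambda) = f(x) - lambda^T (Ax - b) + (beta/2)||Ax - b||^2
    % is linearized at x^k and a proximal term (r_k/2)||x - x^k||^2 is added.
    x^{k+1} = \arg\min_x \Big\{ f(x)
        + \big\langle \beta (A x^k - b) - \lambda^k,\; A x \big\rangle
        + \tfrac{r_k}{2}\,\| x - x^k \|^2 \Big\},
    \lambda^{k+1} = \lambda^k - \beta\,(A x^{k+1} - b).

Comparing the linearized subproblem with the exact one shows that the effective proximal matrix is r_k I - \beta A^{\mathsf T} A: the classical safe choice r_k > \beta \|A^{\mathsf T} A\| keeps this matrix positive definite, while "indefinite" variants retain convergence with smaller r_k, i.e. a larger primal stepsize, which is what the adaptive selection aims to exploit.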
