Academic Journal

A new accelerated conjugate gradient method for large-scale unconstrained optimization

Bibliographic Details
Title: A new accelerated conjugate gradient method for large-scale unconstrained optimization
Authors: Yuting Chen, Mingyuan Cao, Yueting Yang
Source: Journal of Inequalities and Applications, Vol 2019, Iss 1, Pp 1-13 (2019)
Publication Information: SpringerOpen, 2019.
Publication Year: 2019
Collection: LCC:Mathematics
Subject Terms: Conjugate gradient, Descent condition, Dai–Liao conjugacy condition, Global convergence, Large-scale unconstrained optimization, Mathematics, QA1-939
Description: Abstract In this paper, we present a new conjugate gradient method using an acceleration scheme for solving large-scale unconstrained optimization. The generated search direction satisfies both the sufficient descent condition and the Dai–Liao conjugacy condition independently of the line search. Moreover, the parameter value incorporates more useful information without additional computational cost or storage requirements, which can improve numerical performance. Under proper assumptions, the global convergence of the proposed method with a Wolfe line search is established. Numerical experiments show that the proposed method is competitive on unconstrained optimization problems with dimensions up to 100,000. (A generic illustrative sketch follows this record.)
Document Type: article
File Description: electronic resource
Language: English
ISSN: 1029-242X
Relation: http://link.springer.com/article/10.1186/s13660-019-2238-9; https://doaj.org/toc/1029-242X
DOI: 10.1186/s13660-019-2238-9
Access URL: https://doaj.org/article/71e4564757024270b57bfcebe2ff42e8
Accession Number: edsdoj.71e4564757024270b57bfcebe2ff42e8
Database: Directory of Open Access Journals
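
The abstract describes a conjugate gradient method whose search direction satisfies the Dai–Liao conjugacy condition and that is globally convergent under a Wolfe line search. The following is a minimal, generic Python sketch of a Dai–Liao-type conjugate gradient loop with a Wolfe line search; it is not the authors' accelerated algorithm, and the parameter t, the SciPy line-search routine, the fallback step size, and the quadratic test problem are illustrative assumptions only.

# Generic Dai–Liao-type nonlinear conjugate gradient sketch (not the paper's method).
import numpy as np
from scipy.optimize import line_search  # enforces (strong) Wolfe conditions

def dai_liao_cg(f, grad, x0, t=1.0, tol=1e-6, max_iter=1000):
    x = x0.copy()
    g = grad(x)
    d = -g                                  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Wolfe line search along d
        alpha = line_search(f, grad, x, d)[0]
        if alpha is None:                   # fall back to a small fixed step (assumed safeguard)
            alpha = 1e-4
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        # Dai–Liao parameter: beta = g_{k+1}^T (y_k - t*s_k) / (d_k^T y_k)
        denom = d @ y
        beta = (g_new @ (y - t * s)) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Illustrative usage on an assumed convex quadratic test problem
if __name__ == "__main__":
    A = np.diag(np.arange(1.0, 101.0))
    f = lambda x: 0.5 * x @ A @ x
    grad = lambda x: A @ x
    x_star = dai_liao_cg(f, grad, np.ones(100))
    print(np.linalg.norm(grad(x_star)))    # gradient norm at the computed solution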