Academic Journal

Implicit Regularization and Momentum Algorithms in Nonlinearly Parameterized Adaptive Control and Prediction.

Bibliographic Details
Title: Implicit Regularization and Momentum Algorithms in Nonlinearly Parameterized Adaptive Control and Prediction.
Authors: Boffi NM; John A. Paulson School of Engineering and Applied Sciences, Harvard University, Cambridge, MA 02138, U.S.A. boffi@g.harvard.edu., Slotine JE; Nonlinear Systems Laboratory, MIT, Cambridge, MA 02139, U.S.A. jjs@mit.edu.
Source: Neural computation [Neural Comput] 2021 Mar; Vol. 33 (3), pp. 590-673. Date of Electronic Publication: 2021 Jan 29.
Publication Type: Journal Article
Language: English
Journal Information: Publisher: MIT Press Country of Publication: United States NLM ID: 9426182 Publication Model: Print-Electronic Cited Medium: Internet ISSN: 1530-888X (Electronic) Linking ISSN: 08997667 NLM ISO Abbreviation: Neural Comput Subsets: PubMed not MEDLINE; MEDLINE
Imprint: Original Publication: Cambridge, Mass. : MIT Press, c1989-
Abstract: Stable concurrent learning and control of dynamical systems is the subject of adaptive control. Despite being an established field with many practical applications and a rich theory, much of the development in adaptive control for nonlinear systems revolves around a few key algorithms. By exploiting strong connections between classical adaptive nonlinear control techniques and recent progress in optimization and machine learning, we show that there exists considerable untapped potential in algorithm development for both adaptive nonlinear control and adaptive dynamics prediction. We begin by introducing first-order adaptation laws inspired by natural gradient descent and mirror descent. We prove that when there are multiple dynamics consistent with the data, these non-Euclidean adaptation laws implicitly regularize the learned model. Local geometry imposed during learning may thus be used to select parameter vectors, out of the many that will achieve perfect tracking or prediction, for desired properties such as sparsity. We apply this result to regularized dynamics predictor and observer design, and as concrete examples, we consider Hamiltonian systems, Lagrangian systems, and recurrent neural networks. We subsequently develop a variational formalism based on the Bregman Lagrangian. We show that its Euler-Lagrange equations lead to natural gradient and mirror descent-like adaptation laws with momentum, and we recover their first-order analogues in the infinite friction limit. We illustrate our analyses with simulations demonstrating our theoretical results.
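As an illustrative sketch only (not an equation taken from the paper), the continuous-time mirror descent flow underlying such non-Euclidean adaptation laws evolves the parameter estimate through the gradient of a strictly convex potential ψ rather than directly:

```latex
% Illustrative continuous-time mirror descent flow (assumed notation):
% \psi      : strictly convex potential defining the geometry
% \mathcal{L}: instantaneous loss on the parameter estimate \hat\theta
% \gamma > 0 : adaptation gain
\frac{\mathrm{d}}{\mathrm{d}t}\,\nabla\psi\bigl(\hat\theta(t)\bigr)
    = -\,\gamma\,\nabla_{\hat\theta}\,\mathcal{L}\bigl(\hat\theta(t)\bigr)
```

Choosing ψ(θ) = ½‖θ‖² recovers ordinary gradient flow, while other potentials (for example, a negative-entropy-type ψ) impose a non-Euclidean local geometry; when many parameter vectors fit the data, this geometry biases which interpolating solution is reached, which is the implicit-regularization effect the abstract describes.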
Record Dates: Date Created: 20210129 Date Completed: 20210823 Latest Revision: 20210823
Update Code: 20240628
DOI: 10.1162/neco_a_01360
PMID: 33513321
Database: MEDLINE