Conditions for Convergence of Dynamic Regressor Extension and Mixing Parameter Estimators Using LTI Filters

Publisher:
IEEE (Institute of Electrical and Electronics Engineers)
Publication Type:
Journal Article
Citation:
IEEE Transactions on Automatic Control, 2023, 68, (2), pp. 1253-1258
Issue Date:
2023-02-01
Abstract:
In this article, we study the conditions for convergence of the recently introduced dynamic regressor extension and mixing (DREM) parameter estimator when the extended regressor is generated using linear time-invariant (LTI) filters. In particular, we are interested in relating these conditions to the ones required for convergence of the classical gradient (or least-squares) estimator, namely the well-known persistent excitation (PE) requirement on the original regressor vector φ(t) ∈ R^q, with q ∈ N the number of unknown parameters. Moreover, we study the case when only interval excitation (IE) is available, under which DREM, concurrent, and composite learning schemes ensure global convergence, with DREM converging in finite time. Regarding PE, we prove, under some mild technical assumptions, that if φ is PE, then the scalar regressor of DREM, Δ_N ∈ R, is also PE, ensuring exponential convergence. Concerning IE, we prove that if φ is IE, then Δ_N is also IE. All these results are established in the almost-sure sense, namely by proving that the set of filter parameters for which the claims do not hold has zero measure. The main technical tool used in our proofs is inspired by a study of Luenberger observers for nonautonomous nonlinear systems recently reported in the literature.
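To make the scheme described in the abstract concrete, the following is a minimal numerical sketch of DREM with LTI filters, not the paper's own implementation: all specifics (q = 2, the regressor signals, the two first-order filter poles, the adaptation gain, and forward-Euler discretization) are illustrative assumptions. The extended regressor is built by passing φ and y through two stable first-order filters, mixing multiplies by the adjugate of the extended regressor matrix to obtain q decoupled scalar regressions with the common scalar regressor Δ = det(Φ), and each parameter is then estimated by its own scalar gradient law.

```python
import numpy as np

# Illustrative setup (values are assumptions, not from the paper).
dt, T = 1e-3, 60.0
t = np.arange(0.0, T, dt)
n = len(t)

theta = np.array([2.0, -1.0])                   # unknown parameters, q = 2
phi = np.stack([np.sin(t), np.cos(0.5 * t)])    # a PE regressor, shape (2, n)
y = theta @ phi                                  # measured output y = phi^T theta

lams = [1.0, 3.0]                                # poles of the two LTI filters

def lpf(u, lam):
    """First-order LTI filter x' = -lam*x + u (forward Euler, zero IC)."""
    x = np.zeros_like(u)
    for k in range(1, len(u)):
        x[k] = x[k-1] + dt * (-lam * x[k-1] + u[k-1])
    return x

# Regressor extension: Phi[i, j] = filter_i applied to phi_j, Y[i] = filter_i(y).
# By linearity and zero initial conditions, Y[i] = Phi[i, :]^T theta exactly.
Phi = np.array([[lpf(phi[j], lam) for j in range(2)] for lam in lams])
Y = np.array([lpf(y, lam) for lam in lams])

# Mixing: multiply by the adjugate of the 2x2 matrix Phi(t), giving
# q decoupled scalar regressions  curlyY_i(t) = Delta(t) * theta_i,
# with the common scalar regressor Delta = det(Phi).
Delta = Phi[0, 0] * Phi[1, 1] - Phi[0, 1] * Phi[1, 0]
curlyY = np.stack([ Phi[1, 1] * Y[0] - Phi[0, 1] * Y[1],
                   -Phi[1, 0] * Y[0] + Phi[0, 0] * Y[1]])

# One scalar gradient estimator per parameter.
gamma = 100.0
theta_hat = np.zeros((2, n))
for k in range(1, n):
    e = curlyY[:, k-1] - Delta[k-1] * theta_hat[:, k-1]
    theta_hat[:, k] = theta_hat[:, k-1] + dt * gamma * Delta[k-1] * e

print(theta_hat[:, -1])   # approaches theta, since Delta is PE here
```

Because each scalar error obeys de_i/dt = -γΔ²e_i, convergence of every parameter hinges only on the excitation of the single scalar signal Δ, which is exactly why the paper's question of whether PE (or IE) of φ carries over to Δ_N matters.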