dc.creator: Gevers, Michel
dc.creator: Bazanella, Alexandre Sanfelice
dc.creator: Bombois, Xavier
dc.creator: Miskovic, Ljubisa
dc.date: 2011-01-29T06:00:44Z
dc.date: 2009
dc.identifier: 0018-9286
dc.identifier: http://hdl.handle.net/10183/27625
dc.identifier: 000736439
dc.description: In prediction error identification, the information matrix plays a central role. Specifically, when the system is in the model set, the covariance matrix of the parameter estimates converges asymptotically, up to a scaling factor, to the inverse of the information matrix. The existence of a finite covariance matrix thus depends on the positive definiteness of the information matrix, and the rate of convergence of the parameter estimate depends on its “size”. The information matrix is also the key tool in the solution of optimal experiment design procedures, which have become a focus of recent attention. Introducing a geometric framework, we provide a complete analysis, for arbitrary model structures, of the minimum degree of richness required to guarantee the nonsingularity of the information matrix. We then particularize these results to all commonly used model structures, both in open loop and in closed loop. In a closed-loop setup, our results provide an unexpected and precisely quantifiable trade-off between controller degree and required degree of external excitation.
dc.format: application/pdf
dc.language: eng
dc.relation: IEEE Transactions on Automatic Control. New York. Vol. 54, no. 12 (Dec. 2009), p. 2828-2840
dc.rights: Open Access
dc.subject: Identifiability
dc.subject: Information matrix
dc.subject: Input richness
dc.subject: Transfer of excitation
dc.subject: Automatic control
dc.title: Identification and the information matrix: how to get just sufficiently rich?
dc.type: Journal article
dc.type: Foreign publication