Paper Title

Data-Driven Feedback Linearization using the Koopman Generator

Authors

Darshan Gadginmath, Vishaal Krishnan, Fabio Pasqualetti

Abstract

This paper contributes a theoretical framework for data-driven feedback linearization of nonlinear control-affine systems. We unify the traditional geometric perspective on feedback linearization with an operator-theoretic perspective involving the Koopman operator. We first show that if the distribution spanned by the control vector field and its repeated Lie brackets with the drift vector field is involutive, then there exists an output and a feedback control law for which the Koopman generator is finite-dimensional and locally nilpotent. We use this connection to propose a data-driven algorithm, Koopman Generator-based Feedback Linearization (KGFL). Specifically, we use experimental data to identify, from a dictionary of functions, the state transformation and control feedback for which feedback linearization is achieved in a least-squares sense. We also propose a single-step data-driven formula that can be used to compute the linearizing transformations. When the system is feedback linearizable and the chosen dictionary is complete, our data-driven algorithm provides the same solution as model-based feedback linearization. Finally, we provide numerical examples for the data-driven algorithm and compare it with model-based feedback linearization. We also numerically study the effect of the richness of the dictionary and the size of the data set on the effectiveness of feedback linearization.
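The abstract describes identifying transformations from a dictionary of functions via least squares. As a loose illustration of that underlying idea only (not the paper's KGFL algorithm, which is not reproduced here), the following sketch estimates a Koopman generator matrix from trajectory data of a toy scalar drift system ẋ = −x, using an assumed monomial dictionary; the system, dictionary, and variable names are all illustrative assumptions.

```python
import numpy as np

def dictionary(x):
    # assumed monomial dictionary [1, x, x^2, x^3] for a scalar state
    return np.array([np.ones_like(x), x, x**2, x**3])

# Toy drift: xdot = -x. Along trajectories, d/dt x^k = -k x^k, so the
# generator acts diagonally on this dictionary: L = diag(0, -1, -2, -3).
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, 200)          # sampled states
Xdot = -X                                # sampled state derivatives

Psi = dictionary(X)                      # (4, N) dictionary evaluations
# chain rule: dPsi/dt = (dPsi/dx) * xdot, evaluated at the samples
dPsi = np.array([np.zeros_like(X), Xdot, 2 * X * Xdot, 3 * X**2 * Xdot])

# least-squares generator estimate: find L with dPsi ≈ L @ Psi
L = dPsi @ np.linalg.pinv(Psi)
```

For this toy system the fit is exact, so `L` recovers the diagonal generator; the paper's setting additionally involves control inputs and seeks the output and feedback that make the resulting generator nilpotent.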
