Paper Title
Learning Invariant Subspaces of Koopman Operators--Part 1: A Methodology for Demonstrating a Dictionary's Approximate Subspace Invariance
Paper Authors
Paper Abstract
Koopman operators model nonlinear dynamics as a linear dynamical system acting on a nonlinear function as the state. This nonstandard state is often called a Koopman observable and is usually approximated numerically by a superposition of functions drawn from a dictionary. In a widely used algorithm, Extended Dynamic Mode Decomposition (EDMD), the dictionary functions are drawn from a fixed class of functions. Recently, deep learning combined with EDMD has been used to learn novel dictionary functions in an algorithm called deep dynamic mode decomposition (deepDMD). The learned representation both (1) accurately models the original nonlinear system and (2) scales well with its dimension. In this paper we analyze the learned dictionaries from deepDMD and explore the theoretical basis for their strong performance. We explore State-Inclusive Logistic Lifting (SILL) dictionary functions to approximate Koopman observables. Error analysis of these dictionary functions shows they satisfy a property of subspace approximation, which we define as uniform finite approximate closure. Our results provide a hypothesis to explain the success of deep neural networks in learning numerical approximations to Koopman operators. Part 2 of this paper will extend this explanation by demonstrating the subspace invariance of heterogeneous dictionaries and presenting a head-to-head numerical comparison of deepDMD and low-parameter heterogeneous dictionary learning.
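To make the abstract's description concrete, the following is a minimal sketch (not the paper's code) of the EDMD least-squares step it refers to, paired with a toy state-inclusive logistic dictionary loosely modeled on the SILL construction. The function names (`sill_dictionary`, `edmd`), the example dynamics, and the logistic centers and steepness are illustrative assumptions, not details taken from the paper.

```python
# Minimal, assumption-laden sketch of EDMD with a toy SILL-style dictionary.
# The dynamics, centers, and steepness below are illustrative only.
import numpy as np

def logistic(x, center, steepness=5.0):
    # Elementwise logistic function centered at `center`.
    return 1.0 / (1.0 + np.exp(-steepness * (x - center)))

def sill_dictionary(X, centers):
    """Lift states X (n_samples x n_states) into observables: a constant,
    the states themselves (state-inclusive), and products of per-state
    logistic functions (conjunctive logistic terms)."""
    n = X.shape[0]
    lifted = [np.ones((n, 1)), X]
    for c in centers:  # one center vector per conjunctive term
        lifted.append(np.prod(logistic(X, c), axis=1, keepdims=True))
    return np.hstack(lifted)

def edmd(X, Y, dictionary):
    """Least-squares Koopman approximation K so that
    dictionary(Y) ~= dictionary(X) @ K for snapshot pairs (X, Y)."""
    Psi_X, Psi_Y = dictionary(X), dictionary(Y)
    K, *_ = np.linalg.lstsq(Psi_X, Psi_Y, rcond=None)
    return K

# Toy example: snapshot pairs from a simple 2-state nonlinear map.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(500, 2))
Y = np.column_stack([0.9 * X[:, 0], 0.5 * X[:, 1] + 0.2 * X[:, 0] ** 2])

centers = [np.array([0.0, 0.0]), np.array([0.5, -0.5]), np.array([-0.5, 0.5])]
K = edmd(X, Y, lambda Z: sill_dictionary(Z, centers))
residual = np.linalg.norm(sill_dictionary(X, centers) @ K - sill_dictionary(Y, centers))
print(K.shape, residual)
```

The residual printed at the end is one crude proxy for the "approximate subspace invariance" the abstract discusses: if the lifted dictionary were exactly invariant under the dynamics, the lifted one-step-ahead snapshots would lie in the span reproduced by the learned linear map K.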