Title

BelNet: Basis enhanced learning, a mesh-free neural operator

Authors

Zecheng Zhang, Wing Tat Leung, Hayden Schaeffer

Abstract

Operator learning trains a neural network to map functions to functions. An ideal operator learning framework should be mesh-free in the sense that the training does not require a particular choice of discretization for the input functions, allows the input and output functions to be on different domains, and permits different grids between samples. We propose a mesh-free neural operator for solving parametric partial differential equations. The basis enhanced learning network (BelNet) projects the input function into a latent space and reconstructs the output functions. In particular, we construct part of the network to learn the "basis" functions during training. This generalizes the networks proposed in Chen and Chen's universal approximation theory for nonlinear operators to account for differences in input and output meshes. Through several challenging high-contrast and multiscale problems, we show that our approach outperforms other operator learning methods on these tasks and allows for more freedom in the sampling and/or discretization process.
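The abstract refers to Chen and Chen's universal approximation theory, in which an operator G acting on a function u is approximated as G(u)(y) ≈ Σ_k c_k(u(x_1), …, u(x_n)) τ_k(y): a "branch" network maps samples of u at fixed sensor points to coefficients, and a "trunk" network supplies basis functions of the output location y. BelNet's contribution, per the abstract, is to learn the basis/projection so that the sensor grid need not be fixed. The following is a minimal sketch of the fixed-sensor Chen–Chen decomposition only, with random untrained weights and hypothetical names (`dense`, `trunk`, `G_u` are illustrative, not from the paper):

```python
import math
import random

random.seed(0)

def dense(x, W, b):
    # Affine layer y = W x + b using plain Python lists.
    return [sum(wij * xj for wij, xj in zip(row, x)) + bi
            for row, bi in zip(W, b)]

def tanh_vec(x):
    return [math.tanh(v) for v in x]

def rand_mat(m, n):
    return [[random.uniform(-1.0, 1.0) for _ in range(n)] for _ in range(m)]

# Sample the input function u at n fixed sensor points.
# (Chen & Chen assume a fixed discretization; BelNet's learned
# projection is what removes this restriction.)
n_sensors, latent = 8, 4
xs = [i / (n_sensors - 1) for i in range(n_sensors)]
u_vals = [math.sin(math.pi * x) for x in xs]  # an example input function

# Branch: coefficients c_k(u) from the sampled sensor values.
Wb, bb = rand_mat(latent, n_sensors), [0.0] * latent
coeffs = tanh_vec(dense(u_vals, Wb, bb))

# Trunk: basis functions tau_k(y) evaluated at an output point y.
Wt, bt = rand_mat(latent, 1), [0.0] * latent

def trunk(y):
    return tanh_vec(dense([y], Wt, bt))

def G_u(y):
    # G(u)(y) ~ sum_k c_k(u) * tau_k(y)
    return sum(c * t for c, t in zip(coeffs, trunk(y)))

print(G_u(0.5))
```

In this decomposition the trunk plays the role of the "basis" functions; the abstract's claim is that BelNet additionally learns the projection of the input samples, so both the input and output discretizations may vary between samples.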
