Paper Title
A theoretical framework for self-supervised MR image reconstruction using sub-sampling via variable density Noisier2Noise
Paper Authors
Paper Abstract
In recent years, there has been growing interest in leveraging the statistical modeling capabilities of neural networks for reconstructing sub-sampled Magnetic Resonance Imaging (MRI) data. Most proposed methods assume the existence of a representative fully sampled dataset and use fully supervised training. However, for many applications, fully sampled training data is not available and may be highly impractical to acquire. The development and understanding of self-supervised methods, which use only sub-sampled data for training, are therefore highly desirable. This work extends the Noisier2Noise framework, which was originally constructed for self-supervised denoising tasks, to variable-density sub-sampled MRI data. We use the Noisier2Noise framework to analytically explain the performance of Self-Supervised Learning via Data Undersampling (SSDU), a recently proposed method that performs well in practice but until now lacked theoretical justification. Further, we propose two modifications of SSDU that arise as a consequence of the theoretical developments. Firstly, we propose partitioning the sampling set so that the subsets have the same type of distribution as the original sampling mask. Secondly, we propose a loss weighting that compensates for the sampling and partitioning densities. On the fastMRI dataset, we show that these changes significantly improve SSDU's image restoration quality and robustness to the partitioning parameters.
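The following is a minimal Python sketch of how the two proposed modifications might look in an SSDU-style training pipeline: the acquired sampling set is partitioned with a second variable-density mask of the same distribution type as the original, and the held-out k-space locations are compared under a per-location weight intended to compensate for the sampling and partitioning densities. The function names and the particular weighting used are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def variable_density_mask(prob_map, rng):
    """Draw a sampling mask where each k-space location is included
    independently with the probability given by prob_map (variable density)."""
    return rng.random(prob_map.shape) < prob_map

def partition_sampling_set(omega_mask, subset_prob_map, rng):
    """Split the acquired set Omega into a network-input subset Lambda and a
    held-out loss subset, using a second variable-density mask so that the
    subsets keep the same type of distribution as the original sampling mask."""
    second_mask = variable_density_mask(subset_prob_map, rng)
    lambda_mask = omega_mask & second_mask   # passed to the network as input
    loss_mask = omega_mask & ~second_mask    # reserved for the training loss
    return lambda_mask, loss_mask

def weighted_kspace_loss(pred_kspace, meas_kspace, loss_mask, weight):
    """Illustrative density-compensated k-space loss: held-out measured
    locations are compared under a per-location weight (a placeholder for
    the paper's compensation for sampling and partitioning densities)."""
    diff = (pred_kspace - meas_kspace) * loss_mask
    return np.sum(weight * np.abs(diff) ** 2) / max(np.sum(loss_mask), 1)

# Example usage with a synthetic probability map and random k-space data.
rng = np.random.default_rng(0)
prob_map = np.clip(rng.random((64, 64)), 0.05, 0.95)
omega = variable_density_mask(prob_map, rng)
lam, held_out = partition_sampling_set(omega, 0.6 * prob_map, rng)

pred = rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64))
meas = rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64))
# The 1 / (1 - p) weight below is an illustrative choice of density
# compensation, not the exact factor derived in the paper.
loss = weighted_kspace_loss(pred, meas, held_out, weight=1.0 / (1.0 - prob_map))
```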