Paper Title


Learning by Erasing: Conditional Entropy based Transferable Out-Of-Distribution Detection

Paper Authors

Meng Xing, Zhiyong Feng, Yong Su, Changjae Oh

Paper Abstract


Out-of-distribution (OOD) detection is essential to handle distribution shifts between training and test scenarios. For a new in-distribution (ID) dataset, existing methods require retraining to capture the dataset-specific feature representation or data distribution. In this paper, we propose a deep generative model (DGM)-based transferable OOD detection method that does not require retraining on a new ID dataset. We design an image erasing strategy to equip each ID dataset with an exclusive conditional entropy distribution, which determines the discrepancy of the DGM's posterior distribution across different ID datasets. Owing to the powerful representation capacity of convolutional neural networks, the proposed model trained on a complex dataset can capture the above discrepancy between ID datasets without retraining, thus achieving transferable OOD detection. We validate the proposed method on five datasets and verify that it achieves performance comparable to state-of-the-art group-based OOD detection methods, which need to be retrained before deployment on new ID datasets. Our code is available at https://github.com/oOHCIOo/CETOOD.
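As a rough illustration of the idea only (not the authors' implementation), the sketch below scores a sample by an estimate of the model's conditional entropy over an erased image region: higher entropy suggests the sample is OOD. The density model `toy_log_prob`, the fixed square erasing mask, and all parameters are hypothetical stand-ins for the paper's DGM and erasing strategy.

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_log_prob(pixels):
    """Stand-in for a DGM's log-density: an isotropic Gaussian around 0.5."""
    return float(np.sum(-0.5 * (pixels - 0.5) ** 2 - 0.5 * np.log(2 * np.pi)))

def conditional_entropy_score(image, size=4, n_samples=64):
    """Monte-Carlo estimate of -E[log p(erased region)] for a fixed erased patch.

    A higher score (higher conditional entropy under the model) is read as
    more likely OOD; a lower score as more likely ID.
    """
    region = image[:size, :size]  # the patch the erasing strategy would remove
    # Perturb the erased patch to approximate an expectation under the model.
    samples = region + 0.05 * rng.standard_normal((n_samples, size, size))
    return -np.mean([toy_log_prob(s) for s in samples])

# Pixels near the toy model's mode score low; pixels far from it score high.
id_image = np.full((8, 8), 0.5)
ood_image = np.full((8, 8), 5.0)
print(conditional_entropy_score(id_image) < conditional_entropy_score(ood_image))
```

In the actual method the erased region's uncertainty is evaluated by a trained DGM rather than a fixed Gaussian, so this snippet only shows the shape of an entropy-based OOD score.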
