Paper Title

Bitwidth-Adaptive Quantization-Aware Neural Network Training: A Meta-Learning Approach

Paper Authors

Jiseok Youn, Jaehun Song, Hyung-Sin Kim, Saewoong Bahk

Paper Abstract

Deep neural network quantization with adaptive bitwidths has gained increasing attention due to the ease of model deployment on various platforms with different resource budgets. In this paper, we propose a meta-learning approach to achieve this goal. Specifically, we propose MEBQAT, a simple yet effective way of bitwidth-adaptive quantization-aware training (QAT) in which meta-learning is effectively combined with QAT by redefining meta-learning tasks to incorporate bitwidths. Once deployed on a platform, MEBQAT allows the (meta-)trained model to be quantized to any candidate bitwidth and then to conduct inference without much accuracy drop from quantization. Moreover, in a few-shot learning scenario, MEBQAT can also adapt a model to any bitwidth as well as to unseen target classes by adding conventional optimization-based or metric-based meta-learning. We design variants of MEBQAT to support both (1) a bitwidth-adaptive quantization scenario and (2) a new few-shot learning scenario in which quantization bitwidths and target classes are jointly adapted. We experimentally demonstrate their validity in multiple QAT schemes. By comparing their performance to (bitwidth-dedicated) QAT, existing bitwidth-adaptive QAT, and vanilla meta-learning, we find that merging bitwidths into meta-learning tasks achieves a higher level of robustness.
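
The abstract's central mechanism, redefining meta-learning tasks so that each task carries its own quantization bitwidth, can be illustrated with a short sketch. The PyTorch snippet below is a hypothetical, first-order simplification (closer to joint training over sampled bitwidths than to the paper's full meta-learning formulation); `fake_quantize`, `TinyMLP`, `CANDIDATE_BITWIDTHS`, and `meta_train_step` are illustrative names of our own, not the authors' code.

```python
import random
import torch
import torch.nn.functional as F

CANDIDATE_BITWIDTHS = [2, 4, 8, 32]  # 32 stands in for full precision (assumption)

def fake_quantize(w: torch.Tensor, bits: int) -> torch.Tensor:
    """Uniform symmetric fake quantization with a straight-through estimator."""
    if bits >= 32:
        return w
    qmax = 2 ** (bits - 1) - 1
    scale = w.detach().abs().max().clamp(min=1e-8) / qmax
    q = torch.clamp(torch.round(w / scale), -qmax - 1, qmax) * scale
    return w + (q - w).detach()  # gradients bypass the rounding step

class TinyMLP(torch.nn.Module):
    """A toy model whose weights are fake-quantized on every forward pass."""
    def __init__(self, d_in=784, d_hidden=128, n_classes=10):
        super().__init__()
        self.w1 = torch.nn.Parameter(torch.randn(d_hidden, d_in) * 0.01)
        self.w2 = torch.nn.Parameter(torch.randn(n_classes, d_hidden) * 0.01)

    def forward(self, x, bits):
        h = F.relu(F.linear(x, fake_quantize(self.w1, bits)))
        return F.linear(h, fake_quantize(self.w2, bits))

model = TinyMLP()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

def meta_train_step(task_batches):
    """One meta-update: each task pairs a data batch with a sampled bitwidth,
    so the shared weights are pushed to perform well at every precision."""
    optimizer.zero_grad()
    for x, y in task_batches:
        bits = random.choice(CANDIDATE_BITWIDTHS)  # task = (data, bitwidth)
        loss = F.cross_entropy(model(x, bits), y)
        loss.backward()  # gradients from all bitwidth-tasks accumulate
    optimizer.step()

# Example usage with random data standing in for a real dataset:
batches = [(torch.randn(32, 784), torch.randint(0, 10, (32,))) for _ in range(4)]
meta_train_step(batches)
```

After such training, the same weights can be quantized to any bitwidth in `CANDIDATE_BITWIDTHS` at deployment time; the few-shot variants described in the abstract would additionally run inner-loop adaptation steps on the support set before inference.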
