Paper Title

Robust Active Distillation

Paper Authors

Cenk Baykal, Khoa Trinh, Fotis Iliopoulos, Gaurav Menghani, Erik Vee

Paper Abstract

Distilling knowledge from a large teacher model to a lightweight one is a widely successful approach for generating compact, powerful models in the semi-supervised learning setting where a limited amount of labeled data is available. In large-scale applications, however, the teacher tends to provide a large number of incorrect soft-labels that impairs student performance. The sheer size of the teacher additionally constrains the number of soft-labels that can be queried due to prohibitive computational and/or financial costs. The difficulty in achieving simultaneous \emph{efficiency} (i.e., minimizing soft-label queries) and \emph{robustness} (i.e., avoiding student inaccuracies due to incorrect labels) hurts the widespread application of knowledge distillation to many modern tasks. In this paper, we present a parameter-free approach with provable guarantees to query the soft-labels of points that are simultaneously informative and correctly labeled by the teacher. At the core of our work lies a game-theoretic formulation that explicitly considers the inherent trade-off between the informativeness and correctness of input instances. We establish bounds on the expected performance of our approach that hold even in worst-case distillation instances. We present empirical evaluations on popular benchmarks that demonstrate the improved distillation performance enabled by our work relative to that of state-of-the-art active learning and active distillation methods.
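For readers unfamiliar with the setup the abstract builds on, the sketch below shows generic soft-label knowledge distillation: a student is trained to match the teacher's temperature-softened output distribution on unlabeled inputs. This is a minimal PyTorch-style illustration under assumed inputs (`teacher`, `student`, `unlabeled_loader`, `optimizer`, and the temperature value are placeholders), not the selection algorithm proposed in the paper.

```python
# Minimal sketch of soft-label knowledge distillation (generic KD loop,
# not the paper's query-selection method). Assumes the user supplies
# `teacher`, `student`, `unlabeled_loader`, and `optimizer`; the
# temperature and scaling are illustrative choices.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between temperature-softened teacher and student outputs."""
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    teacher_probs = F.softmax(teacher_logits / temperature, dim=-1)
    # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(student_log_probs, teacher_probs,
                    reduction="batchmean") * temperature ** 2

def distill_epoch(student, teacher, unlabeled_loader, optimizer, temperature=2.0):
    """One epoch of training the student on the teacher's soft-labels."""
    teacher.eval()
    student.train()
    for x in unlabeled_loader:              # unlabeled inputs only
        with torch.no_grad():
            teacher_logits = teacher(x)     # query the teacher for soft-labels
        student_logits = student(x)
        loss = distillation_loss(student_logits, teacher_logits, temperature)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

The paper's contribution concerns which unlabeled points to send to the teacher in a loop like this, balancing how informative a point is for the student against how likely the teacher is to soft-label it correctly.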
