Paper Title
$α$QBoost: An Iteratively Weighted Adiabatic Trained Classifier
Paper Authors
Paper Abstract
A new implementation of an adiabatically trained ensemble model is derived that shows significant improvements over classical methods. In particular, empirical results for this new algorithm show that it offers not only higher performance but also greater stability with fewer classifiers, an attribute that is critically important in areas such as explainability and inference speed. In all, the empirical analysis shows that the algorithm can improve performance on unseen data by strengthening the stability of the statistical model, further minimizing and balancing variance and bias, while decreasing the time to convergence relative to its predecessors.
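The abstract does not state the training objective, but adiabatically trained ensembles in the QBoost family are commonly formulated as a QUBO: choose binary inclusion weights for the weak classifiers that minimize a squared ensemble loss plus a sparsity penalty (the penalty is what drives the "fewer classifiers" behavior). The sketch below is illustrative only, not the paper's α-QBoost algorithm: it solves that generic QUBO by exhaustive search in place of an annealer, and the names `qboost_select`, `H`, and `lam` are assumptions for this example.

```python
import itertools

import numpy as np


def qboost_select(H, y, lam=0.05):
    """Exhaustively solve a QBoost-style QUBO for a small ensemble.

    H   : (n_samples, n_clf) array of weak-classifier outputs in {-1, +1}
    y   : (n_samples,) array of true labels in {-1, +1}
    lam : sparsity penalty weight (assumed hyperparameter)

    Returns the binary weight vector w in {0, 1}^n_clf minimizing the
    regularized squared loss, together with that minimal cost. On quantum
    hardware this minimization would be delegated to an annealer; here we
    brute-force all 2^n_clf subsets, which is fine for small n_clf.
    """
    n_clf = H.shape[1]
    best_w, best_cost = None, np.inf
    for bits in itertools.product([0, 1], repeat=n_clf):
        w = np.array(bits)
        k = max(w.sum(), 1)                               # avoid dividing by zero
        pred = H @ w / k                                   # averaged ensemble vote
        cost = np.mean((pred - y) ** 2) + lam * w.sum()    # loss + sparsity term
        if cost < best_cost:
            best_w, best_cost = w, cost
    return best_w, best_cost
```

With a perfect weak classifier in column 0, a flipped one in column 1, and a constant one in column 2, the selector keeps only column 0, illustrating how the penalty favors smaller, more stable ensembles.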