Paper Title


Performance-Agnostic Fusion of Probabilistic Classifier Outputs

Authors

Masakuna, Jordan F., Utete, Simukai W., Kroon, Steve

Abstract


We propose a method for combining the probabilistic outputs of classifiers to make a single consensus class prediction when no further information about the individual classifiers is available, beyond the fact that they have been trained for the same task. The lack of relevant prior information rules out typical applications of Bayesian or Dempster-Shafer methods, and the default approach here would be methods based on the principle of indifference, such as the sum or product rule, which essentially weight all classifiers equally. In contrast, our approach considers the diversity between the outputs of the various classifiers, iteratively updating predictions based on their correspondence with other predictions until the predictions converge to a consensus decision. The intuition behind this approach is that classifiers trained for the same task should typically exhibit regularities in their outputs on a new task; the predictions of classifiers which differ significantly from those of others are thus given less credence by our approach. The approach implicitly assumes a symmetric loss function, in that the relative costs of various prediction errors are not taken into account. The performance of the method is demonstrated on several benchmark datasets. Our proposed method works well in situations where accuracy is the performance metric; however, it does not output calibrated probabilities, so it is not suitable in situations where such probabilities are required for further processing.
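The iterative fusion scheme described in the abstract can be sketched roughly as follows. This is only an illustrative reading of the idea, not the authors' published algorithm: the agreement measure (inverse Euclidean distance to the current weighted consensus), the weight-update rule, and the convergence test are all assumptions made here for concreteness.

```python
import numpy as np

def fuse_predictions(probs, tol=1e-6, max_iter=100):
    """Illustrative sketch of performance-agnostic fusion.

    probs: array of shape (n_classifiers, n_classes), where each row
    is one classifier's probability distribution over the classes.
    Starting from equal weights (the principle of indifference),
    classifiers whose outputs differ from the current weighted
    consensus are down-weighted, the consensus is re-formed, and the
    process repeats until the fused distribution stops changing.
    """
    probs = np.asarray(probs, dtype=float)
    n = probs.shape[0]
    weights = np.full(n, 1.0 / n)      # equal initial weights
    consensus = weights @ probs
    for _ in range(max_iter):
        # Agreement score: inverse distance to the consensus
        # (a hypothetical choice of correspondence measure).
        dists = np.linalg.norm(probs - consensus, axis=1)
        agreement = 1.0 / (1e-12 + dists)
        new_weights = agreement / agreement.sum()
        new_consensus = new_weights @ probs
        if np.linalg.norm(new_consensus - consensus) < tol:
            consensus = new_consensus
            break
        weights, consensus = new_weights, new_consensus
    # Return the fused distribution and the consensus class label.
    return consensus, int(np.argmax(consensus))
```

With three classifiers where two favour class 0 and one dissents, the dissenting classifier is progressively down-weighted and the fused prediction settles on the majority class. Note that, as the abstract warns, the fused vector is not a calibrated probability estimate.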
