Paper Title


Emphasis on the Minimization of False Negatives or False Positives in Binary Classification

Author

Singh, Sanskriti

Abstract


The minimization of specific cases in binary classification, such as false negatives or false positives, grows increasingly important as machine learning is implemented in more products. While a few methods exist to bias a model toward the reduction of a specific type of error, they are not very effective, hence their minimal use in models. To this end, a new method is introduced to reduce false negatives or false positives without drastically changing the model's overall performance or F1 score. The method involves carefully changing the real value of the input after pre-training the model. Results of applying the method to various datasets, some more complex than others, are presented. Through experimentation with multiple model architectures on these datasets, the best model was found. In all models, an increase in recall or precision (the minimization of false negatives or false positives, respectively) was shown without a large drop in F1 score.
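The abstract's evaluation rests on precision (sensitive to false positives), recall (sensitive to false negatives), and the F1 score that balances them. A minimal sketch of these standard metrics is below; it does not reproduce the paper's label-adjustment method, only the quantities it reports.

```python
# Standard binary-classification metrics from 0/1 labels and predictions.
# This illustrates the metrics discussed in the abstract, not the paper's method.

def binary_metrics(y_true, y_pred):
    """Return (precision, recall, f1) for parallel lists of 0/1 values."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # false positives
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # false negatives
    precision = tp / (tp + fp) if (tp + fp) else 0.0  # hurt by false positives
    recall = tp / (tp + fn) if (tp + fn) else 0.0     # hurt by false negatives
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return precision, recall, f1

# Example: one false negative (last item), no false positives.
p, r, f = binary_metrics([1, 0, 1, 1], [1, 0, 1, 0])
# p = 1.0, r = 2/3, f = 0.8
```

A method that minimizes false negatives should raise recall, and one that minimizes false positives should raise precision; the abstract's claim is that either can be done without F1 dropping sharply.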
