Paper title
Softening online extremes organically and at scale
Paper authors
Paper abstract
Calls are escalating for social media platforms to do more to mitigate extreme online communities whose views can lead to real-world harms, e.g., mis/disinformation and distrust that increased Covid-19 fatalities, and now extend to monkeypox, unsafe baby formula alternatives, cancer, abortions, and climate change; white replacement that inspired the 2022 Buffalo shooter and will likely inspire others; anger that threatens elections, e.g., the 2021 U.S. Capitol attack; notions of male supremacy that encourage abuse of women; anti-Semitism, anti-LGBTQ hate and QAnon conspiracies. But should 'doing more' mean doing more of the same, or something different? If so, what? Here we start by showing why platforms doing more of the same will not solve the problem. Specifically, our analysis of nearly 100 million Facebook users entangled over vaccines, and now Covid and beyond, shows that the extreme communities' ecology has a hidden resilience to Facebook's removal interventions; that Facebook's messaging interventions are missing key audience sectors and getting ridiculed; that a key piece of these online extremes' narratives is being mislabeled as incorrect science; and that the threat of censorship is inciting the creation of parallel presences on other platforms with potentially broader audiences. We then demonstrate empirically a new solution that can soften online extremes organically, without having to censor or remove communities or their content, check or correct facts, promote any preventative messaging, or seek a consensus. This solution can be automated at scale across social media platforms quickly and with minimal cost.