Paper Title

Content Planning for Neural Story Generation with Aristotelian Rescoring

Authors

Goldfarb-Tarrant, Seraphina, Chakrabarty, Tuhin, Weischedel, Ralph, Peng, Nanyun

Abstract

Long-form narrative text generated from large language models manages a fluent impersonation of human writing, but only at the local sentence level, and lacks structure or global cohesion. We posit that many of the problems of story generation can be addressed via high-quality content planning, and present a system that focuses on how to learn good plot structures to guide story generation. We utilize a plot-generation language model along with an ensemble of rescoring models that each implement an aspect of good story-writing as detailed in Aristotle's Poetics. We find that stories written with our more principled plot-structure are both more relevant to a given prompt and higher quality than baselines that do not content plan, or that plan in an unprincipled way.
