Paper Title

PlotMachines: Outline-Conditioned Generation with Dynamic Plot State Tracking

Authors

Hannah Rashkin, Asli Celikyilmaz, Yejin Choi, Jianfeng Gao

Abstract

We propose the task of outline-conditioned story generation: given an outline as a set of phrases that describe key characters and events to appear in a story, the task is to generate a coherent narrative that is consistent with the provided outline. This task is challenging as the input only provides a rough sketch of the plot, and thus, models need to generate a story by interweaving the key points provided in the outline. This requires the model to keep track of the dynamic states of the latent plot, conditioning on the input outline while generating the full story. We present PlotMachines, a neural narrative model that learns to transform an outline into a coherent story by tracking the dynamic plot states. In addition, we enrich PlotMachines with high-level discourse structure so that the model can learn different writing styles corresponding to different parts of the narrative. Comprehensive experiments over three fiction and non-fiction datasets demonstrate that large-scale language models, such as GPT-2 and Grover, despite their impressive generation performance, are not sufficient in generating coherent narratives for the given outline, and dynamic plot state tracking is important for composing narratives with tighter, more consistent plots.
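To make the dynamic plot state tracking idea more concrete, the sketch below shows one way such a mechanism could be wired up in PyTorch: each outline phrase gets a memory slot, and after every generated paragraph the slots are refreshed through a learned gate. This is a minimal illustration, not the paper's actual architecture; the class name `PlotStateTracker`, the one-slot-per-phrase layout, and the specific gating formula are assumptions made for the example.

```python
import torch
import torch.nn as nn


class PlotStateTracker(nn.Module):
    """Illustrative gated memory update over outline points (not the paper's exact model).

    Assumption: each outline phrase owns one memory slot; after every generated
    paragraph, slots are interpolated toward new candidate contents via a sigmoid
    gate, so the state can reflect which plot points have already been covered.
    """

    def __init__(self, hidden_size: int):
        super().__init__()
        self.gate = nn.Linear(2 * hidden_size, hidden_size)
        self.update = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, memory: torch.Tensor, paragraph_repr: torch.Tensor) -> torch.Tensor:
        # memory: (num_outline_points, hidden); paragraph_repr: (hidden,)
        expanded = paragraph_repr.unsqueeze(0).expand_as(memory)
        joint = torch.cat([memory, expanded], dim=-1)
        g = torch.sigmoid(self.gate(joint))          # how much each slot should change
        candidate = torch.tanh(self.update(joint))   # proposed new slot contents
        return g * candidate + (1 - g) * memory      # gated interpolation


if __name__ == "__main__":
    hidden = 64
    tracker = PlotStateTracker(hidden)
    memory = torch.randn(5, hidden)       # 5 outline phrases, randomly initialized slots
    paragraph = torch.randn(hidden)       # encoding of the paragraph just generated
    memory = tracker(memory, paragraph)   # updated plot state after one paragraph
    print(memory.shape)                   # torch.Size([5, 64])
```

In the full model, a decoder (e.g., a GPT-2-style language model) would condition on this memory, along with the outline and discourse position, when generating each paragraph; the snippet above only isolates the state-update step.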
