Universitätsbibliothek Heidelberg
Status: Bibliography entry
Location: ---
Copies: ---

+ Other editions/versions
heiBIB
Online resource
Author: Paul, Debjit [author]
Frank, Anette [author]
Title: COINS: dynamically generating COntextualized Inference rules for Narrative Story completion
Statement of responsibility: Debjit Paul, Anette Frank
Electronic publication year: 2021
Year: August 2021
Extent: 14 pages
Notes: Viewed on 10 July 2023
Source title: Contained in: ACL-IJCNLP (2021 : Online): The 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing - proceedings of the conference ; Vol. 1: Long papers
Source place: Stroudsburg, PA : Association for Computational Linguistics (ACL), 2021
Source year: 2021
Source volume/issue: (2021), August, pages 5086-5099
Source ISBN: 978-1-954085-52-7
Abstract: Despite recent successes of large pre-trained language models in solving reasoning tasks, their inference capabilities remain opaque. We posit that such models can be made more interpretable by explicitly generating interim inference rules, and using them to guide the generation of task-specific textual outputs. In this paper we present Coins, a recursive inference framework that i) iteratively reads context sentences, ii) dynamically generates contextualized inference rules, encodes them, and iii) uses them to guide task-specific output generation. We apply Coins to a Narrative Story Completion task that asks a model to complete a story with missing sentences, to produce a coherent story with plausible logical connections, causal relationships, and temporal dependencies. By modularizing inference and sentence generation steps in a recurrent model, we aim to make reasoning steps and their effects on next sentence generation transparent. Our automatic and manual evaluations show that the model generates better story sentences than SOTA baselines, especially in terms of coherence. We further demonstrate improved performance over strong pre-trained LMs in generating commonsense inference rules. The recursive nature of Coins holds the potential for controlled generation of longer sequences.
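
For orientation, the recursive loop described in the abstract (read context, generate and encode inference rules, generate the next sentence) can be sketched as follows. This is a minimal, hypothetical Python sketch; the function names and stub bodies are placeholders, not the authors' implementation, which realizes rule and sentence generation with pre-trained language models.

from typing import List


def generate_rules(context: List[str]) -> List[str]:
    # Placeholder stub: a real system would generate contextualized
    # commonsense inference rules conditioned on the context sentences.
    return [f"inference about: {context[-1]}"]


def encode_rules(rules: List[str]) -> str:
    # Placeholder stub: a real system would encode the rules as model input.
    return " ".join(rules)


def generate_sentence(context: List[str], rule_encoding: str) -> str:
    # Placeholder stub: a real system would condition next-sentence
    # generation on the context and the encoded inference rules.
    return f"[next sentence guided by: {rule_encoding}]"


def complete_story(context: List[str], num_missing: int) -> List[str]:
    """Recursively fill in missing story sentences, one at a time:
    read context -> generate and encode inference rules -> generate sentence."""
    story = list(context)
    generated = []
    for _ in range(num_missing):
        rules = generate_rules(story)
        rule_encoding = encode_rules(rules)
        next_sentence = generate_sentence(story, rule_encoding)
        story.append(next_sentence)   # the new sentence becomes part of the context
        generated.append(next_sentence)
    return generated

The point of the recursion is that each generated sentence is appended to the context before the next inference step, which is what the abstract refers to as guiding next-sentence generation with dynamically generated rules.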
DOI: doi:10.18653/v1/2021.acl-long.395
URL: Please note: This is a bibliography entry. Full-text access for members of the university is only available if a subscription exists for the corresponding journal/edited volume or if the title is Open Access.

Full text: https://doi.org/10.18653/v1/2021.acl-long.395
Full text: https://aclanthology.org/2021.acl-long.395
Medium: Online resource
Language: English
Bibliographic note: Research data: Paul, Debjit: Source code, data and additional material for the thesis: "Social commonsense reasoning with structured knowledge in text"
K10plus-PPN: 1852304804
Links: → Collected volume

Permanent link to this title (bookmarkable): https://katalog.ub.uni-heidelberg.de/titel/69095089