Machine Learning

Sentence Ordering and Coherence Modeling using Recurrent Neural Networks





Modeling the structure of coherent texts is a key NLP problem. The task of coherently organizing a given set of sentences has been commonly used to build and evaluate models that understand such structure. We propose an end-to-end unsupervised deep learning approach based on the set-to-sequence framework to address this problem. Our model strongly outperforms prior methods in the order discrimination task and a novel task of ordering abstracts from scientific articles. Furthermore, our work shows that useful text representations can be obtained by learning to order sentences. Visualizing the learned sentence representations shows that the model captures high-level logical structure in paragraphs. Our representations perform comparably to state-of-the-art pre-training methods on sentence similarity and paraphrase detection tasks.
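For the sentence-ordering task mentioned in the abstract, a standard way to score a predicted order against the gold order is Kendall's tau, which ranges from 1 (identical order) to -1 (fully reversed). The helper below is an illustrative sketch of that metric, not code from the paper.

```python
from itertools import combinations

def kendall_tau(predicted, gold):
    """Kendall's tau between a predicted sentence order and the gold order.

    Both arguments are permutations of the same sentence indices.
    tau = (concordant pairs - discordant pairs) / total pairs.
    """
    n = len(gold)
    pos = {s: i for i, s in enumerate(predicted)}  # where each sentence landed
    concordant = discordant = 0
    for a, b in combinations(gold, 2):  # every pair, in gold order
        if pos[a] < pos[b]:
            concordant += 1
        else:
            discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

print(kendall_tau([0, 1, 2, 3], [0, 1, 2, 3]))  # 1.0 (perfect ordering)
print(kendall_tau([3, 2, 1, 0], [0, 1, 2, 3]))  # -1.0 (fully reversed)
```

An ordering model is then judged by its average tau over a test set of shuffled paragraphs; a single swapped adjacent pair in a four-sentence paragraph, for example, already drops tau from 1.0 to about 0.67.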

Sentence Ordering and Coherence Modeling using Recurrent Neural Networks
by Lajanugen Logeswaran, Honglak Lee, Dragomir Radev
https://arxiv.org/pdf/1611.02654v2.pdf
