A neural network model for low-resource universal dependency parsing

Long Duong, Trevor Cohn, Steven Bird, Paul Cook

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper published in Proceedings

Abstract

Accurate dependency parsing requires large treebanks, which are only available for a few languages. We propose a method that takes advantage of shared structure across languages to build a mature parser using less training data. Our model learns a shared "universal" parser that operates over an interlingual continuous representation of language, together with language-specific mapping components. Compared with supervised learning, our method gives a consistent 8-10% improvement across several treebanks in low-resource simulations.
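To illustrate the sharing idea in the abstract, the following is a minimal sketch, not the authors' implementation: language-specific components map each language's words into a common continuous space, while a single set of parser parameters (here a toy bilinear arc scorer standing in for the full neural parser) is shared across all languages. All names, vocabulary sizes, and dimensions below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

SHARED_DIM = 8  # dimensionality of the interlingual space (illustrative)

# Language-specific mapping components: each language projects its own
# words into the shared "universal" space (hypothetical vocab sizes).
lang_embeddings = {
    "en": rng.normal(size=(100, SHARED_DIM)),  # 100-word English vocab
    "fr": rng.normal(size=(80, SHARED_DIM)),   # 80-word French vocab
}

# Shared parser parameters, used for every language.
W_arc = rng.normal(size=(SHARED_DIM, SHARED_DIM))

def score_arc(lang: str, head_id: int, dep_id: int) -> float:
    """Score a head -> dependent arc in the shared space."""
    E = lang_embeddings[lang]
    return float(E[head_id] @ W_arc @ E[dep_id])

# The same shared scorer applies regardless of source language, so
# supervision in a high-resource language can shape W_arc for all of them.
s_en = score_arc("en", head_id=3, dep_id=7)
s_fr = score_arc("fr", head_id=3, dep_id=7)
```

In the paper's actual setting the shared component is a neural dependency parser trained jointly, and cross-lingual transfer comes from updating the shared parameters with treebanks from multiple languages; this sketch only shows the parameter-sharing structure.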

Original language: English
Title of host publication: Conference Proceedings - EMNLP 2015
Subtitle of host publication: Conference on Empirical Methods in Natural Language Processing
Place of publication: Lisbon, Portugal
Publisher: Association for Computational Linguistics (ACL)
Pages: 339-348
Number of pages: 10
ISBN (Electronic): 9781941643327
Publication status: Published - 2015
Externally published: Yes
Event: Conference on Empirical Methods in Natural Language Processing, EMNLP 2015 - Lisbon, Portugal
Duration: 17 Sep 2015 to 21 Sep 2015

Conference

Conference: Conference on Empirical Methods in Natural Language Processing, EMNLP 2015
Country: Portugal
City: Lisbon
Period: 17/09/15 to 21/09/15


Cite this

Duong, L., Cohn, T., Bird, S., & Cook, P. (2015). A neural network model for low-resource universal dependency parsing. In Conference Proceedings - EMNLP 2015: Conference on Empirical Methods in Natural Language Processing (pp. 339-348). Association for Computational Linguistics (ACL).