A neural network model for low-resource universal dependency parsing

Long Duong, Trevor Cohn, Steven Bird, Paul Cook

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper published in Proceedings › Research › peer-review

Abstract

Accurate dependency parsing requires large treebanks, which are only available for a few languages. We propose a method that takes advantage of shared structure across languages to build a mature parser using less training data. We propose a model for learning a shared "universal" parser that operates over an interlingual continuous representation of language, along with language-specific mapping components. Compared with supervised learning, our methods give a consistent 8-10% improvement across several treebanks in low-resource simulations.
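
For readers skimming the abstract, the following minimal sketch illustrates the general idea described there: language-specific mapping components project each language's inputs into a shared "interlingual" space, over which a single shared parser component operates. This is an illustrative PyTorch sketch under assumed names and dimensions, not the authors' implementation.

# Illustrative sketch only: a shared transition classifier with
# language-specific input mappings. Names, dimensions, and the toy
# feature pooling are assumptions for exposition.
import torch
import torch.nn as nn

class SharedUniversalParser(nn.Module):
    def __init__(self, vocab_sizes, emb_dim=50, shared_dim=100, n_actions=3):
        super().__init__()
        # Language-specific components: one embedding table and one mapping per language.
        self.embeddings = nn.ModuleDict(
            {lang: nn.Embedding(size, emb_dim) for lang, size in vocab_sizes.items()}
        )
        self.mappings = nn.ModuleDict(
            {lang: nn.Linear(emb_dim, shared_dim) for lang in vocab_sizes}
        )
        # Shared component operating on the interlingual representation.
        self.shared = nn.Sequential(
            nn.Linear(shared_dim, shared_dim),
            nn.Tanh(),
            nn.Linear(shared_dim, n_actions),  # e.g. SHIFT / LEFT-ARC / RIGHT-ARC
        )

    def forward(self, lang, token_ids):
        # Map language-specific embeddings into the shared space,
        # then score parser transitions with the shared network.
        emb = self.embeddings[lang](token_ids).mean(dim=1)  # crude feature pooling
        interlingual = torch.tanh(self.mappings[lang](emb))
        return self.shared(interlingual)

# Toy usage: a high-resource source language plus a low-resource target language.
model = SharedUniversalParser({"en": 10000, "xx": 2000})
scores = model("xx", torch.randint(0, 2000, (4, 8)))  # batch of 4 configurations, 8 tokens each
print(scores.shape)  # torch.Size([4, 3])

Because the scoring network is shared, treebank data from the high-resource language helps train the parameters used for the low-resource language; only the mapping layers remain language-specific.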

Original language: English
Title of host publication: Conference Proceedings - EMNLP 2015
Subtitle of host publication: Conference on Empirical Methods in Natural Language Processing
Place of publication: Lisbon, Portugal
Publisher: Association for Computational Linguistics (ACL)
Pages: 339-348
Number of pages: 10
ISBN (Electronic): 9781941643327
Publication status: Published - 2015
Externally published: Yes
Event: Conference on Empirical Methods in Natural Language Processing, EMNLP 2015 - Lisbon, Portugal
Duration: 17 Sep 2015 - 21 Sep 2015

Conference

Conference: Conference on Empirical Methods in Natural Language Processing, EMNLP 2015
Country: Portugal
City: Lisbon
Period: 17/09/15 - 21/09/15

Fingerprint

Supervised learning
Neural networks

Cite this

Duong, L., Cohn, T., Bird, S., & Cook, P. (2015). A neural network model for low-resource universal dependency parsing. In Conference Proceedings - EMNLP 2015: Conference on Empirical Methods in Natural Language Processing (pp. 339-348). Lisbon, Portugal: Association for Computational Linguistics (ACL).
Duong, Long; Cohn, Trevor; Bird, Steven; Cook, Paul. / A neural network model for low-resource universal dependency parsing. Conference Proceedings - EMNLP 2015: Conference on Empirical Methods in Natural Language Processing. Lisbon, Portugal: Association for Computational Linguistics (ACL), 2015. pp. 339-348
@inproceedings{f3cc60e8135448c7b4ce3c7144168468,
title = "A neural network model for low-resource universal dependency parsing",
abstract = "Accurate dependency parsing requires large treebanks, which are only available for a few languages. We propose a method that takes advantage of shared structure across languages to build a mature parser using less training data. We propose a model for learning a shared {"}universal{"} parser that operates over an interlingual continuous representation of language, along with language-specific mapping components. Compared with supervised learning, our methods give a consistent 8-10{\%} improvement across several treebanks in low-resource simulations.",
author = "Long Duong and Trevor Cohn and Steven Bird and Paul Cook",
year = "2015",
language = "English",
pages = "339--348",
booktitle = "Conference Proceedings - EMNLP 2015",
publisher = "Association for Computational Linguistics (ACL)",

}

Duong, L, Cohn, T, Bird, S & Cook, P 2015, A neural network model for low-resource universal dependency parsing. in Conference Proceedings - EMNLP 2015: Conference on Empirical Methods in Natural Language Processing. Association for Computational Linguistics (ACL), Lisbon, Portugal, pp. 339-348, Conference on Empirical Methods in Natural Language Processing, EMNLP 2015, Lisbon, Portugal, 17/09/15.

A neural network model for low-resource universal dependency parsing. / Duong, Long; Cohn, Trevor; Bird, Steven; Cook, Paul.

Conference Proceedings - EMNLP 2015: Conference on Empirical Methods in Natural Language Processing. Lisbon, Portugal: Association for Computational Linguistics (ACL), 2015. p. 339-348.

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper published in Proceedings › Research › peer-review

TY - GEN
T1 - A neural network model for low-resource universal dependency parsing
AU - Duong, Long
AU - Cohn, Trevor
AU - Bird, Steven
AU - Cook, Paul
PY - 2015
Y1 - 2015
N2 - Accurate dependency parsing requires large treebanks, which are only available for a few languages. We propose a method that takes advantage of shared structure across languages to build a mature parser using less training data. We propose a model for learning a shared "universal" parser that operates over an interlingual continuous representation of language, along with language-specific mapping components. Compared with supervised learning, our methods give a consistent 8-10% improvement across several treebanks in low-resource simulations.
AB - Accurate dependency parsing requires large treebanks, which are only available for a few languages. We propose a method that takes advantage of shared structure across languages to build a mature parser using less training data. We propose a model for learning a shared "universal" parser that operates over an interlingual continuous representation of language, along with language-specific mapping components. Compared with supervised learning, our methods give a consistent 8-10% improvement across several treebanks in low-resource simulations.
UR - http://www.scopus.com/inward/record.url?scp=84959925904&partnerID=8YFLogxK
UR - http://www.emnlp2015.org/proceedings/EMNLP/
M3 - Conference Paper published in Proceedings
SP - 339
EP - 348
BT - Conference Proceedings - EMNLP 2015
PB - Association for Computational Linguistics (ACL)
CY - Lisbon, Portugal
ER -

Duong L, Cohn T, Bird S, Cook P. A neural network model for low-resource universal dependency parsing. In Conference Proceedings - EMNLP 2015: Conference on Empirical Methods in Natural Language Processing. Lisbon, Portugal: Association for Computational Linguistics (ACL). 2015. p. 339-348