An alternative method of training probabilistic LR parsers

Abstract

We discuss existing approaches to training LR parsers, which have been used for statistical resolution of structural ambiguity. These approaches are non-optimal, in the sense that certain probability distributions cannot be obtained: some distributions expressible in terms of a context-free grammar cannot be expressed in terms of the LR parser constructed from that grammar, under the restrictions imposed by the existing training methods. We present an alternative way of training that is provably optimal, and that allows all probability distributions expressible in the context-free grammar to be carried over to the LR parser. We also demonstrate empirically that this kind of training can be applied effectively to a large treebank.
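To make the notion of "carrying over" a distribution concrete, here is a minimal sketch in standard PCFG / probabilistic-LR notation; the symbols p_G, q, f and tau are illustrative and do not come from the paper's text above. A PCFG scores a derivation by the product of its rule probabilities, while a probabilistic LR parser scores the corresponding sequence of shift/reduce actions by the product of its action probabilities:

\[
p_G(d) \;=\; \prod_{A \to \alpha} p(A \to \alpha)^{\,f(A \to \alpha,\, d)}
\qquad\text{and}\qquad
p_{LR}(d) \;=\; \prod_{\tau} q(\tau)^{\,f(\tau,\, d)},
\]

where \(d\) ranges over derivations, \(f(\cdot, d)\) counts occurrences of a rule (respectively an LR action \(\tau\)) in \(d\), \(p\) assigns rule probabilities and \(q\) assigns action probabilities. Training is optimal in the sense of the abstract if, for every proper assignment \(p\), there exists an assignment \(q\) with \(p_{LR}(d) = p_G(d)\) for every derivation \(d\); under the restrictions of earlier training schemes (for instance, requiring the action probabilities available in each LR state to sum to one), some assignments \(p\) may have no such \(q\).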

Citation (APA)

Nederhof, M. J., & Satta, G. (2004). An alternative method of training probabilistic LR parsers. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 550–557). Association for Computational Linguistics (ACL). https://doi.org/10.3115/1218955.1219025
