How to train your multi bottom-up tree transducer
Maletti, Andreas

Article Structure

Abstract

The local multi bottom-up tree transducer is introduced and related to the (noncontiguous) synchronous tree sequence substitution grammar.

Introduction

A (formal) translation model is at the core of every machine translation system.

Notation

The set of nonnegative integers is N. We write [n] for the set {i | 1 ≤ i ≤ n}. We treat functions as special relations.

The model

In this section, we recall particular multi bottom-up tree transducers, which have been introduced by Arnold and Dauchet (1982) and Lilin (1981).

Rule extraction and training

In this section, we will show how to automatically obtain an LMBOT from a bi-parsed, word-aligned parallel corpus.

Preservation of regularity

Clearly, LMBOT are not symmetric.

Conclusion

We have introduced a simple restriction of multi bottom-up tree transducers.

Topics

translation models

Appears in 5 sentences as: translation model (2) translation models (3)
In How to train your multi bottom-up tree transducer
  1. A (formal) translation model is at the core of every machine translation system.
    Page 1, “Introduction”
  2. (1990) discuss automatically trainable translation models in their seminal paper.
    Page 1, “Introduction”
  3. In contrast, in the field of syntax-based machine translation, the translation models have full access to the syntax of the sentences and can base their decision on it.
    Page 1, “Introduction”
  4. In this paper, we deal exclusively with syntax-based translation models such as synchronous tree substitution grammars (STSG), multi bottom-up tree transducers (MBOT), and synchronous tree-sequence substitution grammars (STSSG).
    Page 1, “Introduction”
  5. Preservation of regularity is an important property for a number of translation model manipulations.
    Page 7, “Preservation of regularity”


machine translation

Appears in 4 sentences as: machine translation (4)
In How to train your multi bottom-up tree transducer
  1. A (formal) translation model is at the core of every machine translation system.
    Page 1, “Introduction”
  2. In contrast, in the field of syntax-based machine translation, the translation models have full access to the syntax of the sentences and can base their decision on it.
    Page 1, “Introduction”
  3. In this contribution, we restrict MBOT to a form that is particularly relevant in machine translation .
    Page 1, “Introduction”
  4. (2008) argue that STSG have sufficient expressive power for syntax-based machine translation, but Zhang et al.
    Page 5, “The model”


subtrees

Appears in 3 sentences as: subtrees (3)
In How to train your multi bottom-up tree transducer
  1. We also need a substitution that replaces subtrees.
    Page 2, “Notation”
  2. Then t[pᵢ ← tᵢ | 1 ≤ i ≤ n] denotes the tree that is obtained from t by replacing (in parallel) the subtrees at pᵢ by tᵢ for every i ∈ [n].
    Page 2, “Notation”
  3. This is necessary because those two mentioned subtrees must reproduce t₁ and t₂ from the end of the ‘X’-chain.
    Page 8, “Preservation of regularity”
