Jointly Optimizing a Two-Step Conditional Random Field Model for Machine Transliteration and Its Fast Decoding Algorithm
Yang, Dong and Dixon, Paul and Furui, Sadaoki

Article Structure

Abstract

This paper presents a joint optimization method of a two-step conditional random field (CRF) model for machine transliteration and a fast decoding algorithm for the proposed method.

Introduction

There are more than 6,000 languages in the world, and 10 of them have more than 100 million native speakers.

Two-step CRF method

2.1 CRF introduction

Joint optimization and its fast decoding algorithm

3.1 Joint optimization

Rapid development of a JSCM system

The JSCM represents how the source words and target names are generated simultaneously (Li et al., 2004):
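
The excerpt stops at the model equation. As a hedged reconstruction based on Li et al. (2004), not quoted from this paper, the n-gram joint source-channel model factors the joint probability of a source word S and a target name T over their aligned transliteration units <s, t>:

    P(S, T) = P(<s,t>1, <s,t>2, ..., <s,t>K) ≈ ∏_{i=1..K} P(<s,t>i | <s,t>i-n+1, ..., <s,t>i-1)

where K is the number of aligned units and n is the n-gram order.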

Experiments

We use several metrics from (Li et al., 2009) to measure the performance of our system.

Conclusions and future work

In this paper we have presented our new joint optimization method for a two-step CRF model and its fast decoding algorithm.

Topics

CRF

Appears in 34 sentences as: CRF (41) CRF” (1)
In Jointly Optimizing a Two-Step Conditional Random Field Model for Machine Transliteration and Its Fast Decoding Algorithm
  1. This paper presents a joint optimization method of a two-step conditional random field ( CRF ) model for machine transliteration and a fast decoding algorithm for the proposed method.
    Page 1, “Abstract”
  2. In the two-step CRF model, the first CRF segments an input word into chunks and the second one converts each chunk into one unit in the target language (a toy sketch of this formalization follows this list).
    Page 1, “Abstract”
  3. In the “NEWS 2009 Machine Transliteration Shared Task”, a new two-step CRF model for the transliteration task was proposed (Yang et al., 2009), in which the first step is to segment a word in the source language into character chunks and the second step is to perform a context-dependent mapping from each chunk into one written unit in the target language.
    Page 2, “Introduction”
  4. In this paper, we propose to jointly optimize a two-step CRF model.
    Page 2, “Introduction”
  5. The rest of this paper is organized as follows: Section 2 explains the two-step CRF method, followed by Section 3 which describes our joint optimization method and its fast decoding algorithm; Section 4 introduces a rapid implementation of a JSCM system in the weighted finite state transducer (WFST) framework; and the last section reports the experimental results and conclusions.
    Page 2, “Introduction”
  6. 2.1 CRF introduction
    Page 2, “Two-step CRF method”
  7. CRF training is usually performed through the L-BFGS algorithm (Wallach, 2002) and decoding is performed by the Viterbi algorithm.
    Page 2, “Two-step CRF method”
  8. We formalize machine transliteration as a CRF tagging problem, as shown in Figure 2.
    Page 2, “Two-step CRF method”
  9. Figure 2: A pictorial description of a CRF segmenter and a CRF converter
    Page 2, “Two-step CRF method”
  10. 2.2 CRF segmenter
    Page 2, “Two-step CRF method”
  11. In the CRF, a feature function describes a co-occurrence relation, and it is usually a binary function, taking the value 1 when both an observation and a label transition are observed.
    Page 2, “Two-step CRF method”
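
The sentences above cast transliteration as two chained CRF tagging problems: a segmenter over characters, then a converter over chunks. Below is a minimal Python sketch of that formalization, not the authors' code: character-context features for the segmenter and the conversion of its predicted B/I tags into chunks that the converter would then label with target-language units. The feature names, the B/I tag set, and the toy word are illustrative assumptions; a real system would train such features with a CRF toolkit via L-BFGS and decode with Viterbi, as noted in sentence 7.

    def segmenter_features(word):
        """Step 1: one observation-feature dict per character of the source word."""
        feats = []
        for i, ch in enumerate(word):
            feats.append({
                "char": ch,                                            # current character
                "prev": word[i - 1] if i > 0 else "<s>",               # left context
                "next": word[i + 1] if i < len(word) - 1 else "</s>",  # right context
            })
        return feats

    def tags_to_chunks(word, tags):
        """Turn the segmenter's predicted B/I tags into character chunks."""
        chunks = []
        for ch, tag in zip(word, tags):
            if tag == "B" or not chunks:
                chunks.append(ch)       # start a new chunk
            else:
                chunks[-1] += ch        # extend the current chunk
        return chunks

    # Step 2 would label each chunk with one written unit of the target language,
    # again as a context-dependent CRF tagging task over the chunk sequence.
    word = "timothy"
    tags = ["B", "I", "B", "I", "B", "I", "I"]   # hypothetical segmenter output
    print(tags_to_chunks(word, tags))            # -> ['ti', 'mo', 'thy']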

segmentations

Appears in 7 sentences as: segmentations (7)
In Jointly Optimizing a Two-Step Conditional Random Field Model for Machine Transliteration and Its Fast Decoding Algorithm
  1. The joint optimization considers all the segmentation possibilities and sums the probability over all the alternative segmentations which generate the same output.
    Page 3, “Joint optimization and its fast decoding algorithm”
  2. However, exact inference by listing all possible candidates explicitly and summing over all possible segmentations is intractable, because the computational complexity grows exponentially with the length of the source word.
    Page 3, “Joint optimization and its fast decoding algorithm”
  3. In the segmentation step, the number of possible segmentations is 2^N, where N is the length of the source word and 2 is the size of the tagging set.
    Page 3, “Joint optimization and its fast decoding algorithm”
  4. Is it really necessary to perform the second CRF for all the segmentations?
    Page 3, “Joint optimization and its fast decoding algorithm”
  5. If we can guarantee that, even after performing the 2nd CRF decoding for all the remaining segmentations Ak+1, Ak+2, ..., AN, the top-1 candidate does not change, then we can stop decoding (a sketch of this early-stopping check follows this list).
    Page 3, “Joint optimization and its fast decoding algorithm”
  6. The CRF segmentation provides a list of segmentations A: A1, A2, ..., AN, with conditional probabilities P(A1|S), P(A2|S), ..., P(AN|S).
    Page 5, “Conclusions and future work”
  7. If we continue performing the CRF conversion to cover all N (N ≥ k) segmentations, eventually we will get:
    Page 5, “Conclusions and future work”
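
The sentences above describe both the marginalization over segmentations and the early-stopping test. Below is a minimal Python sketch of that idea, not the paper's implementation: conversion probabilities are summed over every segmentation that yields the same output, and decoding stops once the unprocessed segmentation probability mass can no longer change the top-1 candidate. The function names and the exact form of the stopping bound are assumptions consistent with the description.

    def joint_decode(seg_hyps, convert):
        """seg_hyps: list of (segmentation, P(Ai|S)), sorted by probability, descending.
        convert(seg): the second CRF, returning a list of (T, P(T|S, Ai)) pairs."""
        scores = {}        # P(T|S) accumulated over the segmentations processed so far
        used_mass = 0.0    # total segmentation probability already processed
        for seg, p_seg in seg_hyps:
            for target, p_conv in convert(seg):
                scores[target] = scores.get(target, 0.0) + p_seg * p_conv
            used_mass += p_seg
            remaining = max(0.0, 1.0 - used_mass)   # mass of the segmentations not yet converted
            ranked = sorted(scores.values(), reverse=True)
            runner_up = ranked[1] if len(ranked) > 1 else 0.0
            # No candidate can gain more than `remaining` from the unconverted
            # segmentations, so once the leader is ahead by more than that, the
            # top-1 output is fixed and decoding can stop early.
            if ranked[0] > runner_up + remaining:
                break
        return max(scores, key=scores.get)

Because the hypotheses are processed in order of decreasing P(Ai|S), the bound is intended to trigger after only a few conversions, which is where the claimed speed-up comes from.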

conditional probabilities

Appears in 4 sentences as: conditional probabilities (2) conditional probability (2)
In Jointly Optimizing a Two-Step Conditional Random Field Model for Machine Transliteration and Its Fast Decoding Algorithm
  1. From the two-step CRF model we get the conditional probability PCRF(T|S) and from the JSCM we get the joint probability P(S, T).
    Page 4, “Experiments”
  2. The conditional probability PJSCM(T|S) can be calculated as follows (see the note after this list):
    Page 4, “Experiments”
  3. The CRF segmentation provides a list of segmentations A: A1, A2, ..., AN, with conditional probabilities P(A1|S), P(A2|S), ..., P(AN|S).
    Page 5, “Conclusions and future work”
  4. The CRF conversion, given a segmentation Ai, provides a list of transliteration outputs T1, T2, ..., TM, with conditional probabilities P(T1|S,Ai), P(T2|S,Ai), ..., P(TM|S,Ai).
    Page 5, “Conclusions and future work”
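
The equation referred to in sentence 2 is not included in these excerpts. A hedged reconstruction, assuming the standard normalization of a joint model and not quoted from the paper, is

    PJSCM(T|S) = P(S, T) / P(S) ≈ P(S, T) / Σ_T' P(S, T')

where the sum runs over the JSCM's candidate outputs T' for the source word S; this conditional score could then be combined with PCRF(T|S) from the two-step model.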
