Abstract | We investigate different ways of combining the inflection prediction component with the SMT system by training the base MT system on fully inflected forms or on word stems.
Inflection prediction models | The dependency structure on the Russian side, indicated by solid arcs, is given by a treelet MT system (see Section 4.1), projected from the word dependency structure.
Integration of inflection models with MT systems | The methods differ in the extent to which the factoring of the problem into two subproblems — predicting stems and predicting inflections — is reflected in the base MT systems.
Integration of inflection models with MT systems | In the first method, the MT system is trained to produce fully inflected target words and the inflection model can change the inflections. |
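The first integration method above amounts to a post-hoc correction step: the base system outputs fully inflected words, and the inflection model may overwrite each form. A minimal sketch, assuming toy stand-ins for the stemmer and the inflection model (the names `reinflect`, `stem_of`, and `predict_inflection` are illustrative, not from the cited work):

```python
# Hypothetical sketch of post-hoc inflection correction: the base MT
# system outputs inflected words; an inflection model may replace them.
def reinflect(translation, stem_of, predict_inflection):
    """Replace each output word with the inflection model's prediction
    for its stem; fall back to the original form if there is none."""
    out = []
    for word in translation:
        stem = stem_of(word)
        out.append(predict_inflection(stem) or word)
    return out

# Toy stand-ins for the stemmer and the inflection model:
stems = {"столы": "стол"}
preds = {"стол": "стола"}
reinflect(["столы"], lambda w: stems.get(w, w), preds.get)
```

The fallback to the original surface form keeps the correction step conservative: words for which the model has no prediction pass through unchanged.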
Introduction | (Finkel et al., 2006), and in some cases, to factor the translation problem so that the baseline MT system can take advantage of the reduction in sparsity by being able to work on word stems. |
Machine translation systems and data | This is a syntactically-informed MT system, designed following (Quirk et al., 2005).
Machine translation systems and data | For each language pair, we used a set of parallel sentences (train) for training the MT system sub-models (e.g., phrase tables, language model), a set of parallel sentences (lambda) for training the combination weights with max-BLEU training, a set of parallel sentences (dev) for training a small number of combination parameters for our integration methods (see Section 5), and a set of parallel sentences (test) for final evaluation. |
Machine translation systems and data | All MT systems for a given language pair used the same datasets. |
Related work | In recent work, Koehn and Hoang (2007) proposed a general framework for including morphological features in a phrase-based SMT system: words are factored into vectors of morphological features, and the phrase-based MT system can operate on any of the factored representations. This framework is implemented in the Moses system.
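The factored representation mentioned here can be pictured as each token carrying several parallel annotation layers. A minimal sketch, assuming tokens are given as (surface, stem, morphology-tag) triples; the pipe-separated encoding mirrors the factor format used by Moses, while the helper name `to_factors` is illustrative:

```python
# Hypothetical sketch of a factored word representation in the style of
# factored phrase-based models: each word is a vector of factors
# (surface form, stem, morphology tag), encoded pipe-separated.
def to_factors(tokens):
    """tokens: list of (surface, stem, morph) triples -> factored string."""
    return " ".join("|".join(t) for t in tokens)

to_factors([("houses", "house", "NNS"), ("ran", "run", "VBD")])
```

A downstream system can then choose which factor sequence to translate on, e.g. stems rather than surface forms, which is exactly the sparsity reduction the factoring is meant to provide.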
Related work | This also makes the model portable and applicable to different types of MT systems.
Related work | In contrast, we focus on methods of integration of an inflection prediction model with an MT system, and on evaluation of the model’s impact on translation.
Abstract | We propose and extensively evaluate a simple method for using alignment models to produce alignments better-suited for phrase-based MT systems, and show significant gains (as measured by BLEU score) in end-to-end translation systems for six language pairs used in recent MT competitions.
Adding agreement constraints | Most MT systems train an alignment model in each direction and then heuristically combine their predictions. |
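The heuristic combination of the two directional alignments can be sketched as simple set operations over alignment links. A minimal sketch, assuming each directional alignment is a set of (i, j) index pairs; the function name `symmetrize` is illustrative, and real systems typically grow the intersection toward the union (e.g. grow-diag-final) rather than stopping at these two extremes:

```python
# Hypothetical sketch of symmetrizing two directional word alignments.
def symmetrize(src2tgt, tgt2src):
    """Combine forward and reverse alignments (sets of (i, j) links).

    Returns the intersection (high precision) and union (high recall);
    common heuristics interpolate between the two."""
    forward = set(src2tgt)
    reverse = {(i, j) for (j, i) in tgt2src}  # flip reverse-direction links
    intersection = forward & reverse
    union = forward | reverse
    return intersection, union

inter, uni = symmetrize({(0, 0), (1, 2)}, {(0, 0), (2, 1), (3, 3)})
```

The intersection keeps only links both models agree on, while the union keeps every proposed link; phrase extraction behaves quite differently depending on which point in between the heuristic chooses.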
Conclusions | The nature of the complicated relationship between word alignments, the corresponding extracted phrases and the effects on the final MT system still begs for better explanations and metrics. |
Introduction | used, we can get not only improvements in alignment performance, but also in the performance of the MT system that uses those alignments. |
Discussion | This dependency LM can also be used in hierarchical MT systems using lexicalized CFG trees.
Experiments | filtered: a string-to-string MT system as in the baseline.
Introduction | In Section 4, we describe the implementation details of our MT system.