Abstract | By reformulating the problem in the linear programming framework, TESLA-CELAB addresses several drawbacks of character-level metrics, in particular the modeling of synonyms spanning multiple characters. |
Experiments | This gives CELAB the ability to detect word-level synonyms and turns TESLA-CELAB into a linear programming based character-level metric. |
Motivation | We formulate the n-gram matching process as a real-valued linear programming problem, which can be solved efficiently. |
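A minimal sketch of this formulation on a toy instance (the n-grams and similarity weights below are hypothetical, not taken from the paper): each hypothesis n-gram may be matched to at most one reference n-gram and vice versa, and the objective maximizes the total matched weight. Because this constraint system is of transportation type (totally unimodular), the real-valued LP optimum coincides with the best integral matching, so a brute-force search over integral matchings recovers the LP optimum on a small instance.

```python
from itertools import product

# Hypothetical similarity weights: w[i][j] = similarity of
# hypothesis n-gram i to reference n-gram j.
w = [
    [1.0, 0.2, 0.0],
    [0.3, 0.9, 0.1],
]

def best_matching(w):
    """Maximize the total matched weight subject to each hypothesis
    n-gram matching at most one reference n-gram and vice versa.
    The integral optimum found here equals the real-valued LP optimum
    because the constraint matrix is totally unimodular."""
    n_hyp, n_ref = len(w), len(w[0])
    best = 0.0
    # Assign each hypothesis n-gram to a reference n-gram, or to -1 (unmatched).
    for assign in product(range(-1, n_ref), repeat=n_hyp):
        used = [a for a in assign if a >= 0]
        if len(used) != len(set(used)):   # a reference n-gram matched twice
            continue
        score = sum(w[i][a] for i, a in enumerate(assign) if a >= 0)
        best = max(best, score)
    return best

print(best_matching(w))  # 1.0 + 0.9 = 1.9
```

In practice an off-the-shelf LP solver replaces the enumeration; the point here is only the shape of the variables and constraints.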
The Algorithm | The linear programming problem is mathematically described as follows. |
The Algorithm | The linear programming solver may come up with any of the solutions in which the matched n-gram weights satisfy the constraints. |
The Algorithm | The n-gram matching is carried out in the linear programming problem itself. |
Background | They proved that this optimization problem, which we term Max-Trans-Graph, is NP-hard, and so described it as an Integer Linear Program (ILP). |
Background | In LP relaxation, the integrality constraint x_ij ∈ {0, 1} is replaced by 0 ≤ x_ij ≤ 1, transforming the problem from an ILP into a Linear Program (LP), which can be solved in polynomial time. |
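A toy illustration of why the relaxation can differ from the ILP (this instance is hypothetical and unrelated to Max-Trans-Graph itself): for maximize x1 + x2 + x3 subject to xi + xj ≤ 1 for each pair, the ILP optimum is 1, while the fractional point (0.5, 0.5, 0.5) is feasible for the relaxation and scores 1.5.

```python
from itertools import product

# Toy problem: maximize x1 + x2 + x3
# subject to x1 + x2 <= 1, x2 + x3 <= 1, x1 + x3 <= 1.
pairs = [(0, 1), (1, 2), (0, 2)]

def feasible(x):
    return all(x[i] + x[j] <= 1 for i, j in pairs)

# ILP: x_i in {0, 1} -- small enough to solve by enumeration.
ilp_opt = max(sum(x) for x in product([0, 1], repeat=3) if feasible(x))

# LP relaxation: 0 <= x_i <= 1 admits the fractional point (0.5, 0.5, 0.5),
# which strictly beats the integral optimum.
frac = (0.5, 0.5, 0.5)

print(ilp_opt, sum(frac))  # 1 1.5
```

The gap between the two values (the integrality gap) is exactly what one gives up, in the worst case, by solving the polynomial-time relaxation instead of the NP-hard ILP.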
Introduction | Since finding the optimal set of edges respecting transitivity is NP-hard, they employed Integer Linear Programming (ILP) to find the exact solution. |
Abstract | However, they are better suited to a word-based model; thus, an integer linear programming (ILP) formulation is proposed. |
Abstract | In recent work, interesting results have been reported for applications of integer linear programming (ILP) such as semantic role labeling (SRL) (Roth and Yih, 2005) and dependency parsing (Martins et al., 2009), among others. |
Abstract | We propose an Integer Linear Programming (ILP) formulation of word segmentation, which is naturally viewed as a word-based model for CWS. |
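A minimal sketch of what such a word-based ILP can look like (the sentence, candidate words, and scores below are illustrative assumptions, not the paper's actual model): each candidate word span gets a 0/1 indicator, every character position must be covered by exactly one chosen word, and the objective maximizes the total score of the chosen words. A tiny instance can be solved by enumerating the indicators.

```python
from itertools import product

# Hypothetical candidates: (start, end, score) spans over the sentence,
# with end exclusive.  Scores are made up for illustration.
sentence = "abcd"
cands = [(0, 2, 2.0), (0, 1, 0.5), (1, 2, 0.5),
         (2, 4, 2.0), (2, 3, 0.5), (3, 4, 0.5), (0, 3, 1.0)]

def segment(sentence, cands):
    """ILP by enumeration: choose y_c in {0, 1} per candidate so that
    every character position is covered exactly once, maximizing the
    total score of the chosen words."""
    n = len(sentence)
    best, best_words = None, None
    for y in product([0, 1], repeat=len(cands)):
        cover = [0] * n
        for yi, (s, e, _) in zip(y, cands):
            if yi:
                for p in range(s, e):
                    cover[p] += 1
        if any(c != 1 for c in cover):   # exact-cover constraint violated
            continue
        total = sum(sc for yi, (_, _, sc) in zip(y, cands) if yi)
        if best is None or total > best:
            best = total
            best_words = [sentence[s:e] for yi, (s, e, _) in zip(y, cands) if yi]
    return best, best_words

print(segment(sentence, cands))  # (4.0, ['ab', 'cd'])
```

Treating segmentation as an exact cover over candidate words is what makes this a word-based model: the unit of decision is the word span, not the individual character boundary.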