Index of papers in Proc. ACL that mention

**linear programming**

Abstract | We define a graph structure over predicates that represents entailment relations as directed edges, and use a global transitivity constraint on the graph to learn the optimal set of edges, by formulating the optimization problem as an Integer Linear Program.

Background | In this paper we tackle a similar problem of learning a transitive relation, but we use linear programming.

Background | A Linear Program (LP) is an optimization problem, where a linear function is minimized (or maximized) under linear constraints. |

Background | If the variables are integers, the problem is termed an Integer Linear Program (ILP).
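
The LP/ILP distinction in the excerpts above can be seen in a few lines with an off-the-shelf solver. This is a minimal sketch of my own (not code from any of the indexed papers), using SciPy's HiGHS backend: the same one-variable problem is solved as a plain LP and again with an integrality constraint.

```python
# Minimal LP vs. ILP contrast (illustrative sketch, not from the papers).
import numpy as np
from scipy.optimize import linprog, milp, LinearConstraint

# Maximize y subject to 2y <= 3, y >= 0 (SciPy minimizes, so negate).
c = np.array([-1.0])

# Plain LP: the optimum may be fractional.
lp = linprog(c, A_ub=[[2.0]], b_ub=[3.0], bounds=[(0, None)])
print(lp.x)  # [1.5]

# ILP: the same problem with y restricted to integers.
ilp = milp(c, constraints=LinearConstraint([[2.0]], -np.inf, 3.0),
           integrality=[1])
print(ilp.x)  # [1.]
```

The integer restriction is the only difference between the two calls, yet it changes the optimum from 1.5 to 1, which is why ILPs are in general much harder to solve than LPs.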

Introduction | The optimization problem is formulated as an Integer Linear Program (ILP) and solved with an ILP solver. |

Learning Entailment Graph Edges | Since we seek a global solution under transitivity and other constraints, linear programming is a natural choice, enabling the use of state-of-the-art optimization packages.

Learning Entailment Graph Edges | We describe two formulations of integer linear programs that learn the edges: one maximizing a global score function, and another maximizing a global probability function. |

Learning Entailment Graph Edges | Since the variables are binary, both formulations are integer linear programs with O(|V|²) variables and O(|V|³) transitivity constraints that can be solved using standard ILP packages.

linear programming is mentioned in 14 sentences in this paper.

Topics mentioned in this paper:

- distributional similarity (15)
- WordNet (15)
- Linear Program (14)

Abstract | By reformulating the problem in the linear programming framework, TESLA—CELAB addresses several drawbacks of the character-level metrics, in particular the modeling of synonyms spanning multiple characters. |

Experiments | TESLA-CELAB’s ability to detect word-level synonyms turns it into a linear programming based character-level metric.

Motivation | We formulate the n-gram matching process as a real-valued linear programming problem, which can be solved efficiently. |

The Algorithm | The linear programming problem is mathematically described as follows. |

The Algorithm | The linear programming solver may come up with any of several equally optimal solutions.

The Algorithm | n-gram matching in the linear programming problem itself. |

linear programming is mentioned in 9 sentences in this paper.

Topics mentioned in this paper:

- BLEU (19)
- word-level (15)
- n-grams (12)

Decomposed Amortized Inference | The goal is to solve an integer linear program q, which is defined as |

Introduction | In these problems, the inference problem has been framed as an integer linear program (ILP). |

Margin-based Amortization | Let p denote an inference problem posed as an integer linear program belonging to an equivalence class [P] with optimal solution y_p.

Margin-based Amortization | Even though the theorem provides a condition for two integer linear programs to have the same solution, checking the validity of the condition requires the computation of A, which in itself is another integer linear program.

Problem Definition and Notation | The language of 0-1 integer linear programs (ILP) provides a convenient analytical tool for representing structured prediction problems. |

Problem Definition and Notation | Let the set P = {p1, p2, ...} denote previously solved inference problems, along with their respective solutions {y_p1, y_p2, ...}. An equivalence class of integer linear programs, denoted by [P], consists of ILPs which have the same number of inference variables and the same feasible set.

linear programming is mentioned in 6 sentences in this paper.

Topics mentioned in this paper:

- ILP (18)
- linear program (6)
- role labeling (6)

Abstract | We formulate the problem of non-projective dependency parsing as a polynomial-sized integer linear program.

Abstract | The model parameters are learned in a max-margin framework by employing a linear programming relaxation. |

Dependency Parsing as an ILP | A linear program (LP) is an optimization problem of the form |

Dependency Parsing as an ILP | If we add the constraint x ∈ Z^d, then the above is called an integer linear program (ILP).

Introduction | Much attention has recently been devoted to integer linear programming (ILP) formulations of NLP problems, with interesting results in applications like semantic role labeling (Roth and Yih, 2005; Punyakanok et al., 2004), dependency parsing (Riedel and Clarke, 2006), word alignment for machine translation (Lacoste-Julien et al., 2006), summarization (Clarke and Lapata, 2008), and coreference resolution (Denis and Baldridge, 2007), among others. |

linear programming is mentioned in 5 sentences in this paper.

Topics mentioned in this paper:

- dependency parsing (21)
- ILP (19)
- dependency tree (14)

Abstract | Experiments show that these approximation strategies produce results comparable to a state-of-the-art integer linear programming formulation for the same joint inference task along with a significant improvement in runtime. |

Conclusion | Experiments show that one of these approximation strategies produces results comparable to a state-of-the-art integer linear program for the same joint inference task with a 60% reduction in average inference time. |

Introduction | Joint methods have also been proposed that invoke integer linear programming (ILP) formulations to simultaneously consider multiple structural inference problems—both over n-grams and input dependencies (Martins and Smith, 2009) or n-grams and all possible dependencies (Thadani and McKeown, 2013). |

Introduction | We therefore consider methods to recover approximate solutions for the subproblem of finding the maximum weighted subtree in a graph, common among which is the use of a linear programming relaxation. |

Introduction | This linear program (LP) appears empirically tight for compression problems and our experiments indicate that simply using the non-integral solutions of this LP in Lagrangian relaxation can empirically lead to reasonable compressions. |

linear programming is mentioned in 5 sentences in this paper.

Topics mentioned in this paper:

- ILP (19)
- bigram (13)
- spanning tree (9)

Abstract | We use integer linear programming (ILP) to model the inference process, which can easily incorporate both the power of existing error classifiers and prior knowledge on grammatical error correction. |

Conclusion | The inference problem is solved using integer linear programming.

Inference with First Order Variables | The inference problem for grammatical error correction can be stated as follows: “Given an input sentence, choose a set of corrections which results in the best output sentence.” In this paper, this problem will be expressed and solved by integer linear programming (ILP). |

Inference with First Order Variables | The ILP problem is solved using lp_solve, an integer linear programming solver based on the revised simplex method and the branch-and-bound method for integers.

Related Work | Integer linear programming has been successfully applied to many NLP tasks, such as dependency parsing (Riedel and Clarke, 2006; Martins et al., 2009), semantic role labeling (Punyakanok et al., 2005), and event extraction (Riedel and McCallum, 2011).

linear programming is mentioned in 5 sentences in this paper.

Topics mentioned in this paper:

- ILP (33)
- shared task (11)
- objective function (9)

Abstract | We also provide a computational model for automatically annotating text using this coding scheme, using supervised learning enhanced by constraints implemented with Integer Linear Programming . |

Background | In section 4.3 we formalize these constraints using Integer Linear Programming . |

Background | 4.3 Constraints using Integer Linear Programming |

Background | We formulate our constraints using Integer Linear Programming (ILP). |

Introduction | These constraints are formulated as boolean statements describing what a correct label sequence looks like, and are imposed on our model using an Integer Linear Programming formulation (Roth and Yih, 2004). |

linear programming is mentioned in 5 sentences in this paper.

Topics mentioned in this paper:

- ILP (8)
- discourse structure (7)
- bag-of-words (6)

A Simple Lagrangian Relaxation Algorithm | There are close connections between Lagrangian relaxation and linear programming relaxations. |

Experiments | LR = Lagrangian relaxation; DP = exhaustive dynamic programming; ILP = integer linear programming; LP = linear programming (LP does not recover an exact solution). |

Experiments | Figure 5 gives information on decoding time for our method and two other exact decoding methods: integer linear programming (using constraints D0-D6), and exhaustive dynamic programming using the construction of (Bar-Hillel et al., 1964).

Experiments | Figure 5 also gives a speed comparison of our method to a linear programming (LP) solver that solves the LP relaxation defined by constraints D0-D6.

Introduction | The dual corresponds to a particular linear programming (LP) relaxation of the original decoding problem. |

linear programming is mentioned in 5 sentences in this paper.

Topics mentioned in this paper:

- language model (19)
- dynamic programming (14)
- bigram (9)

Connotation Induction Algorithms | We develop induction algorithms based on three distinct types of algorithmic framework that have been shown successful for the analogous task of sentiment lexicon induction: HITS & PageRank (§2.1), Label/Graph Propagation (§2.2), and Constraint Optimization via Integer Linear Programming (§2.3). |

Connotation Induction Algorithms | Addressing limitations of graph-based algorithms (§2.2), we propose an induction algorithm based on Integer Linear Programming (ILP). |

Precision, Coverage, and Efficiency | We therefore explore an alternative approach based on Linear Programming in what follows. |

Precision, Coverage, and Efficiency | 4.1 Induction using Linear Programming |

Precision, Coverage, and Efficiency | One straightforward option for a Linear Programming formulation may seem to be using the same Integer Linear Programming formulation introduced in §2.3, only changing the variable definitions to be real values in [0, 1] rather than integers.

linear programming is mentioned in 5 sentences in this paper.

Topics mentioned in this paper:

- ILP (13)
- sentiment lexicon (11)
- Turkers (9)

Abstract | We augment the standard perceptron algorithm with a global integer linear programming formulation to optimize both local fit of information into each topic and global coherence across the entire overview. |

Introduction | We estimate the parameters of our model using the perceptron algorithm augmented with an integer linear programming (ILP) formulation, run over a training set of example articles in the given domain. |

Method | To select the optimal excerpts, we employ integer linear programming (ILP). |

Method | **Solving the ILP** Solving an integer linear program is NP-hard (Cormen et al., 1992); however, in practice there exist several strategies for solving certain ILPs efficiently.

linear programming is mentioned in 4 sentences in this paper.

Topics mentioned in this paper:

- ILP (9)
- perceptron (7)
- cosine similarity (5)

Abstract | Our method is based on a probabilistic model that feeds weights into integer linear programs that leverage type signatures of relational phrases and type correlation or disjointness constraints.

Candidate Types for Entities | 4.3 Integer Linear Program Formulation |

Candidate Types for Entities | Our solution is formalized as an Integer Linear Program (ILP). |

Introduction | For cleaning out false hypotheses among the type candidates for a new entity, we devised probabilistic models and an integer linear program that considers incompatibilities and correlations among entity types. |

linear programming is mentioned in 4 sentences in this paper.

Topics mentioned in this paper:

- ILP (18)
- fine-grained (7)
- knowledge base (7)

Experiments | First, we run a relaxed linear programming (LP) parser, then project the (possibly fractional) parses back to the feasible region. |

Introduction | At each node, our relaxation derives a linear programming problem (LP) that can be efficiently solved by the dual simplex method. |

Related Work | Several integer linear programming (ILP) formulations of dependency parsing (Riedel and Clarke, 2006; Martins et al., 2009; Riedel et al., 2012) inspired our definition of grammar induction as an MP.

Relaxations | We replace our objective Σ_m θ_m f_m with Σ_m z_m, where we would like to constrain each auxiliary variable z_m to be = θ_m f_m or (equivalently) ≤ θ_m f_m, but instead settle for making it ≤ the concave envelope, a linear programming problem.

linear programming is mentioned in 4 sentences in this paper.

Topics mentioned in this paper:

- Viterbi (24)
- model parameters (22)
- grammar induction (7)

Computing the Metric | We can get an optimal solution using integer linear programming (ILP). |

Introduction | We investigate how to compute this metric and provide several practical and replicable computing methods by using Integer Linear Programming (ILP) and hill-climbing method. |

Using Smatch | 0 ILP: Integer Linear Programming |

linear programming is mentioned in 3 sentences in this paper.

Topics mentioned in this paper:

- ILP (9)
- semantic parsing (7)
- evaluation metric (3)

Abstract | However, they are better applied to a word-based model; thus an integer linear programming (ILP) formulation is proposed.

Abstract | In recent work, interesting results are reported for applications of integer linear programming (ILP) such as semantic role labeling (SRL) (Roth and Yih, 2005), dependency parsing (Martins et al., 2009) and so on. |

Abstract | We propose an Integer Linear Programming (ILP) formulation of word segmentation, which is naturally viewed as a word-based model for CWS. |

linear programming is mentioned in 3 sentences in this paper.

Topics mentioned in this paper:

- POS tagging (31)
- ILP (20)
- Viterbi (18)

Background | proved that this optimization problem, which we term Max-Trans-Graph, is NP-hard, and so described it as an Integer Linear Program (ILP). |

Background | In LP relaxation, the constraint x_ij ∈ {0, 1} is replaced by 0 ≤ x_ij ≤ 1, transforming the problem from an ILP to a Linear Program (LP), which is polynomial.
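
The relaxation step described in this excerpt is mechanical in code: keep the same objective and constraints and drop the integrality requirement. Below is a small sketch on an illustrative problem of my own (not from the paper) where the LP relaxation is strictly looser than the ILP.

```python
# LP relaxation of an ILP: replace x in {0,1} with 0 <= x <= 1.
# On this odd-cycle packing problem the relaxation is strictly looser.
import numpy as np
from scipy.optimize import linprog, milp, LinearConstraint, Bounds

# Maximize x1 + x2 + x3 subject to pairwise sums <= 1 (negate to minimize).
c = -np.ones(3)
A = np.array([[1, 1, 0], [0, 1, 1], [1, 0, 1]], dtype=float)
cons = LinearConstraint(A, -np.inf, 1.0)

ilp = milp(c, constraints=cons, integrality=np.ones(3), bounds=Bounds(0, 1))
lp = linprog(c, A_ub=A, b_ub=np.ones(3), bounds=[(0, 1)] * 3)

print(-ilp.fun)  # 1.0  (one variable set to 1)
print(-lp.fun)   # 1.5  (fractional optimum x = [0.5, 0.5, 0.5])
```

When the relaxation happens to return an integral solution it is also optimal for the ILP; when it is fractional, as here, the LP value only bounds the ILP value.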

Introduction | Since finding the optimal set of edges respecting transitivity is NP-hard, they employed Integer Linear Programming (ILP) to find the exact solution. |

linear programming is mentioned in 3 sentences in this paper.

Topics mentioned in this paper:

- ILP (18)
- objective function (5)
- distributional similarity (3)

Abstract | Using an integer linear programming formulation, the model learns to select and combine phrases subject to length, coverage and grammar constraints. |

Conclusions | Grammaticality, length and coverage requirements are encoded as constraints in an integer linear program.

Introduction | We encode these constraints through the use of integer linear programming (ILP), a well-studied optimization framework that is able to search the entire solution space efficiently. |

linear programming is mentioned in 3 sentences in this paper.

Topics mentioned in this paper:

- ILP (33)
- sentence compression (8)
- phrase-based (7)

Background | Formally, from equation 3, the most probable interpretation is the one that minimizes Σ_r λ_r (d(r))^p. In case of p = 1, and given that all d(r) are linear equations, minimizing the sum requires solving a linear program, which, compared to inference in other probabilistic logics such as MLNs, can be done relatively efficiently using well-established techniques.

Introduction | On the other hand, inference in PSL reduces to a linear programming problem, which is theoretically and practically much more efficient. |

PSL for STS | PSL’s inference is actually an iterative process where in each iteration a grounding phase is followed by an optimization phase (solving the linear program).

linear programming is mentioned in 3 sentences in this paper.

Topics mentioned in this paper:

- logical forms (6)
- semantic similarity (6)
- similarity score (6)

Adding Back Constraints | In general, it can be shown that Lagrangian relaxation is only guaranteed to solve a linear programming relaxation of the underlying combinatorial problem. |

Related Work | General linear programming approaches have also been applied to word alignment problems. |

Related Work | (2006) formulate the word alignment problem as a quadratic assignment problem and solve it using an integer linear programming solver.

linear programming is mentioned in 3 sentences in this paper.

Topics mentioned in this paper:

- word alignment (15)
- Viterbi (14)
- sentence pairs (11)

Abstract | We combine this knowledge with discourse information, and formulate the debate side classification task as an Integer Linear Programming problem. |

Introduction | This information is employed, in conjunction with discourse information, in an Integer Linear Programming (ILP) framework. |

Method | We formulate the problem of finding the overall side of the post as an Integer Linear Programming (ILP) problem. |

linear programming is mentioned in 3 sentences in this paper.

Topics mentioned in this paper:

- F-measure (4)
- Linear Programming (3)
- Semantic Relatedness (3)