Index of papers in Proc. ACL 2014 that mention
  • optimization problem
Anzaroot, Sam and Passos, Alexandre and Belanger, David and McCallum, Andrew
Abstract
Moreover, with a technique for performing inference given soft constraints, it is easy to automatically generate large families of constraints and learn their costs with a simple convex optimization problem during training.
Background
The MAP inference task in a CRF can be expressed as an optimization problem with a lin-
Background
Furthermore, a subgradient of D(λ) is Ay* − b, for a y* which maximizes this inner optimization problem.
Soft Constraints in Dual Decomposition
Consider optimization problems of the form: max.
Soft Constraints in Dual Decomposition
This optimization problem can still be solved with projected subgradient descent and is depicted in Algorithm 2.
Soft Constraints in Dual Decomposition
Each penalty c_i has to be nonnegative; otherwise, the optimization problem in equation (5) is ill-defined.
optimization problem is mentioned in 6 sentences in this paper.
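The projected-subgradient scheme the quotes above describe (a subgradient of the dual D(λ) given by Ay* − b, with nonnegative, bounded penalties) can be sketched as follows. This is a minimal illustration, not the paper's Algorithm 2; the MAP oracle, matrix `A`, vector `b`, box bound `c`, step size, and iteration count are all stand-ins:

```python
import numpy as np

def projected_subgradient(map_oracle, A, b, c, steps=100, eta=0.1):
    """Minimize a dual D(lambda) by projected subgradient descent.

    At each step, a subgradient of D at lambda is A @ y_star - b, where
    y_star solves the inner MAP maximization under the current duals;
    soft-constraint penalties keep each dual coordinate in the box
    [0, c] (illustrative sketch only).
    """
    lam = np.zeros(A.shape[0])
    for _ in range(steps):
        y_star = map_oracle(lam)              # inner maximization (MAP)
        g = A @ y_star - b                    # subgradient of the dual
        lam = np.clip(lam - eta * g, 0.0, c)  # project onto the box
    return lam
```

With a constant oracle the update reduces to a clipped fixed step per coordinate, which makes the projection onto the box easy to see.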
Cohen, Shay B. and Collins, Michael
Introduction
We show that once the matrix decomposition step has been applied, parameter estimation of the L-PCFG can be reduced to a convex optimization problem that is easily solved by EM.
The Learning Algorithm for L-PCFGs
Crucially, this is a convex optimization problem, and the EM algorithm will converge to the global maximum of this likelihood function.
The Learning Algorithm for L-PCFGs
Now consider the optimization problem in Eq.
The Learning Algorithm for L-PCFGs
al., 2012) gives one set of guarantees; the remaining optimization problems we solve are convex maximum-likelihood problems, which are also relatively easy to analyze.
The Matrix Decomposition Algorithm
This is again a convex optimization problem.
optimization problem is mentioned in 5 sentences in this paper.
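The quotes above refer to EM converging to the global maximum because the paper's likelihood, after matrix decomposition, is convex. The sketch below only illustrates the generic EM mechanics (E-step responsibilities, M-step re-estimation) on a two-component, unit-variance, equal-weight Gaussian mixture; that classic setting is not convex and is not the paper's L-PCFG objective, so treat it purely as a reminder of how EM iterates:

```python
import numpy as np

def em_two_gaussians(x, m1=0.0, m2=1.0, iters=50):
    """Generic EM for a two-component Gaussian mixture with unit
    variance and equal weights, updating only the means.
    Illustration of EM mechanics only, not the paper's algorithm."""
    x = np.asarray(x, dtype=float)
    for _ in range(iters):
        # E-step: posterior responsibility of component 1 for each point
        p1 = np.exp(-0.5 * (x - m1) ** 2)
        p2 = np.exp(-0.5 * (x - m2) ** 2)
        r1 = p1 / (p1 + p2)
        # M-step: means become responsibility-weighted averages
        m1 = np.sum(r1 * x) / np.sum(r1)
        m2 = np.sum((1 - r1) * x) / np.sum(1 - r1)
    return m1, m2
```

On well-separated data the responsibilities saturate quickly and the means settle at the per-cluster averages.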
Chen, Liwei and Feng, Yansong and Huang, Songfang and Qin, Yong and Zhao, Dongyan
Introduction
We formalize this procedure as a constrained optimization problem , which can be solved by many optimization frameworks.
The Framework
This is an NP-hard optimization problem .
The Framework
After the optimization problem is solved, we will obtain a list of selected candidate relations for each tuple, which will be our final output.
optimization problem is mentioned in 3 sentences in this paper.
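The constrained selection the quotes describe (pick candidate relations per tuple subject to constraints; NP-hard in general) is typically handed to an ILP solver. The sketch below is a brute-force stand-in with hypothetical confidence scores, pairwise-conflict constraints, and a size limit, none of which come from the paper:

```python
from itertools import combinations

def select_relations(scores, conflicts, k):
    """Pick a subset of candidate relations maximizing total confidence,
    subject to pairwise-conflict constraints and at most k selections.
    Brute force over subsets; real systems use an ILP solver since the
    general problem is NP-hard (illustrative sketch only)."""
    cands = list(scores)
    best, best_val = frozenset(), 0.0
    for r in range(1, k + 1):
        for subset in combinations(cands, r):
            # Skip subsets that violate any pairwise-conflict constraint.
            if any((a, b) in conflicts or (b, a) in conflicts
                   for a in subset for b in subset if a != b):
                continue
            val = sum(scores[c] for c in subset)
            if val > best_val:
                best, best_val = frozenset(subset), val
    return best
```

For example, with hypothetical candidates {born_in: 0.9, died_in: 0.4, capital_of: 0.5} and a conflict between born_in and died_in, the best size-2 selection keeps born_in and capital_of.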
Flanigan, Jeffrey and Thomson, Sam and Carbonell, Jaime and Dyer, Chris and Smith, Noah A.
Abstract
The method is based on a novel algorithm for finding a maximum spanning, connected subgraph, embedded within a Lagrangian relaxation of an optimization problem that imposes linguistically inspired constraints.
Related Work
Extensions allow for higher-order (non-edge-local) features, often making use of relaxations to solve the NP-hard optimization problem.
Relation Identification
We frame the task as a constrained combinatorial optimization problem .
optimization problem is mentioned in 3 sentences in this paper.
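One simple way to realize the "maximum spanning, connected subgraph" idea the abstract mentions: keep every positive-weight edge, then connect the remaining components greedily with the highest-weight leftover edges, Kruskal-style. This sketch assumes a plain undirected weighted graph; it captures the flavor of the idea, not the paper's constrained, Lagrangian-relaxed algorithm:

```python
def max_spanning_connected_subgraph(n, edges):
    """Given n nodes and (u, v, w) undirected edges, keep all
    positive-weight edges, then add the best remaining edges to merge
    components (union-find), yielding a connected subgraph of high
    total weight. Illustrative sketch, not the paper's algorithm."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    chosen, rest = [], []
    # Step 1: every positive-weight edge is always worth keeping.
    for u, v, w in edges:
        if w > 0:
            chosen.append((u, v, w))
            parent[find(u)] = find(v)
        else:
            rest.append((u, v, w))
    # Step 2: connect remaining components with the best leftover edges.
    for u, v, w in sorted(rest, key=lambda e: -e[2]):
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            chosen.append((u, v, w))
    return chosen
```

Step 2 only pays for an edge when it actually merges two components, so no negative-weight edge is added unless connectivity requires it.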
Parikh, Ankur P. and Cohen, Shay B. and Xing, Eric P.
Abstract
Unfortunately, finding the global maximum for these objective functions is usually intractable (Cohen and Smith, 2012), which often leads to severe local optima problems (but see Gormley and Eisner, 2013).
Abstract
Directly attempting to maximize the likelihood unfortunately results in an intractable optimization problem, and greedy heuristics are often employed (Harmeling and Williams, 2011).
Abstract
Therefore, the procedure to find a bracketing for a given POS tag sequence x is to first estimate the distance matrix sub-block from raw text data (see §3.4), and then solve the optimization problem arg min_{u∈U} using a variant of the Eisner-Satta algorithm.
optimization problem is mentioned in 3 sentences in this paper.