Index of papers in Proc. ACL that mention
  • soft constraints
Anzaroot, Sam and Passos, Alexandre and Belanger, David and McCallum, Andrew
Abstract
Previous work has shown that modeling soft constraints, where the model is encouraged, but not required, to obey the constraints, can substantially improve segmentation performance.
Abstract
We extend dual decomposition to perform prediction subject to soft constraints.
Abstract
Moreover, with a technique for performing inference given soft constraints, it is easy to automatically generate large families of constraints and learn their costs with a simple convex optimization problem during training.
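As a sketch (our notation, not the paper's), the soft-constrained MAP problem described in these snippets can be written with per-constraint violation costs c_i ≥ 0:

\[
\max_{y} \; w^{\top} f(x, y) \;-\; \sum_{i} c_i \, \mathbb{1}\big[\, y \text{ violates constraint } i \,\big]
\]

Learning the costs c_i is then the convex optimization problem mentioned above.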
Background
The overall modeling technique we employ is to add soft constraints to a simple model for which we have an existing efficient prediction algorithm.
Introduction
On the other hand, recent work has demonstrated improvements in citation field extraction by imposing soft constraints (Chang et al., 2012).
Introduction
This paper introduces a novel method for imposing soft constraints via dual decomposition.
Introduction
We also propose a method for learning the penalties the prediction problem incurs for violating these soft constraints.
Soft Constraints in Dual Decomposition
We now introduce an extension of Algorithm 1 to handle soft constraints.
Soft Constraints in Dual Decomposition
Note that when performing MAP subject to soft constraints, optimal solutions might not satisfy some constraints, since doing so would reduce the model’s score by too much.
Soft Constraints in Dual Decomposition
In other words, the dual problem cannot penalize the violation of a constraint more than the primal soft-constraint model would penalize that violation.
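A minimal sketch of this idea, assuming a hypothetical map_oracle and violation functions (a simplified single-oracle view, not the paper's exact algorithm): the dual variables are projected onto [0, c_i], so, per the snippet above, the dual penalty can never exceed the primal one.

    import numpy as np

    def soft_dd(map_oracle, violations, c, iters=100, eta=0.1):
        """map_oracle(lam) returns the argmax structure under the
        lambda-penalized model score; violations[i](y) is 1 if y
        violates constraint i, else 0; c[i] is the primal penalty."""
        lam = np.zeros(len(violations))
        y = None
        for _ in range(iters):
            y = map_oracle(lam)
            g = np.array([v(y) for v in violations], dtype=float)
            if not g.any():  # all constraints satisfied: done
                break
            # projected subgradient step on the dual, boxed into [0, c]
            lam = np.clip(lam + eta * g, 0.0, c)
        return y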
“soft constraints” is mentioned in 25 sentences in this paper.
Chen, Yanping and Zheng, Qinghua and Zhang, Wei
Abstract
In this paper, we propose an Omni-word feature and a soft constraint method for Chinese relation extraction.
Abstract
In order to utilize the structure information of a relation instance, we discuss how the soft constraint can be used to capture the local dependency.
Abstract
Both the Omni-word feature and the soft constraint make better use of sentence information and minimize the influence of Chinese word segmentation and parsing errors.
Feature Construction
The soft constraint is the
Feature Construction
Where unambiguous, we also use the term “soft constraint” to denote the features generated by the employed constraint conditions.
Feature Construction
3.2 Soft Constraint
Introduction
and Soft Constraint
Introduction
Based on the characteristics of Chinese, this paper proposes an Omni-word feature and a soft constraint method for Chinese relation extraction.
Introduction
To address the loose structure of Chinese, we utilize the soft constraint to capture the local dependency in a relation instance.
Related Work
The soft constraints proposed in this paper are combined features, analogous to such syntactic or semantic constraints; they are discussed in Section 3.2.
Related Work
In our research, we proposed an Omni-word feature and a soft constraint method.
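Purely as an illustration (the paper's exact feature templates may differ), combined soft-constraint features of this kind can be produced by binding each lexical feature to its position relative to the two entity mentions, one simple way to encode the local dependency:

    def soft_constraint_features(tokens, e1_end, e2_start):
        """tokens: words/characters of the sentence; e1_end and
        e2_start: hypothetical indices delimiting the two mentions."""
        feats = []
        for i, w in enumerate(tokens):
            if i < e1_end:
                pos = "BEFORE_E1"
            elif i < e2_start:
                pos = "BETWEEN"
            else:
                pos = "AFTER_E2"
            feats.append(f"{pos}={w}")  # combined position+lexical feature
        return feats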
“soft constraints” is mentioned in 19 sentences in this paper.
Li, Junhui and Marton, Yuval and Resnik, Philip and Daumé III, Hal
Abstract
We develop novel features based on both models and use them as soft constraints to guide the translation process.
Experiments
Our stronger baseline employs, in addition, the fine-grained syntactic soft constraint features of Marton and Resnik (2008), hereafter MR08.
Experiments
The syntactic soft constraint features include both MR08 exact-matching and cross-boundary constraints (denoted XP= and XP+).
Experiments
This suggests that our syntactic reordering features interact well with the MR08 syntactic soft constraints: the XP+ and XP= features focus on a single constituent each, while our reordering features focus on a pair of constituents each.
Related Work
Another approach in previous work added soft constraints as weighted features in the SMT decoder to reward good reorderings and penalize bad ones.
Related Work
In (Mylonakis and Sima’an, 2011), the rules are sparser than SCFG with nameless non-terminals (i.e., Xs) and soft constraints.
Related Work
In the soft constraint or reordering model approach, Liu and Gildea (2010) modeled the reordering/deletion of source-side semantic roles in a tree-to-string translation model.
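In decoders of this kind, soft constraints are simply additional weighted feature functions in the standard log-linear model; sketching it with Marton and Resnik's XP+ feature, which counts phrases crossing constituent boundaries:

\[
\hat{e} = \arg\max_{e,\,d} \; \sum_{k} \lambda_k \, h_k(e, d, f) \;+\; \lambda_{\mathrm{XP+}} \, h_{\mathrm{XP+}}(d, f)
\]

where f is the source sentence and d the derivation; since λ_XP+ is tuned like any other weight, violations are penalized rather than forbidden.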
“soft constraints” is mentioned in 7 sentences in this paper.
Cherry, Colin
Abstract
To further increase flexibility, we incorporate cohesion as a decoder feature, creating a soft constraint.
Cohesive Decoding
3.2 Soft Constraint
Conclusion
Our soft constraint produced improvements ranging between 0.5 and 1.1 BLEU points on sentences for which the baseline produces uncohesive translations.
Discussion
The cohesive system, even with a soft constraint, cannot reproduce the same movement, and returns a less grammatical translation.
Experiments
…similarly, with the soft constraint providing more stable, and generally better, results.
Experiments
We confirmed these trends on our test set, but to conserve space, we provide detailed results for only the soft constraint .
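A simplified sketch of how such a cohesion soft constraint can be computed as a decoder feature (interfaces hypothetical; the actual interruption check is defined over the source dependency tree during decoding): the count enters the log-linear score with a tuned weight, so cohesion is encouraged rather than enforced.

    def cohesion_violations(coverage_order, subtree_spans):
        """coverage_order: source positions in the order the decoder
        translates them; subtree_spans: (lo, hi) source spans, one per
        subtree. Counts how often a subtree is left and re-entered."""
        violations = 0
        for lo, hi in subtree_spans:
            steps = [t for t, j in enumerate(coverage_order) if lo <= j <= hi]
            # a gap between consecutive translation times means the
            # decoder left the subtree and came back: one violation
            violations += sum(1 for a, b in zip(steps, steps[1:]) if b - a > 1)
        return violations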
“soft constraints” is mentioned in 6 sentences in this paper.
Yang, Bishan and Cardie, Claire
Approach
Therefore all the constraints are applied as soft constraints .
Approach
The differences are: (1) we encode the coreference relations as soft constraints during learning instead of applying them as hard constraints at inference time; (2) our constraints can apply to both polar and non-polar sentences; (3) our identification of coreference relations is automatic, without any fine-grained annotations for opinion targets.
Introduction
In this paper, we propose a sentence-level sentiment classification method that can (1) incorporate rich discourse information at both local and global levels; (2) encode discourse knowledge as soft constraints during learning; (3) make use of unlabeled data to enhance learning.
Introduction
Specifically, we use the Conditional Random Field (CRF) model as the learner for sentence-level sentiment classification, and incorporate rich discourse and lexical knowledge as soft constraints into the learning of CRF parameters via Posterior Regularization (PR) (Ganchev et al., 2010).
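For reference, the slack-penalized (soft) Posterior Regularization objective of Ganchev et al. (2010), the mechanism named here, has the form

\[
\min_{q,\; \xi \ge 0} \; \mathrm{KL}\big(q(y)\,\|\,p_\theta(y \mid x)\big) + \sigma \|\xi\| \quad \text{s.t.} \quad \mathbb{E}_q[\phi(x, y)] - b \le \xi ,
\]

so the discourse and lexical constraints bias the CRF posterior without being hard requirements.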
Related Work
(2013) explored the use of explanatory discourse relations as soft constraints in a Markov Logic Network framework for extracting subjective text segments.
Related Work
It has the advantages of utilizing rich discourse knowledge at different levels of context and encoding it as soft constraints during learning.
“soft constraints” is mentioned in 6 sentences in this paper.
Zhang, Duo and Mei, Qiaozhu and Zhai, ChengXiang
Abstract
Specifically, we propose a new topic model called Probabilistic Cross-Lingual Latent Semantic Analysis (PCLSA) which extends the Probabilistic Latent Semantic Analysis (PLSA) model by regularizing its likelihood function with soft constraints defined based on a bilingual dictionary.
Introduction
PCLSA extends the Probabilistic Latent Semantic Analysis (PLSA) model by regularizing its likelihood function with soft constraints defined based on a bilingual dictionary.
Introduction
In our model, since we only add a soft constraint on word pairs in the dictionary, their probabilities in common topics are generally different, naturally capturing the different variations of a common topic in different languages.
Probabilistic Cross-Lingual Latent Semantic Analysis
We achieve this by adding such preferences formally to the likelihood function of a probabilistic topic model as “soft constraints” so that when we estimate the model, we would try to not only fit the text data well (which is necessary to extract coherent component topics from each language), but also satisfy our specified preferences (which would ensure the extracted component topics in different languages are semantically related).
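One plausible instantiation of such a soft-constrained likelihood (a sketch; the paper's exact regularizer may differ) mixes the PLSA log-likelihood with a penalty pulling dictionary-linked word pairs toward similar topic probabilities:

\[
\mathcal{L}(\Theta) = (1-\lambda)\,\log p(\mathcal{C} \mid \Theta) \;-\; \lambda \sum_{(u,v) \in \mathcal{D}} \sum_{j} \big( p(u \mid \theta_j) - p(v \mid \theta_j) \big)^2
\]

where D is the set of bilingual dictionary pairs and λ trades data fit against constraint satisfaction.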
Related Work
…incorporating the knowledge of a bilingual dictionary as soft constraints.
“soft constraints” is mentioned in 5 sentences in this paper.
Feng, Minwei and Peter, Jan-Thorsten and Ney, Hermann
Comparative Study
Similar to the previous model, the SRL information is used as soft constraints .
Comparative Study
In Section 3.6 of (Zhang, 2013), instead of making hard reordering decisions, the author uses the rules as soft constraints in the decoder.
Conclusion
The model is utilized as soft constraints in the decoder.
Introduction
(Feng et al., 2012) present a method that utilizes predicate-argument structures from semantic role labeling results as soft constraints .
“soft constraints” is mentioned in 4 sentences in this paper.
Li, Qi and Ji, Heng
Abstract
In addition, by virtue of the inexact search, we developed a number of new and effective global features as soft constraints to capture the interdependency among entity mentions and relations.
Features
Relation arcs can also share inter-dependencies or obey soft constraints .
Introduction
We design a set of novel global features based on soft constraints over the entire output graph structure with low cost (Section 4).
Related Work
As a key difference, our approach jointly extracts entity mentions and relations using a single model, in which arbitrary soft constraints can be easily incorporated.
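Purely as an illustration (feature names hypothetical), global features of this kind score properties of the entire candidate output graph, so implausible configurations are penalized through learned weights rather than pruned outright:

    def global_features(entities, relations):
        """entities: list of (span, type); relations: list of
        (head_index, label, tail_index) over those entities."""
        feats = []
        for h, label, t in relations:
            # soft constraint: which entity-type pair this relation connects
            feats.append(f"ARG_TYPES={entities[h][1]}_{label}_{entities[t][1]}")
        # interdependency: count entity pairs linked by multiple arcs
        pair_counts = {}
        for h, _, t in relations:
            pair_counts[(h, t)] = pair_counts.get((h, t), 0) + 1
        feats.append(f"MULTI_ARC={sum(1 for n in pair_counts.values() if n > 1)}")
        return feats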
“soft constraints” is mentioned in 4 sentences in this paper.
Martins, Andre and Smith, Noah and Xing, Eric
Abstract
Our formulation is able to handle nonlocal output features in an efficient manner; not only is it compatible with prior knowledge encoded as hard constraints, but it can also learn soft constraints from data.
Conclusions
These features can act as soft constraints whose penalty values are automatically learned from data; in addition, our model is also compatible with expert knowledge in the form of hard constraints.
Introduction
• Soft constraints may be automatically learned from data.
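Concretely (our notation), a learned soft constraint is just a violation-indicator feature g_k whose penalty c_k is fit alongside the model weights,

\[
\mathrm{score}(y) = w^{\top} f(x, y) - \sum_{k} c_k \, g_k(y), \qquad c_k \ge 0 \ \text{learned from data},
\]

while hard constraints remain constraints of the underlying formulation.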
“soft constraints” is mentioned in 3 sentences in this paper.
Lazaridou, Angeliki and Titov, Ivan and Sporleder, Caroline
Introduction
Without this property we would not be able to encode soft constraints imposed by the discourse relations.
Introduction
These are only soft constraints that have to be taken into consideration along with the information provided by the aspect-sentiment model.
Introduction
They use an integer linear programming framework to enforce agreement between classifiers and soft constraints provided by discourse annotations.
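An integer linear program with soft constraints of the kind described here is typically obtained by adding slack variables with per-constraint penalties (a generic sketch, not the paper's exact program):

\[
\max_{y \in \{0,1\}^n,\; \xi \ge 0} \; s^{\top} y - \rho^{\top} \xi \quad \text{s.t.} \quad A y \le b + \xi
\]

where s collects the classifier scores and each ξ_k lets discourse constraint k be violated at cost ρ_k.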
“soft constraints” is mentioned in 3 sentences in this paper.