Index of papers in Proc. ACL 2014 that mention
  • graph model
Jia, Zhongye and Zhao, Hai
Abstract
In this paper, motivated by a key equivalence of two decoding algorithms, we propose a joint graph model to globally optimize pinyin-to-Chinese (PTC) conversion and typo correction for input method engines (IMEs).
Conclusion
In this paper, we have developed a joint graph model for pinyin-to-Chinese conversion with typo correction.
Pinyin Input Method Model
Inspired by Yang et al. (2012b) and Jia et al. (2013), we adopt the graph model for Chinese spell checking to perform pinyin segmentation and typo correction; it is based on the shortest-path word segmentation algorithm (Casey and Lecolinet, 1996).
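To make the shortest-path formulation concrete, here is a minimal sketch: positions in the pinyin string are graph nodes, legal syllables are edges, and the segmentation covering the string with the fewest syllables is the shortest path. The toy syllable inventory and uniform edge weights are illustrative assumptions, not the paper's actual setup.

```python
# Minimal shortest-path pinyin segmentation sketch.
# SYLLABLES is an invented toy inventory, for illustration only.
SYLLABLES = {"pin", "yin", "shu", "ru", "fa", "yi", "in", "pi"}

def segment(pinyin: str) -> list[str]:
    """Segment `pinyin` via the shortest path in a DAG whose nodes are
    string positions and whose edges are legal syllables (unit weight)."""
    n = len(pinyin)
    # dist[i] = fewest syllables covering pinyin[:i]; back[i] = predecessor
    dist = [0] + [float("inf")] * n
    back = [0] * (n + 1)
    for i in range(n):
        if dist[i] == float("inf"):
            continue
        for j in range(i + 1, n + 1):
            if pinyin[i:j] in SYLLABLES and dist[i] + 1 < dist[j]:
                dist[j], back[j] = dist[i] + 1, i
    if dist[n] == float("inf"):
        raise ValueError("no segmentation found")
    # Recover the path by walking the back-pointers from the final node.
    out, j = [], n
    while j > 0:
        out.append(pinyin[back[j]:j])
        j = back[j]
    return out[::-1]

print(segment("pinyinshurufa"))  # -> ['pin', 'yin', 'shu', 'ru', 'fa']
```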
Pinyin Input Method Model
Figure 2: Graph model for pinyin segmentation
Pinyin Input Method Model
Figure 3: Graph model for pinyin typo correction
Related Works
Various approaches have been proposed for the task, including language model (LM) based methods (Chen et al., 2013), maximum entropy (ME) models (Han and Chang, 2013), CRFs (Wang et al., 2013d; Wang et al., 2013a), SMT (Chiu et al., 2013; Liu et al., 2013), and graph models (Jia et al., 2013).
graph model is mentioned in 11 sentences in this paper.
Zou, Bowei and Zhou, Guodong and Zhu, Qiaoming
Abstract
In this paper, we propose a graph model to enrich intra-sentence features with inter-sentence features from both lexical and topic perspectives.
Abstract
Evaluation on the *SEM 2012 shared task corpus indicates the usefulness of contextual discourse information in negation focus identification and justifies the effectiveness of our graph model in capturing such global information.
Baselines
In this paper, we first propose a graph model to gauge the importance of contextual discourse information.
Baselines
4.1 Graph Model
Baselines
Graph models have proven successful in many NLP applications, especially in representing the link relationships between words or sentences (Wan and Yang, 2008; Li et al., 2009).
Introduction
In this paper, to better accommodate such contextual discourse information in negation focus identification, we propose a graph model to enrich normal intra-sentence features with various kinds of inter-sentence features from both lexical and topic perspectives.
Introduction
In addition, the standard PageRank algorithm is employed to optimize the graph model.
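For readers unfamiliar with it, standard PageRank by power iteration over an adjacency list looks like the sketch below; the damping factor and convergence tolerance are conventional defaults, and the toy graph is invented for illustration, not taken from the paper.

```python
# Power-iteration PageRank sketch; d and tol are conventional defaults.
def pagerank(graph: dict[str, list[str]], d: float = 0.85,
             tol: float = 1e-8, max_iter: int = 100) -> dict[str, float]:
    nodes = list(graph)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(max_iter):
        new = {n: (1.0 - d) / len(nodes) for n in nodes}
        for n, outs in graph.items():
            if not outs:                      # dangling node: spread evenly
                for m in nodes:
                    new[m] += d * rank[n] / len(nodes)
            else:
                for m in outs:
                    new[m] += d * rank[n] / len(outs)
        if sum(abs(new[n] - rank[n]) for n in nodes) < tol:
            return new
        rank = new
    return rank

# Invented toy graph, for illustration only.
print(pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]}))
```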
Introduction
Section 4 introduces our topic-driven word-based graph model with contextual discourse information.
graph model is mentioned in 35 sentences in this paper.
Beltagy, Islam and Erk, Katrin and Mooney, Raymond
Background
Markov Logic Networks (MLNs) (Richardson and Domingos, 2006) are a framework for probabilistic logic that employs weighted formulas in first-order logic to compactly encode complex undirected probabilistic graphical models (i.e., Markov networks).
Background
PSL (Probabilistic Soft Logic) uses logical representations to compactly define large graphical models with continuous variables, and includes methods for performing efficient probabilistic inference for the resulting models.
Background
Given a set of weighted logical formulas, PSL builds a graphical model defining a probability distribution over the continuous space of values of the random variables in the model.
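As a rough sketch of the building blocks of that distribution (assuming the standard PSL construction from the literature, since the excerpt does not spell it out): soft truth values live in [0, 1], connectives are relaxed with the Lukasiewicz t-norm, and each weighted ground rule contributes a hinge-loss "distance to satisfaction" to the density.

```python
# Sketch of standard PSL building blocks (Lukasiewicz relaxation),
# assumed from the PSL literature rather than taken from this paper.
import math

def l_and(a: float, b: float) -> float:
    return max(0.0, a + b - 1.0)      # soft conjunction

def l_or(a: float, b: float) -> float:
    return min(1.0, a + b)            # soft disjunction

def dist_to_satisfaction(body: float, head: float) -> float:
    # A ground rule body -> head is fully satisfied when head >= body.
    return max(0.0, body - head)

def unnormalized_density(rules: list[tuple[float, float, float]]) -> float:
    # rules: (weight, body truth, head truth) for each ground rule
    return math.exp(-sum(w * dist_to_satisfaction(b, h) for w, b, h in rules))

# e.g. one rule with weight 2.0 whose body is truer than its head:
print(unnormalized_density([(2.0, 0.9, 0.4)]))  # exp(-2.0 * 0.5)
```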
PSL for STS
Grounding is the process of instantiating the variables in the quantified rules with concrete constants in order to construct the nodes and links in the final graphical model.
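Concretely, grounding can be pictured as the substitution below: every variable in a rule template is replaced by each constant, and the resulting ground rules become the links of the model. The rule, predicates, and constants are invented for illustration; real PSL grounding only instantiates combinations supported by the data.

```python
from itertools import product

# Hypothetical rule template; variables in braces are instantiated with
# every combination of constants (invented names, for illustration only).
rule = "Friends({X},{Y}) & Votes({X},{P}) -> Votes({Y},{P})"
people, parties = ["anna", "bob"], ["dem", "rep"]

ground_rules = [
    rule.format(X=x, Y=y, P=p)
    for x, y in product(people, repeat=2) if x != y
    for p in parties
]
for g in ground_rules:
    print(g)  # 2 ordered pairs x 2 parties = 4 ground rules
```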
graph model is mentioned in 4 sentences in this paper.
Parikh, Ankur P. and Cohen, Shay B. and Xing, Eric P.
Abstract
We associate each sentence with an undirected latent tree graphical model, which is a tree consisting of both observed variables (corresponding to the words in the sentence) and an additional set of latent variables that are unobserved in the data.
Abstract
Unlike in phylogenetics and graphical models, where a single latent tree is constructed for all the data, in our case, each part-of-speech sequence is associated with its own parse tree.
Abstract
Following this intuition, we propose to model the distribution over the latent bracketing states and words for each tag sequence x as a latent tree graphical model, which encodes conditional independences among the words given the latent states.
graph model is mentioned in 4 sentences in this paper.
Anzaroot, Sam and Passos, Alexandre and Belanger, David and McCallum, Andrew
Background
Here, we define a binary indicator variable for each candidate setting of each factor in the graphical model.
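As a sketch of what this enumeration looks like for pairwise factors in a chain model (the label set and chain length below are invented for illustration):

```python
from itertools import product

# Invented citation-field labels and chain length, for illustration only.
labels = ["author", "title", "year"]
n_tokens = 4

# One binary indicator z[i, a, b] per candidate setting (a, b) of the
# pairwise factor linking positions i and i+1.
indicators = {
    (i, a, b): 0
    for i in range(n_tokens - 1)
    for a, b in product(labels, repeat=2)
}
print(len(indicators))  # 3 factors x 9 settings = 27 indicators
```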
Citation Extraction Data
There are multiple previous examples of augmenting chain-structured sequence models with terms capturing global relationships by expanding the chain to a more complex graphical model with nonlocal dependencies between the outputs.
Citation Extraction Data
Soft constraints can be implemented inefficiently using hard constraints and dual decomposition, by introducing copies of output variables and an auxiliary graphical model, as in Rush et al.
graph model is mentioned in 3 sentences in this paper.
Hovy, Dirk
Conclusion
Type candidates are collected from patterns and modeled as hidden variables in graphical models.
Extending the Model
We can thus move from a sequential model to a general graphical model by adding transitions and rearranging the structure.
Results
Moving from the HMMs to a general graphical model structure (Figures 3c and d) creates a sparser distribution and significantly improves accuracy across the board.
graph model is mentioned in 3 sentences in this paper.