Index of papers in Proc. ACL 2014 that mention
  • word-level
Kang, Jun Seok and Feng, Song and Akoglu, Leman and Choi, Yejin
Evaluation II: Human Evaluation on ConnotationWordNet
We collect two separate sets of labels: a set of labels at the word-level, and another set at the sense-level.
Evaluation II: Human Evaluation on ConnotationWordNet
For word-level labels we apply a procedure similar to the one above.
Evaluation II: Human Evaluation on ConnotationWordNet
Lexicon               Word-level   Sense-level
SentiWordNet          27.22        14.29
OpinionFinder         31.95        -
Feng2013              62.72        -
GWORD+SENSE(95%)      84.91        83.43
GWORD+SENSE(99%)      84.91        83.71
E-GWORD+SENSE(95%)    86.98        86.29
E-GWORD+SENSE(99%)    86.69        85.71
Introduction
For non-polysemous words, which constitute a significant portion of English vocabulary, learning the general connotation at the word-level (rather than at the sense-level) would be a natural operational choice.
Introduction
As a result, researchers would often need to aggregate labels across different senses to derive the word-level label.
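To make this aggregation step concrete, here is a minimal Python sketch of one plausible rule (majority vote over sense labels, falling back to neutral on ties); the voting rule and the function name are illustrative assumptions, not the paper's method:

```python
from collections import Counter

def aggregate_word_label(sense_labels):
    """Derive a word-level connotation label by majority vote over
    sense-level labels, breaking ties as 'neutral'.

    `sense_labels` is a list like ['positive', 'negative', 'neutral'].
    The voting rule is an illustrative assumption, not the paper's method.
    """
    if not sense_labels:
        return "neutral"
    counts = Counter(sense_labels)
    label, top = counts.most_common(1)[0]
    # If another label ties for the top count, fall back to neutral.
    if sum(1 for c in counts.values() if c == top) > 1:
        return "neutral"
    return label

print(aggregate_word_label(["positive", "positive", "neutral"]))  # positive
```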
Introduction
Therefore, in this work, we present the first unified approach that learns both sense- and word-level connotations simultaneously.
Pairwise Markov Random Fields and Loopy Belief Propagation
We formulate the task of learning a sense- and word-level connotation lexicon as a graph-based classification task (Sen et al., 2008).
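As a rough illustration of this formulation, the sketch below runs sum-product loopy belief propagation over a tiny pairwise MRF with binary connotation states; the shared edge potential and toy unary potentials are simplifying assumptions, not the paper's exact model:

```python
import numpy as np

def loopy_bp(nodes, edges, node_pot, edge_pot, n_iters=20):
    """Sum-product loopy belief propagation on a pairwise MRF with
    binary states (e.g., positive/negative connotation).

    nodes:    list of node ids
    edges:    list of (u, v) pairs
    node_pot: dict node -> length-2 array of unary potentials
    edge_pot: 2x2 array of pairwise potentials, shared across edges
              here (a simplifying assumption).
    Returns approximate marginals per node.
    """
    # Messages m[(u, v)] from u to v, initialized uniformly.
    msgs = {}
    for u, v in edges:
        msgs[(u, v)] = np.ones(2) / 2
        msgs[(v, u)] = np.ones(2) / 2
    nbrs = {n: [] for n in nodes}
    for u, v in edges:
        nbrs[u].append(v)
        nbrs[v].append(u)

    for _ in range(n_iters):
        new_msgs = {}
        for (u, v) in msgs:
            # Unary potential times all incoming messages except v's.
            belief = node_pot[u].copy()
            for w in nbrs[u]:
                if w != v:
                    belief *= msgs[(w, u)]
            m = edge_pot.T @ belief          # marginalize over u's states
            new_msgs[(u, v)] = m / m.sum()   # normalize for stability
        msgs = new_msgs

    marginals = {}
    for n in nodes:
        b = node_pot[n].copy()
        for w in nbrs[n]:
            b *= msgs[(w, n)]
        marginals[n] = b / b.sum()
    return marginals

# Two words linked by a "same connotation" edge: the neutral word is
# pulled toward its positive neighbor.
pots = {"enjoy": np.array([0.9, 0.1]), "weekend": np.array([0.5, 0.5])}
compat = np.array([[0.8, 0.2], [0.2, 0.8]])  # favors agreeing labels
print(loopy_bp(["enjoy", "weekend"], [("enjoy", "weekend")], pots, compat))
```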
word-level is mentioned in 15 sentences in this paper.
Topics mentioned in this paper:
Shen, Mo and Liu, Hongxiao and Kawahara, Daisuke and Kurohashi, Sadao
Abstract
We propose a method that performs character-level POS tagging jointly with word segmentation and word-level POS tagging.
Character-level POS Tagset
Some of these tags are directly derived from commonly accepted word-level parts of speech, such as noun, verb, adjective, and adverb.
Chinese Morphological Analysis with Character-level POS
This hybrid model constructs a lattice that consists of word-level and character-level nodes from a given input sentence.
Chinese Morphological Analysis with Character-level POS
Word-level nodes correspond to words found in the system’s lexicon, which has been compiled from training data.
Chinese Morphological Analysis with Character-level POS
The upper part of the lattice (word-level nodes) represents known words, where each node carries information such as character form, character-level POS, and word-level POS.
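A minimal Python sketch of such a lattice, covering only surface forms (POS tags and scoring are omitted); the node layout is illustrative, not the paper's data structure:

```python
def build_lattice(sentence, lexicon):
    """Build a simple segmentation lattice: word-level nodes for known
    lexicon entries plus one character-level node per character, so
    unknown words can be assembled character by character.

    Node = (start, end, surface, level). Single-character lexicon
    words are covered by the character-level nodes in this sketch.
    """
    nodes = []
    n = len(sentence)
    for i in range(n):
        # Character-level node (always present).
        nodes.append((i, i + 1, sentence[i], "char"))
        # Word-level nodes for lexicon entries of length >= 2.
        for j in range(i + 2, n + 1):
            if sentence[i:j] in lexicon:
                nodes.append((i, j, sentence[i:j], "word"))
    return nodes

lexicon = {"北京", "大学", "北京大学"}
for node in build_lattice("北京大学", lexicon):
    print(node)
```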
Introduction
Table 1. Character-level POS sequence as a more specified version of word-level POS: an example of verb.
Introduction
Another advantage of character-level POS is that the sequence of character-level POS in a word can be seen as a more fine-grained version of word-level POS.
Introduction
The five words in this table are very likely to be tagged with the same word-level POS (verb) in any available annotated corpus, yet native speakers of Chinese would commonly agree that the syntactic behaviors of these words differ from each other, owing to their distinct word constructions.
word-level is mentioned in 17 sentences in this paper.
Topics mentioned in this paper:
Zhang, Meishan and Zhang, Yue and Che, Wanxiang and Liu, Ting
Abstract
Character-level information can benefit downstream applications by offering flexible granularities for word segmentation while improving word-level dependency parsing accuracies.
Character-Level Dependency Tree
Inner-word dependencies can also bring benefits to parsing word-level dependencies.
Character-Level Dependency Tree
When the internal structures of words are annotated, character-level dependency parsing can be treated as a special case of word-level dependency parsing, with “words” being “characters”.
Character-Level Dependency Tree
The word-level dependency parsing features are added when the inter-word actions are applied, and the features for joint word segmentation and POS-tagging are added when the actions PW, SHW and SHC are applied.
Introduction
Moreover, manually annotated intra-word dependencies can yield better word-level dependency accuracies than pseudo intra-word dependencies.
word-level is mentioned in 16 sentences in this paper.
Topics mentioned in this paper:
Farra, Noura and Tomeh, Nadi and Rozovskaya, Alla and Habash, Nizar
Abstract
While operating at the character level, the model makes use of word-level and contextual information.
Conclusions
In the future, we plan to extend the model to use word-level language models to select between top character predictions in the output.
Experiments
The word error rate (WER) metric is computed by summing the total number of word-level substitution errors, insertion errors, and deletion errors in the output, and dividing by the number of words in the reference.
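A small Python implementation of this computation, using a word-level Levenshtein distance with unit costs (a standard construction, shown here for clarity):

```python
def word_error_rate(hyp, ref):
    """Word error rate: (substitutions + insertions + deletions) / |ref|,
    computed as a word-level edit distance with unit costs.
    """
    h, r = hyp.split(), ref.split()
    # d[i][j] = edit distance between the first i hypothesis words
    # and the first j reference words.
    d = [[0] * (len(r) + 1) for _ in range(len(h) + 1)]
    for i in range(1, len(h) + 1):
        d[i][0] = i
    for j in range(1, len(r) + 1):
        d[0][j] = j
    for i in range(1, len(h) + 1):
        for j in range(1, len(r) + 1):
            sub = d[i - 1][j - 1] + (h[i - 1] != r[j - 1])
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
    return d[len(h)][len(r)] / len(r)

# Two substitutions (he->she, the->an) over 4 reference words.
print(word_error_rate("he ate the apple", "she ate an apple"))  # 0.5
```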
Related Work
Discriminative models have been proposed at the word-level for error correction (Duan et al., 2012) and for error detection (Habash and Roth, 2011).
The GSEC Approach
We implemented another approach for error correction based on a word-level maximum likelihood model.
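A minimal sketch of such a word-level maximum likelihood corrector, trained on hypothetical (erroneous, corrected) word pairs; this illustrates the idea only, not the paper's exact baseline:

```python
from collections import Counter, defaultdict

class MLECorrector:
    """Word-level maximum likelihood error correction: for each observed
    (possibly erroneous) word, output the correction seen most often for
    it in parallel erroneous/corrected training text. Unseen words pass
    through unchanged.
    """
    def __init__(self):
        self.counts = defaultdict(Counter)

    def train(self, pairs):
        # pairs: iterable of (erroneous_word, corrected_word).
        for err, cor in pairs:
            self.counts[err][cor] += 1

    def correct(self, word):
        if word in self.counts:
            return self.counts[word].most_common(1)[0][0]
        return word

m = MLECorrector()
m.train([("teh", "the"), ("teh", "the"), ("teh", "ten"), ("cat", "cat")])
print(m.correct("teh"))  # the
```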
word-level is mentioned in 5 sentences in this paper.
Topics mentioned in this paper:
Hu, Yuening and Zhai, Ke and Eidelman, Vladimir and Boyd-Graber, Jordan
Polylingual Tree-based Topic Models
In this section, we bring existing tree-based topic models (Boyd-Graber et al., 2007, tLDA) and polylingual topic models (Mimno et al., 2009, pLDA) together and create the polylingual tree-based topic model (ptLDA) that incorporates both word-level correlations and document-level alignment information.
Polylingual Tree-based Topic Models
Word-level Correlations Tree-based topic models incorporate the correlations between words by
Polylingual Tree-based Topic Models
Build Prior Tree Structures One remaining question is the source of the word-level connections across languages for the tree prior.
word-level is mentioned in 3 sentences in this paper.
Topics mentioned in this paper:
Lei, Tao and Xin, Yu and Zhang, Yuan and Barzilay, Regina and Jaakkola, Tommi
Related Work
Nevertheless, any such word-level representation can be used to offset inherent sparsity problems associated with full lexicalization (Cirik and Sensoy, 2013).
Related Work
Word-level vector space embeddings have so far had limited impact on parsing performance.
Related Work
While this method learns to map word combinations into vectors, it builds on existing word-level vector representations.
word-level is mentioned in 3 sentences in this paper.
Topics mentioned in this paper:
Salameh, Mohammad and Cherry, Colin and Kondrak, Grzegorz
Methods
In this section, we discuss how a lattice from a multi-stack phrase-based decoder such as Moses (Koehn et al., 2007) can be desegmented to enable word-level features.
Methods
We now have a desegmented lattice, but it has not been annotated with an unsegmented (word-level) language model.
Methods
Indeed, the expanded word-level context is one of the main benefits of incorporating a word-level LM.
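To illustrate how a word-level LM can score paths through a desegmented lattice, here is a toy Python sketch; the lattice encoding and the stub bigram LM are assumptions for illustration, not the Moses API, and real decoders add pruning and translation-model scores:

```python
import math

def best_path(lattice, bigram_logprob, init="0", start="<s>", end="</s>"):
    """Pick the highest-scoring word sequence through a desegmented
    lattice using a word-level bigram LM. The lattice is a dict:
    state -> list of (word, next_state).
    """
    memo = {}  # memoize suffix scores over (state, previous word)

    def search(state, prev):
        if state == end:
            return 0.0, []
        if (state, prev) in memo:
            return memo[(state, prev)]
        best = (-math.inf, [])
        for word, nxt in lattice.get(state, []):
            lp = bigram_logprob(prev, word)
            score, rest = search(nxt, word)
            if lp + score > best[0]:
                best = (lp + score, [word] + rest)
        memo[(state, prev)] = best
        return best

    return search(init, start)

# Toy desegmented lattice with two competing word-level readings.
lattice = {"0": [("the", "1"), ("them", "2")],
           "1": [("books", "</s>")],
           "2": [("ooks", "</s>")]}

def lm(prev, word):  # stub word-level bigram LM (assumed for this sketch)
    table = {("<s>", "the"): -0.5, ("the", "books"): -0.7,
             ("<s>", "them"): -1.0, ("them", "ooks"): -5.0}
    return table.get((prev, word), -10.0)

print(best_path(lattice, lm))  # (-1.2, ['the', 'books'])
```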
word-level is mentioned in 3 sentences in this paper.
Topics mentioned in this paper: