Index of papers in Proc. ACL 2008 that mention "co-occurrence"
Li, Zhifei and Yarowsky, David
Unsupervised Translation Induction for Chinese Abbreviations
Related Work
Moreover, the HMM model is computationally expensive and unable to exploit the data co-occurrence phenomena that we…
3.3.1 Data Co-occurrence
In a monolingual corpus, relevant words tend to appear together (i.e., co-occurrence).
The co-occurrence may imply a relationship (e.g., Bill Gates is the founder of Microsoft).
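The excerpts treat co-occurrence informally; as a minimal sketch (not the authors' method; the window size and function name are illustrative), window-based co-occurrence counts over a tokenized monolingual corpus can be gathered like this:

    from collections import Counter

    def cooccurrence_counts(sentences, window=10):
        """Count unordered word pairs appearing within `window` tokens."""
        counts = Counter()
        for tokens in sentences:
            for i, w in enumerate(tokens):
                for v in tokens[i + 1 : i + window]:
                    if w != v:
                        counts[tuple(sorted((w, v)))] += 1
        return counts

    # Toy corpus echoing the Bill Gates / Microsoft example above.
    corpus = [["bill", "gates", "founded", "microsoft"],
              ["microsoft", "was", "founded", "by", "bill", "gates"]]
    print(cooccurrence_counts(corpus).most_common(3))

High counts for pairs such as ("bill", "gates") are what lets co-occurrence hint at a relationship.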
co-occurrence is mentioned in 6 sentences in this paper.
Li, Jianguo and Brew, Chris
Integration of Syntactic and Lexical Information
Co-occurrence (CO): CO features convey mostly lexical information and are generally considered not particularly sensitive to argument structures (Rohde et al., 2004).
Integration of Syntactic and Lexical Information
Adapted co-occurrence (ACO): Conventional CO features generally adopt a stop list to filter out function words.
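The excerpts describe CO features only abstractly, so here is a rough sketch; the stop list, helper name, and the reading of ACO as "keep the function words" are assumptions, not details from the paper:

    from collections import Counter

    STOP_WORDS = {"the", "a", "an", "of", "to", "in"}  # illustrative stop list

    def co_features(contexts, apply_stop_list=True):
        """Count words neighboring a verb across its occurrences."""
        counts = Counter()
        for tokens in contexts:
            for w in tokens:
                if not (apply_stop_list and w in STOP_WORDS):
                    counts[w] += 1
        return counts

    contexts = [["the", "horse", "ran", "in", "the", "field"],
                ["dogs", "ran", "to", "the", "gate"]]
    print(co_features(contexts))                         # conventional CO with stop list
    print(co_features(contexts, apply_stop_list=False))  # one possible ACO-style adaptation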
Results and Discussion
On the other hand, the co-occurrence feature (CO), which is believed to convey only lexical information, outperforms SCF on every n-way classification when n ≥ 10, suggesting that verbs in the same Levin classes tend to share their neighboring words.
Results and Discussion
In fact, even the simple co-occurrence feature (CO) yields a better performance (42.4%) than these Levin-selected SCF sets.
co-occurrence is mentioned in 4 sentences in this paper.
Cao, Guihong and Robertson, Stephen and Nie, Jian-Yun
Regression Model for Alteration Selection
For example, for the query “controlling acid rain”, the coherence of the alteration “acidic” is measured by the logarithm of its co-occurrence with the other query terms within a predefined window (90 words) in the corpus.
Regression Model for Alteration Selection
where P(controlling...acidic...rain|window) is the co-occurrence probability of the trigram containing acidic within a predefined window (50 words).
Regression Model for Alteration Selection
On the other hand, the second feature helps because it can capture some co-occurrence information no matter how long the query is.
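Both features boil down to counting windows in which an alteration and the remaining query terms co-occur. A hedged sketch follows; non-overlapping chunks stand in for the paper's predefined windows, and the add-one smoothing is an assumption:

    import math

    def coherence(alteration, other_terms, tokens, window=90):
        """Log co-occurrence of `alteration` with the other query terms."""
        hits = 0
        for i in range(0, len(tokens), window):
            chunk = set(tokens[i : i + window])
            if alteration in chunk and all(t in chunk for t in other_terms):
                hits += 1
        return math.log(hits + 1)  # add-one smoothing avoids log(0)

    tokens = "controlling acidic rain is hard but controlling acid rain matters".split()
    print(coherence("acidic", ["controlling", "rain"], tokens))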
co-occurrence is mentioned in 3 sentences in this paper.
Mitchell, Jeff and Lapata, Mirella
Composition Models
We formulate semantic composition as a function of two vectors, u and v. We assume that individual words are represented by vectors acquired from a corpus following any of the parametrisations that have been suggested in the literature. We briefly note here that a word’s vector typically represents its co-occurrence with neighboring words.
Composition Models
The construction of the semantic space depends on the definition of linguistic context (e.g., neighbouring words can be documents or collocations), the number of components used (e.g., the k most frequent words in a corpus), and their values (e.g., as raw co-occurrence frequencies or ratios of probabilities).
Composition Models
Here, the space has only five dimensions, and the matrix cells denote the co-occurrence of the target words (horse and run) with the context words animal, stable, and so on.
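To make the setup concrete, here is a small sketch of composing two co-occurrence vectors u and v. Additive and component-wise multiplicative combinations are the simplest instantiations; the counts, and every dimension label beyond "animal" and "stable", are invented for illustration:

    import numpy as np

    # Five context dimensions; "animal" and "stable" come from the excerpt,
    # the rest are invented.
    context = ["animal", "stable", "village", "gallop", "farm"]
    horse = np.array([1.0, 8.0, 2.0, 10.0, 5.0])  # invented co-occurrence counts
    run = np.array([4.0, 0.0, 1.0, 9.0, 2.0])     # invented co-occurrence counts

    additive = horse + run        # p = u + v
    multiplicative = horse * run  # p_i = u_i * v_i (component-wise)
    print(additive, multiplicative)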
co-occurrence is mentioned in 3 sentences in this paper.
Titov, Ivan and McDonald, Ryan
Experiments
The first factor expresses a preference for topics likely from the co-occurrence information, whereas the second one favors the choice of topics which are predictive of the observable sentiment ratings.
Introduction
First, ratable aspects normally represent coherent topics which can be potentially discovered from co-occurrence information in the text.
The Model
Importantly, the fact that windows overlap permits the model to exploit a larger co-occurrence domain.
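A toy sketch of the overlapping-window idea (the window length is an assumption): with stride 1, each sentence falls into several windows, so topic assignments can draw on co-occurrence statistics beyond a single sentence:

    def overlapping_windows(sentences, length=3):
        """Yield every sliding window of `length` consecutive sentences."""
        for start in range(max(1, len(sentences) - length + 1)):
            yield sentences[start : start + length]

    doc = ["s0", "s1", "s2", "s3", "s4"]
    for w in overlapping_windows(doc):
        print(w)  # ['s0','s1','s2'], then ['s1','s2','s3'], then ['s2','s3','s4']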
co-occurrence is mentioned in 3 sentences in this paper.