Index of papers in Proc. ACL 2011 that mention
  • feature space
Kobdani, Hamidreza and Schuetze, Hinrich and Schiehlen, Michael and Kamp, Hans
Conclusion
In addition, we showed that our system is a flexible and modular framework that is able to learn from data of differing quality (perfect vs. noisy markable detection) and domain, and is able to deliver good results for shallow information spaces and competitive results for rich feature spaces.
Introduction
Typical systems use a rich feature space based on lexical, syntactic and semantic knowledge.
Introduction
We view association information as an example of a shallow feature space which contrasts with the rich feature space that is generally used in CoRe.
Introduction
The feature spaces are the shallow and rich feature spaces.
Related Work
These researchers show that a “deterministic” system (essentially a rule-based system) that uses a rich feature space including lexical, syntactic and semantic features can improve CoRe performance.
Results and Discussion
To summarize, the advantages of our self-training approach are: (i) We cover cases that do not occur in the unlabeled corpus (better recall effect); and (ii) we use the leveraging effect of a rich feature space including distance, person, number, gender etc.
feature space is mentioned in 8 sentences in this paper.
Topics mentioned in this paper:
Mayfield, Elijah and Penstein Rosé, Carolyn
Background
We build a contextual feature space, described in section 4.2, to enhance our baseline bag-of-words model.
Background
4.2 Contextual Feature Space Additions
Background
0 Baseline: This model uses a bag-of-words feature space as input to an SVM classifier.
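A bag-of-words feature space of the kind this baseline uses can be sketched in a few lines (a minimal illustration with hypothetical documents, not the authors' implementation; in the paper the resulting vectors are fed to an SVM classifier):

```python
from collections import Counter

def bag_of_words(documents):
    """Map each document to a term-count vector over a shared vocabulary."""
    vocab = sorted({tok for doc in documents for tok in doc.lower().split()})
    vectors = []
    for doc in documents:
        counts = Counter(doc.lower().split())
        vectors.append([counts.get(tok, 0) for tok in vocab])
    return vocab, vectors

vocab, X = bag_of_words(["the cat sat", "the dog sat down"])
# each row of X is one document's feature vector over `vocab`
```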
feature space is mentioned in 6 sentences in this paper.
Topics mentioned in this paper:
He, Yulan and Lin, Chenghua and Alani, Harith
Abstract
We study the polarity-bearing topics extracted by JST and show that by augmenting the original feature space with polarity-bearing topics, the in-domain supervised classifiers learned from augmented feature representation achieve the state-of-the-art performance of 95% on the movie review data and an average of 90% on the multi-domain sentiment dataset.
Introduction
We study the polarity-bearing topics extracted by the JST model and show that by augmenting the original feature space with polarity-bearing topics, the performance of in-domain supervised classifiers learned from augmented feature representation improves substantially, reaching the state-of-the-art results of 95% on the movie review data and an average of 90% on the multi-domain sentiment dataset.
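The augmentation described here reduces to concatenating each document's original feature vector with its polarity-bearing topic features. A minimal sketch, where the topic weights are hypothetical placeholders standing in for what the JST model would produce:

```python
def augment_features(bow_vector, topic_vector):
    """Concatenate original bag-of-words features with polarity-bearing
    topic features to form the augmented representation."""
    return list(bow_vector) + list(topic_vector)

# hypothetical values: five word counts plus three topic weights
augmented = augment_features([1, 0, 2, 0, 1], [0.7, 0.2, 0.1])
```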
Joint Sentiment-Topic (JST) Model
In this paper, we have studied polarity-bearing topics generated from the JST model and shown that by augmenting the original feature space with polarity-bearing topics, the in-domain supervised classifiers learned from augmented feature representation achieve the state-of-the-art performance on both the movie review data and the multi-domain sentiment dataset.
Joint Sentiment-Topic (JST) Model
First, polarity-bearing topics generated by the JST model were simply added into the original feature space of documents; it is worth investigating attaching different weights to each topic.
Related Work
proposed a kernel-mapping function which maps both source- and target-domain data to a high-dimensional feature space so that data points from the same domain are twice as similar as those from different domains.
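One standard construction with exactly this property (a sketch of the general idea under that description, not necessarily the cited authors' kernel) copies each feature vector into a shared block plus a domain-specific block; the dot product between same-domain points then comes out twice as large as between cross-domain points:

```python
def map_source(x):
    # shared copy + source-specific copy + zero target block
    return x + x + [0.0] * len(x)

def map_target(x):
    # shared copy + zero source block + target-specific copy
    return x + [0.0] * len(x) + x

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

x, y = [1.0, 2.0], [3.0, 1.0]
same = dot(map_source(x), map_source(y))   # 2 * <x, y>
cross = dot(map_source(x), map_target(y))  # 1 * <x, y>
```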
feature space is mentioned in 5 sentences in this paper.
Topics mentioned in this paper:
Bramsen, Philip and Escobar-Molano, Martha and Patel, Ami and Alonso, Rafael
Abstract
Because considering such features would increase the size of the feature space, we suspected that including these features would also benefit from algorithmic means of selecting n-grams that are indicative of particular lects, and even from binning these relevant n-grams into sets to be used as features.
Abstract
Although this approach to partitioning is simple and worthy of improvement, it effectively reduced the dimensionality of the feature space.
Abstract
Therefore, as we explored the feature space, small bins of different n-gram lengths were merged.
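The binning idea described in these excerpts can be sketched as grouping n-grams by frequency and merging any bin that is too small to be a reliable feature (the thresholds and grouping criterion below are assumptions for illustration, not the authors' settings):

```python
from collections import defaultdict

def bin_ngrams(ngram_counts, num_bins=4, min_bin_size=2):
    """Group n-grams into frequency bins, then merge any bin with fewer
    than min_bin_size members into the next bin up."""
    bins = defaultdict(list)
    for ngram, count in ngram_counts.items():
        bins[min(count, num_bins) - 1].append(ngram)
    merged, carry = [], []
    for i in range(num_bins):
        carry.extend(bins.get(i, []))
        if len(carry) >= min_bin_size:
            merged.append(sorted(carry))
            carry = []
    if carry and merged:      # fold any leftover n-grams into the last bin
        merged[-1].extend(sorted(carry))
    elif carry:
        merged.append(sorted(carry))
    return merged
```

Each resulting bin can then serve as a single set-valued feature, reducing dimensionality as described above.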
feature space is mentioned in 3 sentences in this paper.
Topics mentioned in this paper: