Index of papers in Proc. ACL that mention
  • structural features
Sun, Jun and Zhang, Min and Tan, Chew Lim
Abstract
Our study reveals that the structural features embedded in a bilingual parse tree pair are very effective for subtree alignment, and that bilingual tree kernels can capture such features well.
Introduction
However, most of the syntax-based systems construct the syntactic translation rules based on word alignment, which not only suffers from pipeline errors but also fails to effectively utilize the syntactic structural features.
Introduction
These works fail to utilize the structural features, rendering the syntactically rich task of subtree alignment less convincing and attractive.
Introduction
Along with BTKs, various lexical and syntactic structural features are proposed to capture the correspondence between bilingual sub-trees using a polynomial kernel.
Substructure Spaces for BTKs
Besides BTKs, we introduce various plain lexical features and structural features which can be expressed as feature functions.
Substructure Spaces for BTKs
The plain syntactic structural features can deal with the structural divergence of bilingual parse trees from a more general perspective.
Substructure Spaces for BTKs
4.2 Online Structural Features
structural features is mentioned in 14 sentences in this paper.
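The Sun, Zhang and Tan entry above is about bilingual tree kernels (BTKs) that score the structural similarity of source and target subtrees. As a hedged illustration of the underlying idea only, the sketch below computes a classic Collins-and-Duffy-style convolution tree kernel over toy monolingual trees; the tuple tree encoding and the decay value are assumptions for this example, not the paper's BTK.

```python
# Hedged sketch of a convolution (subset-tree) kernel over toy parse trees.
# Nodes are matched by their full production (label + child labels).
# This is NOT the paper's bilingual tree kernel, only an illustration of
# counting shared subtree fragments as structural features.

def label(node):
    """A node is (label, child, ...); a leaf is a bare word string."""
    return node[0] if isinstance(node, tuple) else node

def children(node):
    return node[1:] if isinstance(node, tuple) else ()

def production(node):
    """Root label plus the sequence of immediate child labels."""
    return (label(node), tuple(label(c) for c in children(node)))

def delta(n1, n2, decay=0.5):
    """Weighted count of common subtree fragments rooted at n1 and n2."""
    if isinstance(n1, str) or isinstance(n2, str):
        return 0.0                        # bare words carry no fragments
    if production(n1) != production(n2):
        return 0.0
    result = decay
    for a, b in zip(children(n1), children(n2)):
        if not isinstance(a, str):        # recurse only into internal nodes
            result *= 1.0 + delta(a, b, decay)
    return result

def tree_kernel(t1, t2, decay=0.5):
    """Similarity of two trees: sum delta over all pairs of nodes."""
    def nodes(t):
        yield t
        for c in children(t):
            yield from nodes(c)
    return sum(delta(a, b, decay) for a in nodes(t1) for b in nodes(t2))

# Toy source/target trees sharing an NP subtree.
t_src = ("S", ("NP", ("DT", "the"), ("NN", "cat")), ("VP", ("VB", "sleeps")))
t_tgt = ("S", ("NP", ("DT", "the"), ("NN", "cat")), ("VP", ("VB", "runs")))
print(tree_kernel(t_src, t_tgt))
```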
Pasupat, Panupong and Liang, Percy
Approach
The final feature vector is the concatenation of structural features φ_s(w, z), which consider the selected nodes in the DOM tree, and denotation features φ_d(x, y), which look at the extracted entities.
Approach
One main focus of our work is finding good feature representations for a list of objects (DOM tree nodes for structural features and entity strings for denotation features).
Approach
[Table header] Structural feature | Value. Features on selected nodes:
Experiments
Setting | Acc | A@5
All features | 41.1 ± 3.4 | 58.4 ± 2.7
Oracle | 68.7 ± 2.4 | 68.7 ± 2.4
(Section 4.5) Structural features only | 36.2 ± 1.9 | 54.5 ± 2.5
Denotation features only | 19.8 ± 2.5 | 41.7 ± 2.7
(Section 4.6) Structural + query-denotation | 41.7 ± 2.5 | 58.1 ± 2.4
Query-denotation features only | 25.0 ± 2.3 | 48.0 ± 2.7
Concat.
Experiments
We observe that denotation features improve accuracy on top of structural features.
Experiments
On the other hand, structural features prevent the system from selecting random entities outside the main part of the page.
Introduction
To generalize across different inputs, we rely on two types of features: structural features, which look at the layout and placement of the entities being extracted; and denotation features, which look at the extracted entities themselves.
structural features is mentioned in 13 sentences in this paper.
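The Pasupat and Liang entry describes a final feature vector built by concatenating structural features φ_s(w, z), computed from the selected DOM-tree nodes, with denotation features φ_d(x, y), computed from the extracted entity strings. A minimal sketch of that concatenation pattern follows; the individual feature names and the dict-based sparse-vector encoding are invented for illustration and are not the paper's feature set.

```python
# Hedged sketch: concatenating two feature groups into one sparse vector.
# The feature names below are invented placeholders; the paper's actual
# structural and denotation features are much richer.

def structural_features(selected_nodes):
    """Features over the selected DOM nodes (tag counts, mean depth)."""
    f = {}
    for node in selected_nodes:
        key = "struct:tag=" + node["tag"]
        f[key] = f.get(key, 0) + 1
    if selected_nodes:
        f["struct:mean_depth"] = (sum(n["depth"] for n in selected_nodes)
                                  / len(selected_nodes))
    return f

def denotation_features(entities):
    """Features over the extracted entity strings themselves."""
    return {
        "denot:count": float(len(entities)),
        "denot:all_capitalized": float(all(e[:1].isupper() for e in entities)),
    }

def feature_vector(selected_nodes, entities):
    """Concatenation = union of the two disjointly prefixed feature groups."""
    combined = dict(structural_features(selected_nodes))
    combined.update(denotation_features(entities))
    return combined

nodes = [{"tag": "li", "depth": 4}, {"tag": "li", "depth": 4}]
print(feature_vector(nodes, ["Alice Smith", "Bob Jones"]))
```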
Wang, Zhiguo and Xue, Nianwen
Introduction
Therefore, it runs in linear time and can take advantage of arbitrarily complex structural features from already constructed subtrees.
Introduction
Third, transition-based parsers have the freedom to define arbitrarily complex structural features, but this freedom has not fully been taken advantage of and most of the present approaches only use simple structural features.
Introduction
Third, we take into account two groups of complex structural features that have not been previously used in transition-based parsing: nonlocal features (Charniak and Johnson, 2005) and semi-supervised word cluster features (Koo et al., 2008).
Joint POS Tagging and Parsing with Nonlocal Features
One advantage of transition-based constituent parsing is that it is capable of incorporating arbitrarily complex structural features from the already constructed subtrees on the stack and the unprocessed words in the queue.
Joint POS Tagging and Parsing with Nonlocal Features
However, all the feature templates given in Table 1 are just simple structural features.
Joint POS Tagging and Parsing with Nonlocal Features
To further improve the performance of our transition-based constituent parser, we consider two groups of complex structural features: nonlocal features (Charniak and Johnson, 2005; Collins and Koo, 2005) and semi-supervised word cluster features (Koo et al., 2008).
Related Work
The reason is that the single-stage chart-based parser cannot use nonlocal structural features.
Related Work
In contrast, the transition-based parser can use arbitrarily complex structural features.
structural features is mentioned in 8 sentences in this paper.
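The Wang and Xue entry stresses that a transition-based constituent parser can read arbitrarily complex structural features off the already constructed subtrees on the stack and the unprocessed words in the queue. The sketch below shows that extraction pattern on an invented minimal parser state; the state layout and the feature templates are assumptions, not the paper's templates.

```python
# Hedged sketch: structural feature extraction from a transition-parser state.
# The state layout and templates are illustrative assumptions only.
from dataclasses import dataclass, field

@dataclass
class Subtree:
    label: str                      # constituent or POS label
    head_word: str                  # lexical head of the subtree
    children: list = field(default_factory=list)

@dataclass
class ParserState:
    stack: list                     # already constructed Subtree objects
    queue: list                     # unprocessed word tokens

def extract_features(state: ParserState):
    """Return a list of string-valued features for the current state."""
    feats = []
    s0 = state.stack[-1] if state.stack else None
    s1 = state.stack[-2] if len(state.stack) > 1 else None
    q0 = state.queue[0] if state.queue else None

    if s0:
        feats.append(f"s0.label={s0.label}")
        feats.append(f"s0.head={s0.head_word}")
        # A simple "structural" template: the whole production under s0.
        feats.append("s0.rule=" + s0.label + "->" +
                     "_".join(c.label for c in s0.children))
    if s0 and s1:
        feats.append(f"s1.label|s0.label={s1.label}|{s0.label}")
    if s0 and q0:
        feats.append(f"s0.head|q0.word={s0.head_word}|{q0}")
    return feats

np_tree = Subtree("NP", "cat", [Subtree("DT", "the"), Subtree("NN", "cat")])
state = ParserState(stack=[np_tree], queue=["sleeps", "."])
print(extract_features(state))
```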
P, Deepak and Visweswariah, Karthik
Abstract
Our technique is designed to not rely much on structural features such as post metadata since such features are often not uniformly available across forums.
Conclusions and Future Work
We show that our technique is able to effectively identify solutions using just one non-content-based feature, the post position, whereas previous techniques in the literature have depended heavily on structural features (that are not always available in many forums) and supervised information.
Experimental Evaluation
Thus, our technique is able to exploit any extra solution-identifying structural features that are available.
Introduction
Though such assumptions on structural features, if generic enough, may be built into unsupervised techniques to aid solution identification, the variation in availability of such features across forums limits the usage of models that rely heavily on structural features.
Introduction
In particular, we show that by using post position as the only non-textual feature, we are able to achieve accuracies comparable to supervision-based approaches that use many structural features (Catherine et al., 2013).
Our Approach
Towards this, we make use of a structural feature; in particular, adapting the hypothesis that solutions occur in the first N posts (Ref.
Our Approach
We will show that we are able to effectively perform solution identification using our approach by exploiting just one structural feature, the post position, as above.
structural features is mentioned in 7 sentences in this paper.
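The Deepak P and Visweswariah entry relies on a single non-textual structural feature, the position of a post within its thread, following the hypothesis that solutions tend to occur among the first N posts. A small hedged sketch of deriving such a feature is shown below; the feature names and the decaying prior are illustrative assumptions, not the paper's model.

```python
# Hedged sketch: post position as the only structural feature of a forum post.
# The geometric-style prior is an invented assumption for illustration.

def post_position_features(position, n_first=3, decay=0.6):
    """Structural features for a post at `position` in its thread
    (0 = problem post, 1 = first reply, ...)."""
    return {
        "position": position,
        "in_first_n": 1.0 if 1 <= position <= n_first else 0.0,
        "position_prior": decay ** max(position - 1, 0),  # decays with depth
    }

thread = ["my laptop won't boot", "try removing the battery",
          "thanks, that worked!", "same issue here"]
for i, post in enumerate(thread):
    print(i, post[:24], post_position_features(i))
```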
Ye, Shiren and Chua, Tat-Seng and LU, Jie
Background
In order to extract some salient sentences from the article as definition summaries, we will build a summarization model that describes the relations between the sentences, where both textual and structural features are considered.
Conclusion and Future Work
Wikipedia’s special structural features, such as wiki links, infobox and outline, reflect the hidden human knowledge.
Experiments
Based on the performance of EDCL for the TREC-QA definition task listed in Table 5, we observe that: (i) When EDCL considers wiki concepts and structural features such as outline and infobox, its F-scores increase significantly (Run 3 and Run 4).
Introduction
The rest of this paper is organized as follows: In the next section, we discuss the background of summarization using both textual and structural features.
Our Approach 3.1 Wiki Concepts
Different from free text and general web documents, wiki articles contain structural features, such as infoboxes and outlines, which correlate strongly with nuggets in the TREC-QA definition task.
Our Approach 3.1 Wiki Concepts
By integrating these structural features, we will generate better RP measures for derived topics, which facilitates better priority assignment in local topics.
structural features is mentioned in 6 sentences in this paper.
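The Ye, Chua and Lu entry uses wiki-specific structure (infoboxes, outlines, wiki links) as features for definition summarization. The hedged sketch below pulls such surface signals out of raw wikitext with simple regular expressions; the patterns only roughly approximate MediaWiki markup and are not the paper's extraction pipeline.

```python
# Hedged sketch: surface-level structural signals from raw wikitext.
# The regexes approximate MediaWiki markup and will miss edge cases.
import re

def wiki_structural_features(wikitext):
    features = {}
    features["has_infobox"] = float(bool(re.search(r"\{\{\s*Infobox",
                                                   wikitext, re.I)))
    features["n_wiki_links"] = len(re.findall(r"\[\[[^\]]+\]\]", wikitext))
    # Section headings ("== History ==") approximate the article outline.
    features["n_outline_sections"] = len(re.findall(r"^==+[^=].*?==+\s*$",
                                                    wikitext, re.M))
    return features

sample = """{{Infobox programming language | name = Python}}
Python is a [[programming language]] created by [[Guido van Rossum]].
== History ==
...
== Syntax ==
"""
print(wiki_structural_features(sample))
```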
Prabhakaran, Vinodkumar and Rambow, Owen
Abstract
We propose a new set of structural features.
Predicting Direction of Power
In order to mitigate this issue, we use an indicator feature for each structural feature to denote whether or not it is valid.
Predicting Direction of Power
The performance of the system using each structural feature class on its own is very low.
Predicting Direction of Power
Perplexingly, adding all structural features to LEX reduces the accuracy by around 2.2 percentage points.
structural features is mentioned in 5 sentences in this paper.
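The Prabhakaran and Rambow entry pairs each structural feature with an indicator feature recording whether the underlying value is valid, so that a missing value is not confused with a genuine zero. A minimal hedged sketch of that pattern, with invented feature names:

```python
# Hedged sketch: pair every structural feature with a validity indicator,
# so the learner can distinguish "value is 0" from "value is missing".

def with_validity_indicators(raw_features):
    """raw_features maps feature name -> numeric value, or None if missing."""
    out = {}
    for name, value in raw_features.items():
        valid = value is not None
        out[name + ":valid"] = 1.0 if valid else 0.0
        out[name] = float(value) if valid else 0.0   # neutral fill when missing
    return out

raw = {"reply_count": 4, "avg_response_time": None, "thread_depth": 0}
print(with_validity_indicators(raw))
```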
Konstas, Ioannis and Lapata, Mirella
Experimental Design
Structural Features: Features in this category target primarily content selection and influence appropriate choice at the field level:
Problem Formulation
We also store field information to compute structural features, described in Section 4.2.
Results
Addition of the structural features further boosts performance.
Results
We tackle this issue with the inclusion of nonlocal structural features.
structural features is mentioned in 4 sentences in this paper.
Minkov, Einat and Zettlemoyer, Luke
Corporate Acquisitions
While newswire documents are mostly unstructured, structural features are used to indicate whether any of the purchaser, acquired and seller text spans appears in
Corporate Acquisitions
Removing the inter-type and structural features mildly hurt performance, on average.
Seminar Extraction Task
[Table header] No structural features | No semantic features | No unification | Individual fields
Seminar Extraction Task
As shown in the table, removing the structural features hurt performance consistently across fields.
structural features is mentioned in 4 sentences in this paper.
Ding, Shilin and Cong, Gao and Lin, Chin-Yew and Zhu, Xiaoyan
Context and Answer Detection
Structural features:
Context and Answer Detection
The structural features of forums provide strong clues for contexts.
Experiments
We found that similarity features are the most important and structural features the next.
structural features is mentioned in 3 sentences in this paper.
Lin, Shih-Hsiang and Chen, Berlin
Background
A spoken sentence S_i is characterized by a set of T indicative features X_i = {x_i1, ..., x_iT}, and they may include lexical features (Koumpis and Renals, 2000), structural features (Maskey and Hirschberg, 2003), acoustic features (Inoue et al., 2004), discourse features (Zhang et al., 2007) and relevance features (Lin et al., 2009).
Experimental setup 5.1 Data
Structural features
Experimental setup 5.1 Data
The input to BC consists of a set of 28 indicative features used to characterize a spoken sentence, including the structural features, the lexical features, the acoustic features and the relevance feature.
structural features is mentioned in 3 sentences in this paper.
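The Lin and Chen entry characterizes each spoken sentence S_i by a set of indicative features X_i drawn from several groups (structural, lexical, acoustic, relevance). The sketch below assembles such a vector by merging per-group dictionaries; the group contents are placeholders for illustration, not the paper's 28 features.

```python
# Hedged sketch: assembling the indicative feature set X_i for a spoken
# sentence by merging disjointly named feature groups. Group contents are
# invented placeholders, not the actual features defined in the paper.

def sentence_features(sentence):
    structural = {
        "position_in_doc": sentence["position"] / sentence["doc_length"],
        "duration_sec": sentence["duration"],
    }
    lexical = {
        "n_words": len(sentence["words"]),
        "n_named_entities": sentence["n_named_entities"],
    }
    acoustic = {
        "mean_pitch": sentence["mean_pitch"],
        "mean_energy": sentence["mean_energy"],
    }
    relevance = {
        "sim_to_doc_centroid": sentence["centroid_similarity"],
    }
    # X_i is simply the union of the feature groups.
    return {**structural, **lexical, **acoustic, **relevance}

s = {"position": 3, "doc_length": 40, "duration": 2.7,
     "words": ["the", "budget", "was", "approved"], "n_named_entities": 0,
     "mean_pitch": 182.0, "mean_energy": 0.62, "centroid_similarity": 0.41}
print(sentence_features(s))
```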
Wang, Baoxun and Wang, Xiaolong and Sun, Chengjie and Liu, Bingquan and Sun, Lin
Introduction
Most researchers try to introduce structural features or users’ behavior to improve model performance; by contrast, the effect of textual features is not obvious.
Related Work
The structural features (e.g., authorship, acknowledgement, post position, etc.), also called non-textual features, play an important role in answer extraction.
Related Work
(2009) show that the structural features contribute even more than the textual features.
structural features is mentioned in 3 sentences in this paper.
Chen, Xiao and Kit, Chunyu
Higher-order Constituent Parsing
Some back-off structural features, which cannot be presented due to limited space, are used for smoothing.
Higher-order Constituent Parsing
Adding structural features, each involving at least a neighboring rule instance, makes it a higher-order parsing model.
Higher-order Constituent Parsing
Because all structures above the current rule instance are not determined yet, the computation of its nonlocal structural features, e.g., parent and sibling features, has to be delayed until it joins an upper-level structure.
structural features is mentioned in 3 sentences in this paper.
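The Chen and Kit entry notes that nonlocal structural features of a rule instance, such as parent and sibling features, can only be computed once the instance joins an upper-level structure. The sketch below illustrates that deferred computation with an invented Constituent class; it is not the paper's parsing model.

```python
# Hedged sketch: local features are available immediately, while nonlocal
# (parent / sibling) features are computed only when the constituent is
# attached to an upper-level structure, since that context does not exist yet.

class Constituent:
    def __init__(self, label, children=()):
        self.label = label
        self.children = list(children)
        self.parent = None
        self.local_feats = ["rule=" + label + "->" +
                            "_".join(c.label for c in self.children)]
        self.nonlocal_feats = []        # filled in lazily on attachment

    def attach_to(self, parent):
        """Called when this constituent joins an upper-level structure."""
        self.parent = parent
        self.nonlocal_feats.append(f"parent={parent.label}")
        for sib in parent.children:
            if sib is not self:
                self.nonlocal_feats.append(f"sibling={sib.label}")

np_node = Constituent("NP", [Constituent("DT"), Constituent("NN")])
vp_node = Constituent("VP", [Constituent("VB")])
s_node = Constituent("S", [np_node, vp_node])
for child in s_node.children:
    child.attach_to(s_node)
print(np_node.local_feats, np_node.nonlocal_feats)
```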
Severyn, Aliaksei and Moschitti, Alessandro and Uryupina, Olga and Plank, Barbara and Filippova, Katja
Experiments
The structural model, in contrast, is able to identify the product of interest (xoom) and associate it with the negative expression through a structural feature and thus correctly classify the comment as negative.
Related work
In contrast, we show that adding structural features from syntactic trees is particularly useful for the cross-domain setting.
Representations and models
These trees are input to tree kernel functions for generating structural features .
structural features is mentioned in 3 sentences in this paper.