Index of papers in Proc. ACL 2008 that mention
  • learning algorithms
Davidov, Dmitry and Rappoport, Ari
Experimental Setup
5.3 Parameters and Learning Algorithm
Experimental Setup
Selection of the learning algorithm and its algorithm-specific parameters was done as follows.
Experimental Setup
Since each dataset has only 140 examples, the computation time of each learning algorithm is negligible.
Introduction
The standard classification process is to find in an auxiliary corpus a set of patterns in which a given training word pair co-appears, and use pattern-word pair co-appearance statistics as features for machine learning algorithms.
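The snippet above describes counting, for each pattern, how often a word pair co-appears in it, and using those counts as features. A minimal sketch of that feature-extraction step, with illustrative pattern templates and a toy corpus (none of these names or patterns come from the paper):

```python
# Hedged sketch: turn pattern co-occurrence counts for a word pair into a
# fixed-order feature vector, as in the classification process described above.
# The pattern inventory and corpus here are toy examples, not from the paper.
from collections import Counter

def pattern_features(pair, corpus_sentences, patterns):
    """Count how often `pair` co-appears in each surface pattern."""
    w1, w2 = pair
    counts = Counter()
    for sent in corpus_sentences:
        for name, template in patterns.items():
            if template.format(w1, w2) in sent:
                counts[name] += 1
    # Fixed-order feature vector over the pattern inventory
    return [counts[name] for name in sorted(patterns)]

corpus = ["a dog is a kind of animal", "cats and dogs"]
patterns = {"isa": "{} is a kind of {}", "and": "{}s and {}s"}
print(pattern_features(("dog", "animal"), corpus, patterns))  # → [0, 1]
```

The resulting vector would then be handed to any standard classifier; real systems use much larger pattern inventories and corpus-scale counts.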
Related Work
Various learning algorithms have been used for relation classification.
Related Work
Freely available tools like Weka (Witten and Frank, 1999) allow easy experimentation with common learning algorithms (Hendrickx et al., 2007).
"learning algorithms" is mentioned in 6 sentences in this paper.
Topics mentioned in this paper:
Wang, Qin Iris and Schuurmans, Dale and Lin, Dekang
Abstract
By combining a supervised large margin loss with an unsupervised least squares loss, a discriminative, convex, semi-supervised learning algorithm can be obtained that is applicable to large-scale problems.
Conclusion and Future Work
Unlike previously proposed approaches, we introduce a convex objective for the semi-supervised learning algorithm by combining a convex structured SVM loss and a convex least squares loss.
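The combined objective in this entry sums a large-margin loss over labeled data with a least-squares loss over unlabeled data. A toy sketch of that combination, simplifying the paper's structured SVM loss over parse trees to a binary hinge loss (the weights, data, and `lam` trade-off here are illustrative assumptions):

```python
# Hedged sketch of a combined semi-supervised objective: hinge loss on
# labeled examples plus a least-squares loss on unlabeled ones.
# Both terms are convex in w, so their sum is convex.
def combined_loss(w, labeled, unlabeled, lam=1.0):
    dot = lambda a, b: sum(wi * xi for wi, xi in zip(a, b))
    # Supervised large-margin term: max(0, 1 - y * w.x) per labeled (x, y)
    hinge = sum(max(0.0, 1.0 - y * dot(w, x)) for x, y in labeled)
    # Unsupervised least-squares term: (w.x - t)^2 per unlabeled (x, t)
    lsq = sum((dot(w, x) - t) ** 2 for x, t in unlabeled)
    return hinge + lam * lsq

print(combined_loss([1.0, 0.0], [([2.0, 0.0], 1)], [([1.0, 0.0], 0.5)]))  # → 0.25
```

Because each term is convex, any off-the-shelf convex solver could minimize this, which is the property the abstract highlights.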
Introduction
Supervised learning algorithms still represent the state-of-the-art approach for inferring dependency parsers from data (McDonald et al., 2005a; McDonald and Pereira, 2006; Wang et al., 2007).
Introduction
Unfortunately, although significant recent progress has been made in the area of semi-supervised learning, the performance of semi-supervised learning algorithms still falls far short of expectations, particularly in challenging real-world tasks such as natural language parsing or machine translation.
Introduction
The basic idea is to bootstrap a supervised learning algorithm by alternating between inferring the missing label information and retraining.
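The bootstrapping idea in this snippet, alternating between inferring missing labels and retraining, can be sketched as a generic self-training loop. The `fit` and `predict` callables stand in for any supervised learner; the loop structure, round count, and names are assumptions for illustration:

```python
# Hedged sketch of self-training: label the unlabeled data with the
# current model, then retrain on labeled + pseudo-labeled data.
def self_train(fit, predict, labeled, unlabeled, rounds=3):
    model = fit(labeled)
    for _ in range(rounds):
        # Infer the missing labels with the current model
        pseudo = [(x, predict(model, x)) for x in unlabeled]
        # Retrain on the union of gold and pseudo-labeled examples
        model = fit(labeled + pseudo)
    return model
```

Real systems usually add confidence thresholds so only reliably labeled examples enter the training set, since errors in the pseudo-labels can otherwise reinforce themselves.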
"learning algorithms" is mentioned in 6 sentences in this paper.
Topics mentioned in this paper:
Yang, Xiaofeng and Su, Jian and Lang, Jun and Tan, Chew Lim and Liu, Ting and Li, Sheng
Entity-mention Model with ILP
However, normal machine learning algorithms work on attribute-value vectors, which allow only the representation of atomic propositions.
Entity-mention Model with ILP
This requirement motivates our use of Inductive Logic Programming (ILP), a learning algorithm capable of inferring logic programs.
Experiments and Results
Default parameters were applied for all the other settings in ALEPH as well as other learning algorithms used in the experiments.
Introduction
Even worse, the number of mentions in an entity is not fixed, which would result in variable-length feature vectors and cause trouble for normal machine learning algorithms.
Modelling Coreference Resolution
Based on the training instances, a binary classifier can be generated using any discriminative learning algorithm.
"learning algorithms" is mentioned in 5 sentences in this paper.
Topics mentioned in this paper: