Index of papers in Proc. ACL 2009 that mention
  • overfitting
Dwyer, Kenneth and Kondrak, Grzegorz
Context ordering
By biasing the decision tree learner toward questions that are intuitively of greater utility, we make it less prone to overfitting on small data samples.
Results
The idea of lowering the specificity of letter class questions as the context length increases is due to Kienappel and Kneser (2001), and is intended to avoid overfitting.
Results
Our expectation was that context ordering would be particularly helpful during the early rounds of active learning, when there is a greater risk of overfitting on the small training sets.
overfitting is mentioned in 3 sentences in this paper.
Sun, Xu and Okazaki, Naoaki and Tsujii, Jun'ichi
Abbreviator with Nonlocal Information
The first term expresses the conditional log-likelihood of the training data, and the second term represents a regularizer that reduces the overfitting problem in parameter estimation.
Abbreviator with Nonlocal Information
Since the number of characters in Chinese (more than 10K) is much larger than the number of letters in English (26), we did not apply these feature templates to Chinese abbreviations, in order to avoid a possible overfitting problem.
Experiments
To reduce overfitting, we employed an L2 Gaussian weight prior (Chen and Rosenfeld, 1999), with the objective function: M(Θ) = Σ_{i=1}^{n} log P(y_i | x_i, Θ) − ||Θ||² / (2σ²). During training and validation, we set σ = 1 for the DPLVM generators.
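The regularized objective above (conditional log-likelihood minus a Gaussian penalty on the weights) can be sketched for a simple binary logistic model; this is an illustrative stand-in, not the paper's DPLVM, and the function name and data layout are assumptions:

```python
import math

def log_likelihood_l2(weights, examples, sigma=1.0):
    """L2-regularized conditional log-likelihood for a binary
    logistic model (illustrative sketch, not the paper's DPLVM).

    examples: list of (features, label) pairs with label in {0, 1};
    features is a dict mapping feature index -> value.
    """
    total = 0.0
    for feats, y in examples:
        score = sum(weights.get(f, 0.0) * v for f, v in feats.items())
        p = 1.0 / (1.0 + math.exp(-score))  # P(y = 1 | x, weights)
        total += math.log(p if y == 1 else 1.0 - p)
    # Gaussian prior: subtract ||w||^2 / (2 * sigma^2), as in the
    # objective above; smaller sigma penalizes large weights harder.
    penalty = sum(w * w for w in weights.values()) / (2.0 * sigma ** 2)
    return total - penalty
```

A smaller σ shrinks the weights toward zero more aggressively, trading training-set fit for robustness, which is exactly how the penalty "reduces the overfitting problem in parameter estimation."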
overfitting is mentioned in 3 sentences in this paper.