Query Classification | For each query class, we train a logistic regression classifier (Vapnik 1999) with L2 regularization. |
Query Classification | Given an input x, represented as a vector of m features (x_1, x_2, ..., x_m), a logistic regression classifier with parameter vector w = (w_1, w_2, ..., w_m) computes the posterior probability of the output y, which is either 1 or -1, as P(y | x) = 1 / (1 + exp(-y * w^T x)).
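The posterior above can be sketched as a few lines of Python. This is a minimal illustration, not the cited system's implementation; the weight and feature values are made up for the example.

```python
import math

def lr_posterior(w, x, y):
    """Posterior P(y | x) of a binary logistic regression classifier.

    y is +1 or -1; w and x are equal-length weight/feature vectors.
    P(y | x) = 1 / (1 + exp(-y * w . x))
    """
    score = sum(wi * xi for wi, xi in zip(w, x))
    return 1.0 / (1.0 + math.exp(-y * score))

w = [0.5, -1.2, 0.3]   # hypothetical learned weights
x = [1.0, 0.4, 2.0]    # hypothetical feature vector
p_pos = lr_posterior(w, x, +1)
p_neg = lr_posterior(w, x, -1)
assert abs(p_pos + p_neg - 1.0) < 1e-12  # the two posteriors sum to 1
```

Note that the 50% decision boundary mentioned below corresponds to predicting +1 whenever w . x > 0.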
Query Classification | We made a small modification to the objective function for logistic regression to take into account the prior distribution and to use 50% as a uniform decision boundary for all the classes. |
Abstract | Furthermore, using a product of experts (Hinton, 2002), we combine the model with a complementary logistic regression model based on state-of-the-art lexical overlap features. |
Introduction | We use a product of experts (Hinton, 2002) to bring together a logistic regression classifier built from n-gram overlap features and our syntactic model. |
Product of Experts | Probabilistic Lexical Overlap Model We devised a logistic regression (LR) model incorporating 18 simple features, computed directly from S1 and S2, without modeling any hidden correspondence.
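For a binary output, combining two such classifiers as a product of experts (Hinton, 2002) amounts to multiplying their posteriors and renormalizing. A minimal sketch, assuming each expert emits P(y = +1 | x); the example probabilities are hypothetical:

```python
def product_of_experts(p1, p2):
    """Combine two binary experts via a product of experts.

    P_PoE(y | x) is proportional to P1(y | x) * P2(y | x),
    normalized over y in {+1, -1}. p1 and p2 are each expert's
    P(y = +1 | x); the return value is the combined P(y = +1 | x).
    """
    pos = p1 * p2
    neg = (1.0 - p1) * (1.0 - p2)
    return pos / (pos + neg)

# Hypothetical outputs of the lexical-overlap LR model and the syntactic model:
combined = product_of_experts(0.8, 0.6)
assert combined > 0.8  # agreeing experts yield a sharper posterior than either alone
```

A useful property of this combination is that either expert can veto: if one expert assigns a class probability near zero, the product is near zero regardless of the other expert's confidence.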
Abstract | In the Markov logic framework, a logistic-regression-based, data-driven user intention model is introduced; human dialog knowledge is designed into two layers, domain and discourse knowledge, which are then integrated with the data-driven model at generation time.
Overall architecture | The formulas for user intention modeling based on logistic regression |
Overall architecture | A logistic regression model is used for the statistical user intention model in Markov logic. |