id:        work_nkdki33fzvgtxnrotut7r4jdqe
author:    Daniel Beck
title:     Learning Structural Kernels for Natural Language Processing
date:      2015
pages:     14
extension: .pdf
mime:      application/pdf
words:     7883
sentences: 873
flesch:    69
summary:
- … mostly rely on setting default values for kernel hyperparameters or using grid search.
- Our proposed approach for model selection relies on Gaussian Processes (GPs) (Rasmussen and …
- … allows us to easily propose new rich kernel extensions that rely on a large number of hyperparameters, which in turn can result in better …
- Since the model selection process is now fine-grained, we can interpret the resulting hyperparameter values, depending on how the kernel is …
- … data (§4) and two real NLP regression tasks: Emotion Analysis (§5.1) and Translation Quality Estimation (§5.2).
- Table 1: Resulting fragment weighted counts for the kernel evaluation k(t,t), for different values of hyperparameters, where t is the tree in Figure 1.
- … and SVM models we employ the SSTK as the kernel …
- Our experiments with NLP data address two regression tasks: Emotion Analysis and Quality Estimation.
- Models: We perform experiments using the following tree kernels: …
- Our models in this task use a pair of tree kernels.
cache:     ./cache/work_nkdki33fzvgtxnrotut7r4jdqe.pdf
txt:       ./txt/work_nkdki33fzvgtxnrotut7r4jdqe.txt
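The summary contrasts default values and grid search with fine-grained kernel hyperparameter selection via Gaussian Processes, where hyperparameters are optimized against the marginal likelihood and the learned values are interpretable. A minimal sketch of that idea, assuming scikit-learn and using an RBF plus noise kernel as a stand-in for the thesis's tree kernels (the SSTK is not available in scikit-learn):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Toy regression data standing in for an NLP regression task
# (e.g. predicting an emotion score or a quality estimate per input).
rng = np.random.RandomState(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X).ravel() + 0.1 * rng.randn(40)

# Fitting the GP maximizes the log marginal likelihood with respect to
# the kernel hyperparameters (length scale, noise level), rather than
# sweeping them over a grid.
kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
gp = GaussianProcessRegressor(kernel=kernel, random_state=0).fit(X, y)

# The optimized hyperparameters are concrete, interpretable values:
print(gp.kernel_)                 # learned length scale and noise level
print(gp.kernel_.k1.length_scale) # e.g. how far correlations extend
```

With a richer kernel carrying many hyperparameters (as with the tree kernels discussed in the summary), the same marginal-likelihood optimization applies unchanged, which is what makes the approach scale beyond what grid search can handle.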