id: work_ocrpsnsynrh2nb7fy3scvt4ltu
author: Matthew R. Gormley
title: Approximation-Aware Dependency Parsing by Belief Propagation
date: 2015
pages: 14
extension: .pdf
mime: application/pdf
words: 8679
sentences: 1097
flesch: 71
summary: Recent improvements to dependency parsing accuracy have been driven by higher-order features. Such parsers depend on approximate inference and decoding procedures, which may prevent them from predicting the best parse. In contrast, we train the parser such that the approximate system performs well on the final evaluation function. We treat the entire parsing computation as a differentiable circuit, and backpropagate the evaluation function through our approximate inference and decoding methods to improve the model. Stoyanov and Eisner (2012) call this approach ERMA, for "empirical risk minimization under approximations"; for objectives besides empirical risk, Domke (2011) refers to it as "learning with …". We apply this approximation-aware learning method in the parsing setting. When our inference algorithm is approximate, we replace the exact marginals with approximate ones. Belief propagation on the model of Smith and Eisner (2008) is exact for first-order dependency parsing. Training minimizes a regularized objective function over the training sample of (sentence, parse) pairs {(x^(d), y^(d))}_{d=1}^{D}.
cache: ./cache/work_ocrpsnsynrh2nb7fy3scvt4ltu.pdf
txt: ./txt/work_ocrpsnsynrh2nb7fy3scvt4ltu.txt
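The summary's central idea — treating approximate inference as a differentiable circuit and backpropagating the evaluation loss through its iterations — can be illustrated with a toy example. The sketch below is hypothetical and not the paper's parser: it unrolls K steps of a made-up fixed-point "inference" update (a stand-in for belief propagation messages), computes a squared-error loss on the final belief, and backpropagates through the unrolled iterations by hand, checking the result against a finite difference.

```python
# Minimal, hypothetical sketch of approximation-aware (ERMA-style) training:
# unroll an approximate fixed-point inference routine and backpropagate the
# evaluation loss through every iteration. Not the authors' implementation.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(theta, K=5, b0=0.5, alpha=0.8):
    """Unrolled 'inference': b_{t+1} = sigmoid(theta + alpha * b_t)."""
    bs = [b0]
    for _ in range(K):
        bs.append(sigmoid(theta + alpha * bs[-1]))
    return bs

def loss_and_grad(theta, target=0.9, K=5, alpha=0.8):
    """Loss on the final approximate belief, plus dLoss/dtheta via backprop."""
    bs = forward(theta, K=K, alpha=alpha)
    loss = 0.5 * (bs[-1] - target) ** 2
    dL_db = bs[-1] - target      # gradient w.r.t. the final belief b_K
    dL_dtheta = 0.0
    for t in range(K, 0, -1):    # walk the unrolled circuit in reverse
        ds = bs[t] * (1.0 - bs[t])   # sigmoid'(pre-activation) at step t
        dL_dtheta += dL_db * ds      # direct dependence of step t on theta
        dL_db = dL_db * ds * alpha   # gradient flowing into belief b_{t-1}
    return loss, dL_dtheta

# Sanity check the circuit gradient against a central finite difference.
theta = 0.3
loss, grad = loss_and_grad(theta)
eps = 1e-6
numeric = (loss_and_grad(theta + eps)[0]
           - loss_and_grad(theta - eps)[0]) / (2 * eps)
assert abs(grad - numeric) < 1e-6
```

The point of the sketch is that the gradient accounts for how `theta` influences *every* iteration of the approximate inference, so training optimizes the parameters for the approximation actually used at test time, rather than for an exact inference that is never run.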