id: work_cwuwqujlcjb2hduzqfbskahykq
author: Rico Sennrich
title: Modelling and Optimizing on Syntactic N-Grams for Statistical Machine Translation
date: 2015
pages: 14
extension: .pdf
mime: application/pdf
words: 8729
sentences: 839
flesch: 61
summary (extracted sentences; ellipses mark truncations):
  - ... promote fluent translation output, but traditional n-gram language models are unable to ...
  - ... log-linear parameters of an SMT system further increases translation quality when coupled with a syntactic language model.
  - ... syntactic evaluation metric for optimizing the log-linear parameters of the SMT model.
  - Section 2 describes our relational dependency language model; ...
  - Figure 1: Translation output of baseline English→German string-to-tree SMT system with original dependency representation and conversion into constituency representation.
  - Rather than directly comparing perplexity between different models, our focus lies on a perplexity comparison between a human reference translation and the 1-best SMT output of a baseline transla...
  - Translation results for English→German with different language models added to our baseline are ...
  - Table 3: Translation quality of English→German string-to-tree SMT system with different language models, with k...
  - If we use BLEU+HWCMf as our tuning objective, the difference between the models increases.
  - n-Gram and Dependency Language Models.
  - Neural Language Models Improve Translation.
cache: ./cache/work_cwuwqujlcjb2hduzqfbskahykq.pdf
txt: ./txt/work_cwuwqujlcjb2hduzqfbskahykq.txt
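The summary mentions tuning with BLEU+HWCMf, a tuning objective that combines BLEU with a head-word chain metric computed over dependency trees. As a minimal sketch of the head-word-chain idea (not the paper's implementation; function names and the head-index tree encoding are illustrative assumptions), the following extracts head-word chains of bounded length from a dependency tree and computes a clipped chain precision between a hypothesis and a reference:

```python
from collections import Counter

def head_word_chains(words, heads, max_len):
    """Extract head-word chains of length 1..max_len from a dependency tree.

    words: list of tokens; heads[i] is the index of token i's head,
    with -1 marking the root. A chain of length n is a token followed
    by its n-1 successive heads.
    """
    chains = []
    for n in range(1, max_len + 1):
        for i in range(len(words)):
            chain = [words[i]]
            j = i
            complete = True
            for _ in range(n - 1):
                j = heads[j]
                if j < 0:          # walked past the root: no chain of this length
                    complete = False
                    break
                chain.append(words[j])
            if complete:
                chains.append(tuple(chain))
    return chains

def hwcm_precision(hyp, ref, max_len=4):
    """Clipped precision of hypothesis head-word chains against the reference.

    hyp and ref are (words, heads) pairs; counts are clipped as in BLEU.
    """
    hyp_chains = Counter(head_word_chains(*hyp, max_len))
    ref_chains = Counter(head_word_chains(*ref, max_len))
    matched = sum(min(c, ref_chains[ch]) for ch, c in hyp_chains.items())
    total = sum(hyp_chains.values())
    return matched / total if total else 0.0
```

For example, with the tree "she sees him" (both "she" and "him" headed by "sees"), a hypothesis that substitutes one word keeps only the chains not anchored at the changed token, so its chain precision drops below 1.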