id: work_q3s75ybxv5gppoolp5qemw6apq
author: Guy Rotman
title: Deep Contextualized Self-training for Low Resource Dependency Parsing
date: 2019
pages: 19
extension: .pdf
mime: application/pdf
words: 11291
sentences: 1105
flesch: 67
summary: We present a novel self-training method suitable for neural dependency parsing, evaluated in lightly supervised and domain adaptation dependency parsing setups. In the adaptation case we consider 16 setups: 6 in different English domains and 10 in 5 other languages. The method is based on the integration of contextualized embedding model(s) into a neural dependency parser, where each embedding model is trained on sequence tagging tasks derived from the parser's trees (or trained as a language model, DCST-LM). DCST-ENS is our model that integrates all three embedding models. Our DCST algorithm is related to earlier ideas such as adding inter-sentence consistency constraints at test time (Rush et al., 2012) and selecting effective training domains (Plank and Van Noord). Table 1 reports lightly supervised OntoNotes results with 500 training sentences.
cache: ./cache/work_q3s75ybxv5gppoolp5qemw6apq.pdf
txt: ./txt/work_q3s75ybxv5gppoolp5qemw6apq.txt
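The summary describes a self-training pipeline in which a base parser's automatic trees are turned into sequence tagging tasks that train contextualized embedding models, which are then integrated into the final parser. The following is a minimal, simplified sketch of how such a loop could be wired together; it is not the paper's implementation, and the names (TinyParser, SequenceTagger, derive_tags, dcst_self_training) and the two example tagging tasks are illustrative assumptions. A real system would use neural models (e.g., a biaffine parser and BiLSTM taggers).

    # Hypothetical sketch of a DCST-style self-training loop (illustrative stubs only).
    from dataclasses import dataclass
    from typing import Dict, List, Optional


    @dataclass
    class Sentence:
        words: List[str]
        heads: Optional[List[int]] = None  # heads[i] = 1-based head of word i, 0 = root


    class TinyParser:
        """Stand-in for the base neural dependency parser trained on the small labeled set."""

        def train(self, labeled: List[Sentence]) -> None:
            # A real parser would fit its parameters here; the stub only records the data size.
            self.n_train = len(labeled)

        def parse(self, sentences: List[Sentence]) -> List[Sentence]:
            # Stub prediction: attach every word to the previous one (first word to the root).
            return [Sentence(s.words, heads=[0] + list(range(1, len(s.words)))) for s in sentences]


    def derive_tags(sent: Sentence, task: str) -> List[int]:
        """Read word-level tags off an automatically parsed tree (tasks are assumed examples)."""
        assert sent.heads is not None
        if task == "num_children":
            return [sent.heads.count(i + 1) for i in range(len(sent.words))]
        if task == "distance_to_root":
            def depth(i: int) -> int:
                d, h = 0, sent.heads[i]
                while h != 0:
                    d, h = d + 1, sent.heads[h - 1]
                return d
            return [depth(i) for i in range(len(sent.words))]
        raise ValueError(f"unknown task: {task}")


    class SequenceTagger:
        """Stand-in for a contextualized embedding model trained on one tagging task
        (a DCST-LM-like variant would instead be trained as a language model)."""

        def __init__(self, task: str) -> None:
            self.task = task

        def train(self, sentences: List[Sentence], tag_sequences: List[List[int]]) -> None:
            self.n_train = len(sentences)


    def dcst_self_training(labeled: List[Sentence], unlabeled: List[Sentence]) -> Dict[str, SequenceTagger]:
        """Overall flow: base parser -> automatic trees -> per-task embedding models."""
        base = TinyParser()
        base.train(labeled)                  # 1. train the base parser on the small labeled set
        auto_parsed = base.parse(unlabeled)  # 2. parse the unlabeled data

        taggers = {}
        for task in ("num_children", "distance_to_root"):
            tags = [derive_tags(s, task) for s in auto_parsed]
            tagger = SequenceTagger(task)
            tagger.train(auto_parsed, tags)  # 3. train one embedding model per tagging task
            taggers[task] = tagger

        # 4. A final parser (not shown) would integrate the taggers' contextualized
        #    representations; an ensemble variant would integrate all of them.
        return taggers


    if __name__ == "__main__":
        labeled = [Sentence(["She", "reads"], heads=[2, 0])]
        unlabeled = [Sentence(["He", "writes", "code"])]
        print({t: tg.n_train for t, tg in dcst_self_training(labeled, unlabeled).items()})

The point of the sketch is the data flow, not the models: the tagging labels come for free from the parser's own output trees, so the embedding models can be trained on unlabeled text without extra annotation.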