id: work_aawrwejmaregji2stj32j4azfq
author: Dingquan Wang
title: Fine-Grained Prediction of Syntactic Typology: Discovering Latent Structure with Supervised Learning
date: 2017
pages: 16
extension: .pdf
mime: application/pdf
words: 10033
sentences: 1300
flesch: 71
summary (extracted excerpts):
  - "...as supervised learning, using a large collection of realistic synthetic languages as training data."
  - "(hand-engineered or neural features) that correlate with the language's deeper structure (latent trees)."
  - "Table 1: Three typological properties in the World Atlas of Language Structures (Dryer and Haspelmath, 2013), and how they..."
  - "...the non-IID problem that the available OVS languages may be evolutionarily related.[1] We mitigate this issue by training on the Galactic Dependencies treebanks (Wang and Eisner, 2016), a collection of more than 50,000 human-like synthetic [languages]."
  - "[1] Properties shared within an OVS language family may appear to be consistently predictive of OVS, but are actually confounds that will not generalize to other families in test data."
  - "To score all dependency relation types given the corpus u, we use a feed-forward neural network..."
  - "...with the model to predict p*_train from u_train on each training language."
  - "Table 3: Average expected loss over 20 UD languages, computed by 5-fold cross-validation."
  - "...of relation directionality for each training language."
cache: ./cache/work_aawrwejmaregji2stj32j4azfq.pdf
txt: ./txt/work_aawrwejmaregji2stj32j4azfq.txt
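The excerpts mention scoring each dependency relation type's directionality with a feed-forward neural network given corpus features. A minimal sketch of that idea follows; the function name, feature dimensions, single hidden layer, and tanh/sigmoid choices are all assumptions for illustration, not the paper's actual architecture or features.

```python
import numpy as np

def feedforward_direction_scorer(features, W1, b1, w2, b2):
    """Score one dependency relation type's directionality.

    features: 1-D array of corpus-derived features for the relation type.
    Returns a probability in (0, 1) that the relation points rightward.
    (Hypothetical architecture; the paper's real model and features differ.)
    """
    h = np.tanh(W1 @ features + b1)               # one hidden layer
    return 1.0 / (1.0 + np.exp(-(w2 @ h + b2)))   # sigmoid output unit

# Toy usage with random parameters, just to show the shapes involved.
rng = np.random.default_rng(0)
d_in, d_h = 8, 4                                  # assumed sizes
W1 = rng.normal(size=(d_h, d_in))
b1 = np.zeros(d_h)
w2 = rng.normal(size=d_h)
b2 = 0.0
p = feedforward_direction_scorer(rng.normal(size=d_in), W1, b1, w2, b2)
assert 0.0 < p < 1.0                              # a valid probability
```

Training such a scorer on many (synthetic) languages and comparing predicted directionality distributions against the true p*_train is one way to read the cross-validation setup the excerpts describe.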