id: work_6lltcw7mj5effmke754i7v26s4
author: Dominique Osborne
title: Encoding Prior Knowledge with Eigenword Embeddings
date: 2016
pages: 14
extension: .pdf
mime: application/pdf
words: 9326
sentences: 1186
flesch: 73
summary: Such word embeddings have achieved state-of-the-art performance on many natural language processing (NLP) tasks, e.g., syntactic parsing (Socher et … For example, in dependency parsing, word embeddings could be tailored to capture similarity in terms of context within … (2015) have proposed to use canonical correlation analysis (CCA) as a method to learn low-dimensional real vectors, called Eigenwords. This use of CCA to derive word embeddings … Figure 3: The CCA-like algorithm that returns word embeddings with prior knowledge encoded based on a similarity graph. Table 1: Results for the word similarity datasets, geographic analogies and NP bracketing. We evaluated the quality of our eigenword embeddings on three different tasks: word similarity, geographic analogies and NP bracketing. … into 9 distinct blocks, labeled A through I.) In general, adding prior knowledge to eigenword embeddings does improve the quality of word vectors … FrameNet as a prior knowledge resource for improving the quality of word embeddings is not as helpful …
cache: ./cache/work_6lltcw7mj5effmke754i7v26s4.pdf
txt: ./txt/work_6lltcw7mj5effmke754i7v26s4.txt