id: work_riaql7nykvgypmnszeqnlhdrlu
author: Mo Yu
title: Learning Composition Models for Phrase Embeddings
date: 2015
pages: 16
extension: .pdf
mime: application/pdf
words: 10112
sentences: 1058
flesch: 70
summary: Building on word embeddings learned by neural language models (Bengio et al., 2003; Collobert and Weston, 2008), the paper proposes a new method for compositional semantics that learns to compose word embeddings into phrase embeddings. Transformations for composing a phrase embedding from its component words are learned from features extracted from the phrase. The resulting model, FCT, has two sets of parameters: the feature weights (α, b) and the word embeddings e_w (the embeddings become parameters when fine-tuning is enabled). For language modeling, FCT is trained so that phrase embeddings, as composed in Section 2, predict contextual words, an extension of the skip-gram objective (Mikolov et al., 2013b) to phrases. For task-specific word embeddings, FCT and the embeddings are first trained with the LM objective, and the word embeddings are then fine-tuned using labeled data for the target task.
cache: ./cache/work_riaql7nykvgypmnszeqnlhdrlu.pdf
txt: ./txt/work_riaql7nykvgypmnszeqnlhdrlu.txt
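The summary describes composing a phrase embedding from component word embeddings via feature-weighted transformations with parameters (α, b) and word embeddings e_w. A minimal NumPy sketch of one plausible form of such a composition, assuming per-word binary features and a per-dimension weighted sum (the feature set, shapes, and function name `fct_compose` are illustrative assumptions, not the paper's exact implementation):

```python
import numpy as np

def fct_compose(word_embs, feats, alpha, b):
    """Sketch of a feature-based composition of phrase embeddings.

    word_embs: (n_words, d)   embeddings e_w of the phrase's words
    feats:     (n_words, k)   binary features extracted per word/phrase
    alpha:     (k, d)         feature weights (one d-vector per feature)
    b:         (d,)           bias
    Returns a (d,) phrase embedding: each word contributes its embedding
    scaled elementwise by feature-derived per-dimension weights.
    """
    lam = feats @ alpha + b            # (n_words, d) per-dimension weights
    return (lam * word_embs).sum(axis=0)

# Toy usage with a two-word phrase, d=4 dimensions, k=3 features.
rng = np.random.default_rng(0)
word_embs = rng.normal(size=(2, 4))
feats = np.array([[1.0, 0.0, 1.0],
                  [0.0, 1.0, 0.0]])
alpha = rng.normal(size=(3, 4))
b = np.zeros(4)
phrase_emb = fct_compose(word_embs, feats, alpha, b)
print(phrase_emb.shape)
```

In training, gradients of a skip-gram-style objective (composed phrase embedding predicting context words) would flow into α and b, and also into the word embeddings when fine-tuning is enabled.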