id: work_kq5iakcovbdnjclxf3ichyagwy
author: Eliyahu Kiperwasser
title: Simple and Accurate Dependency Parsing Using Bidirectional LSTM Feature Representations
date: 2016
pages: 16
extension: .pdf
mime: application/pdf
words: 9458
sentences: 947
flesch: 63
summary: BiLSTM is trained jointly with the parser objective, resulting in very effective feature extractors for parsing. The focus of this paper is on feature representation for dependency parsing, using recent techniques from the neural-networks ("deep learning") … Graph-based parsers (McDonald, 2006) treat parsing as a search-based structured prediction problem in which the goal is learning a scoring function over dependency trees such that the correct tree … by using the BiLSTM feature extractor in two parsing architectures, transition-based (Section 4) … The core features in a transition-based parser usually look at information such as the word-identity … In all of these neural-network based approaches, the vector representations of words were initialized using pre-trained … Beside using the BiLSTM-based feature functions, we make use of standard parsing techniques. Figure 2: Illustration of the neural model scheme of the graph-based parser when calculating the score of a given parse.
cache: ./cache/work_kq5iakcovbdnjclxf3ichyagwy.pdf
txt: ./txt/work_kq5iakcovbdnjclxf3ichyagwy.txt
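The summary above describes the core idea of the indexed paper: every word is encoded by a BiLSTM, and a graph-based parser scores candidate head-modifier arcs using those vectors, training the whole pipeline jointly. The following is a minimal illustrative sketch of that idea, not the authors' implementation: it uses PyTorch, and the class name (BiLSTMArcScorer), layer sizes, and MLP layout are assumptions chosen for readability.

```python
# Illustrative sketch only: encode a sentence with a BiLSTM and score every
# (head, modifier) arc with a small MLP over the concatenated word vectors.
import torch
import torch.nn as nn

class BiLSTMArcScorer(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hidden_dim=125, mlp_dim=100):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)   # could be initialized from pre-trained vectors
        self.bilstm = nn.LSTM(emb_dim, hidden_dim,
                              num_layers=2, bidirectional=True,
                              batch_first=True)
        # MLP that scores a single head-modifier pair (hypothetical sizes)
        self.mlp = nn.Sequential(
            nn.Linear(4 * hidden_dim, mlp_dim),
            nn.Tanh(),
            nn.Linear(mlp_dim, 1),
        )

    def forward(self, word_ids):
        # word_ids: (1, n) tensor of token indices for one sentence
        vectors, _ = self.bilstm(self.embed(word_ids))   # (1, n, 2*hidden_dim)
        vectors = vectors.squeeze(0)                     # (n, 2*hidden_dim)
        n = vectors.size(0)
        # Score every (head, modifier) pair by concatenating the two BiLSTM vectors.
        heads = vectors.unsqueeze(1).expand(n, n, -1)
        mods = vectors.unsqueeze(0).expand(n, n, -1)
        arc_scores = self.mlp(torch.cat([heads, mods], dim=-1)).squeeze(-1)
        return arc_scores                                # (n, n) score matrix

# Toy usage: arc_scores[h, m] is the score of word h heading word m. A graph-based
# parser would then search for the highest-scoring tree, and the parsing loss would
# be backpropagated through the MLP, the BiLSTM, and the embeddings jointly.
scorer = BiLSTMArcScorer(vocab_size=10000)
scores = scorer(torch.tensor([[4, 17, 93, 6]]))  # a made-up 4-word sentence
print(scores.shape)  # torch.Size([4, 4])
```

The same BiLSTM vectors can equally serve a transition-based parser, where the scorer would look at the vectors of the words on the stack and buffer instead of all head-modifier pairs.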