id: work_3ecslqollvat3jyruwpjy26lou
author: Zaixiang Zheng
title: Modeling Past and Future for Neural Machine Translation
date: 2018
pages: 14
extension: .pdf
mime: application/pdf
words: 6897
sentences: 905
flesch: 70
summary: In addition to the PRESENT source content being translated at each step, we address the importance of modeling PAST and FUTURE contents in neural machine translation. We use two separate RNN layers to explicitly model translated and untranslated source contents: the PAST layer encodes the source contents translated up to the current step, while the FUTURE layer corresponds to the source contents of untranslated words and is updated at each decoding step by subtracting the source content being translated (i.e., the attention vector) from its last state (i.e., the remaining untranslated contents). The PAST and FUTURE information is then fed to both the attention model and the decoder states, which provides the decoder with a holistic source summarization at each decoding step.
cache: ./cache/work_3ecslqollvat3jyruwpjy26lou.pdf
txt: ./txt/work_3ecslqollvat3jyruwpjy26lou.txt
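
The summary describes a concrete mechanism: a PAST layer that accumulates the attended source content at each decoding step, and a FUTURE layer that starts from a summary of the whole source sentence and has the translated content subtracted out step by step. Below is a minimal PyTorch sketch of that update loop; the class and method names, the GRU cell for the PAST layer, the linear projection, and the plain subtraction update for the FUTURE layer are illustrative assumptions, not the paper's actual code (the paper also considers gated variants of the FUTURE update).

    import torch
    import torch.nn as nn

    class PastFutureLayers(nn.Module):
        """Sketch of PAST/FUTURE decoder layers (hypothetical implementation)."""

        def __init__(self, ctx_size: int, hidden_size: int):
            super().__init__()
            # PAST layer: an RNN that folds each attention vector into a
            # running representation of the already-translated source content.
            self.past_rnn = nn.GRUCell(ctx_size, hidden_size)
            # Projection so the attention vector lives in the FUTURE state space.
            self.ctx_proj = nn.Linear(ctx_size, hidden_size)

        def init_states(self, src_summary):
            # PAST starts empty; FUTURE starts from the full source
            # summarization (assumed shape: batch x hidden_size).
            return torch.zeros_like(src_summary), src_summary

        def step(self, c_t, past, future):
            # PAST: add the source content translated at this step (c_t).
            past = self.past_rnn(c_t, past)
            # FUTURE: subtract the translated content from the remaining
            # untranslated contents, per the summary's subtraction update.
            future = future - self.ctx_proj(c_t)
            return past, future

    # Usage at decoding time: past and future would be concatenated with the
    # decoder state when computing attention and the next decoder state.
    # layers = PastFutureLayers(ctx_size=512, hidden_size=512)
    # past, future = layers.init_states(src_summary)
    # for c_t in attention_vectors:   # one attention vector per decoding step
    #     past, future = layers.step(c_t, past, future)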