id:        6969z032q4p
author:    Bryan (Ning) Xia
title:     Learning Attentive Deep Representations for Object Re-Identification and Beyond
date:      2021
pages:     
extension: .txt
mime:      text/plain
words:     347
sentence:  12
flesch:    24
summary:   Although the attention mechanisms are designed from the ground up for object re-identification, they are general deep learning mechanisms capable of addressing other problems as well. For example, they are applied to multi-label text classification, leveraging the attention mechanism's ability to model pair-wise relationships between text representations and label representations.
cache:     cache/6969z032q4p.txt
txt:       txt/6969z032q4p.txt
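
The summary mentions using attention to model pair-wise relationships between text representations and label representations for multi-label text classification. The following is a minimal illustrative sketch of that general idea only, not the thesis's actual model; the class name LabelAttentionClassifier, the embedding dimensions, and all other identifiers are assumptions made for illustration.

    # Sketch: label embeddings attend over text token embeddings, producing one
    # label-specific text summary (and one logit) per label. Assumed design, not
    # the author's architecture.
    import torch
    import torch.nn as nn

    class LabelAttentionClassifier(nn.Module):
        def __init__(self, vocab_size: int, num_labels: int, dim: int = 128):
            super().__init__()
            self.token_emb = nn.Embedding(vocab_size, dim)                # text token representations
            self.label_emb = nn.Parameter(torch.randn(num_labels, dim))   # learned label representations
            self.scale = dim ** -0.5
            self.out = nn.Linear(dim, 1)                                   # per-label score head

        def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
            # token_ids: (batch, seq_len) -> tokens: (batch, seq_len, dim)
            tokens = self.token_emb(token_ids)
            # Pair-wise affinities between every label and every token:
            # (num_labels, dim) x (batch, seq_len, dim) -> (batch, num_labels, seq_len)
            attn = torch.softmax(
                torch.einsum("ld,bsd->bls", self.label_emb, tokens) * self.scale, dim=-1
            )
            # Label-specific text summaries: (batch, num_labels, dim)
            label_ctx = torch.einsum("bls,bsd->bld", attn, tokens)
            # One logit per label; train with BCEWithLogitsLoss against multi-label targets.
            return self.out(label_ctx).squeeze(-1)

    # Usage sketch
    model = LabelAttentionClassifier(vocab_size=10000, num_labels=20)
    logits = model(torch.randint(0, 10000, (4, 32)))  # -> (4, 20) label logits

The pair-wise attention weights here play the role the summary describes: each label representation scores every text token, so each label's prediction is based on the parts of the text most relevant to that label.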