The attention mechanism allows the model to focus on the parts of the input sequence that matter most, much as a human reader attends most closely to the words that carry a passage's meaning.
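As a concrete illustration, here is a minimal sketch of scaled dot-product attention, the form popularized by the Transformer architecture. This is an assumption about the specific variant meant; the function and variable names are illustrative, not from the original text.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention (illustrative sketch): weight each
    value by how well its key matches the query, so the output is
    dominated by the most relevant positions in the sequence."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # similarity of each query to each key
    weights = softmax(scores, axis=-1)   # each row sums to 1: a focus distribution
    return weights @ V, weights

# Toy example: a 3-token sequence with 4-dimensional embeddings
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out, w = attention(Q, K, V)
print(w.round(2))  # each row shows where that token "looks" in the sequence
```

Each row of the weight matrix is a probability distribution over input positions, which is the formal counterpart of "focusing on what matters most": positions with higher weights contribute more to that token's output.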