Attention

Machine learning

https://en.wikipedia.org/wiki/Attention_(machine_learning)

The attention mechanism was developed to address the weaknesses of leveraging information from the hidden states of recurrent neural networks. Recurrent neural networks favor more recent information, contained in words at the end of a sentence, while information from earlier in the sentence tends to be attenuated. Attention gives a token equal, direct access to any part of a sentence, rather than only through the previous hidden state.
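A minimal sketch of how that direct access works, using scaled dot-product attention (the form popularized by the Transformer): every query token scores its similarity against every key token, so the weight matrix connects each position to all others in a single step, with no recurrence. The shapes and random data below are illustrative assumptions, not from the article.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Each query attends to every key directly; the softmax row gives
    # the mixing weights over all positions, early or late in the sentence.
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # (n_queries, n_keys) similarities
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V, weights

# Toy example: 3 tokens with 4-dimensional embeddings (self-attention,
# so queries, keys, and values all come from the same matrix X).
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(X, X, X)
```

Here `out` has the same shape as `X` (3, 4), and every row of `w` is a probability distribution over all three tokens, which is exactly the "equal access to any part of a sentence" property described above.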

https://en.wikipedia.org/wiki/Attention_Is_All_You_Need

Personal

Lately my old habit of losing focus has been creeping back. Often I'll be halfway through looking something up when a sneaky urge to "take a break" strikes, and I open something unrelated (Twitter, RSS) and numbly sink into it. Each lapse averages only five or six minutes, which isn't long, but they happen so often that by the time I snap out of it, I've forgotten what I set out to do in the first place.

I often feel that social media is a giant black hole. It knows what interests me and exactly which buttons to push: after a few lewd pictures it tosses me a few useful resources, and I enjoy it; then, just as I'm growing bored, it suddenly serves up a crowd of dalao (big shots) and their sharp brainstorming, and I'm in awe again. There it is, another "baptism of the mind" 😅. Sometimes I hurriedly jot down some clever quote I scrolled past, and over time my head fills up with these scattered, untraceable voices, which gives me no end of headaches.