Infinite attention: NNGP and NTK for deep attention networks. (arXiv:2006.10540v1 [stat.ML])
2020
Jiří Hron, Yasaman Bahri, Jascha Sohl-Dickstein, Roman Novak
Keywords: stat, Physics, Discrete mathematics