DeepNet: Scaling Transformers to 1,000 Layers
2022
Hongyu Wang, Shuming Ma, Li Dong, Shaohan Huang, Dongdong Zhang, Furu Wei