TY - JOUR
T1 - A Primer in BERTology: What We Know About How BERT Works
AU - Rogers, Anna
AU - Kovaleva, Olga
AU - Rumshisky, Anna
PY - 2020/12/1
Y1 - 2020/12/1
AB - Transformer-based models have pushed state of the art in many areas of NLP, but our understanding of what is behind their success is still limited. This paper is the first survey of over 150 studies of the popular BERT model. We review the current state of knowledge about how BERT works, what kind of information it learns and how it is represented, common modifications to its training objectives and architecture, the overparameterization issue, and approaches to compression. We then outline directions for future research.
KW - Transformer-based models
KW - BERT model analysis
KW - Neural network overparameterization
KW - Training objective modifications
KW - Model compression techniques
M3 - Journal article
SN - 2307-387X
VL - 8
SP - 842
EP - 866
JO - Transactions of the Association for Computational Linguistics
JF - Transactions of the Association for Computational Linguistics
ER -