BERT


Bidirectional Encoder Representations from Transformers (BERT) is a deep learning model that produces contextualized word embeddings, trained with a self-supervised objective. Its core pre-training task is masked language modeling: some input tokens are replaced with a special [MASK] token, and the model learns to predict the original tokens from the surrounding context on both the left and the right.
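The masking step of this objective can be sketched in a few lines. The snippet below is a simplified illustration, not BERT's actual preprocessing code: the function name `mask_tokens` and the 15% masking probability (the rate reported for BERT's pre-training) are used here for demonstration, and the input operates on whitespace-split words rather than BERT's WordPiece subword tokens.

```python
import random

MASK_TOKEN = "[MASK]"

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """Replace a random subset of tokens with [MASK].

    Returns the corrupted sequence plus a mapping from masked
    positions to the original tokens the model must predict.
    """
    rng = random.Random(seed)
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            targets[i] = tok          # prediction target at position i
            masked.append(MASK_TOKEN)
        else:
            masked.append(tok)
    return masked, targets

tokens = "the quick brown fox jumps over the lazy dog".split()
masked, targets = mask_tokens(tokens)
```

In the full BERT recipe, selected positions are not always replaced with [MASK]: a fraction are swapped for a random token or left unchanged, which keeps the model from relying on seeing [MASK] at inference time.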