Contextual word embeddings are word representations that depend on the sentence a word appears in: unlike static embeddings such as word2vec or GloVe, which assign each word a single fixed vector, contextual models such as ELMo and BERT produce a different vector for the same word in different contexts. These embeddings are trained on large corpora and can improve performance on downstream tasks such as sentiment analysis and language modeling.
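To make the idea concrete, here is a minimal sketch (not a real trained model) using NumPy: a toy vocabulary and embedding table are invented for illustration, and a single round of scaled dot-product self-attention stands in for the context-mixing that contextual models perform. The static vector for "bank" is the same in both sentences, but its contextualized vector differs because the surrounding words differ.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical vocabulary with random static embeddings (dimension 4).
vocab = ["the", "river", "money", "bank"]
emb = {w: rng.normal(size=4) for w in vocab}

def contextualize(tokens):
    """One round of scaled dot-product self-attention: each word's vector
    becomes a context-weighted mix of all word vectors in the sentence."""
    X = np.stack([emb[t] for t in tokens])         # (n, d) static vectors
    scores = X @ X.T / np.sqrt(X.shape[1])         # pairwise similarity
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)  # softmax over positions
    return weights @ X                             # (n, d) contextual vectors

s1 = ["the", "river", "bank"]
s2 = ["the", "money", "bank"]

bank_in_s1 = contextualize(s1)[2]
bank_in_s2 = contextualize(s2)[2]

# The static embedding of "bank" is identical in both sentences,
# but the contextual vectors differ because the contexts differ.
print(np.allclose(bank_in_s1, bank_in_s2))  # False
```

Real contextual models replace the random table and single attention step with many trained transformer (or, for ELMo, bidirectional LSTM) layers, but the key property is the same: the output vector for a word is a function of the whole input sentence.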