Word embeddings are vector representations that encode both lexical and semantic information for each word in a language. They play a significant role in machine translation, since mapping words from different languages into a common embedding space according to their underlying meanings can improve translation accuracy.
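For illustration, the following minimal Python sketch uses hypothetical, hand-picked 4-dimensional vectors (real embeddings are learned from corpora and typically have hundreds of dimensions) to show how words with similar meanings, including a translation pair, lie close together in a shared space under cosine similarity.

```python
import numpy as np

# Hypothetical low-dimensional embeddings for illustration only; real
# models (e.g., word2vec, fastText) learn vectors of 100-300 dimensions.
embeddings = {
    "cat":  np.array([0.90, 0.10, 0.40, 0.00]),
    "gato": np.array([0.85, 0.15, 0.35, 0.05]),  # Spanish "cat", placed in the same space
    "car":  np.array([0.10, 0.80, 0.00, 0.60]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors (1.0 = identical direction)."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# A translation pair lies close together in the shared space...
print(cosine_similarity(embeddings["cat"], embeddings["gato"]))  # high (~0.99)
# ...while semantically unrelated words are farther apart.
print(cosine_similarity(embeddings["cat"], embeddings["car"]))   # low (~0.17)
```

In a cross-lingual setting, such a shared space is usually obtained by training embeddings jointly or by aligning separately trained monolingual spaces; the vectors above merely stand in for that outcome.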