Pre-trained Models


Pre-trained word embeddings are word vectors that have already been trained on large text corpora and can be downloaded and used directly in downstream NLP tasks, avoiding the cost of training embeddings from scratch. Popular pre-trained models include Google's Word2Vec, Stanford's GloVe, and Facebook's fastText.
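As a minimal sketch of how such embeddings are typically loaded and queried, the example below uses the gensim library and its downloader module (an assumption, not part of the original text; the model identifier "glove-wiki-gigaword-100" is one of the names gensim's downloader exposes, and any other available model could be substituted).

```python
# Hypothetical usage sketch: loading pre-trained GloVe vectors via gensim
# (assumes gensim is installed, e.g. pip install gensim).
import gensim.downloader as api

# Download (on first use) and load 100-dimensional GloVe vectors
# trained on Wikipedia + Gigaword.
vectors = api.load("glove-wiki-gigaword-100")

# Look up the embedding for a single word: a 100-dimensional vector.
vec = vectors["language"]
print(vec.shape)  # (100,)

# Query the nearest neighbors of a word by cosine similarity.
print(vectors.most_similar("language", topn=3))
```

Once loaded, the vectors can be used as fixed features for classifiers or to initialize the embedding layer of a neural model.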