Tokenization


The process of splitting text into smaller units, called tokens: typically words, phrases, or sentences.
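
A minimal sketch of word-level tokenization in Python, using a regular expression that treats runs of word characters and individual punctuation marks as tokens. The `tokenize` function name and the pattern are illustrative assumptions, not a standard library API; real NLP pipelines typically use more sophisticated rule-based or subword tokenizers.

```python
import re

def tokenize(text: str) -> list[str]:
    # Match either a run of word characters (a word-like token)
    # or a single character that is neither a word character nor
    # whitespace (a punctuation token). Whitespace is discarded.
    return re.findall(r"\w+|[^\w\s]", text)

print(tokenize("Tokenization splits text into smaller chunks."))
# ['Tokenization', 'splits', 'text', 'into', 'smaller', 'chunks', '.']
```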