lexing or tokenization


The process of breaking input text into words or other meaningful units (tokens) so that later stages, such as parsing, can operate on discrete symbols rather than raw characters.
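A minimal sketch of tokenization in Python, using a regular expression that treats runs of word characters and individual punctuation marks as tokens. The pattern is an illustrative choice for English-like text, not a standard; real lexers and NLP tokenizers use far more elaborate rules.

```python
import re

def tokenize(text):
    # Match either a run of word characters (letters, digits, underscore)
    # or a single non-space, non-word character (punctuation).
    return re.findall(r"\w+|[^\w\s]", text)

print(tokenize("Lexing breaks text into tokens."))
# → ['Lexing', 'breaks', 'text', 'into', 'tokens', '.']
```

Note that the punctuation becomes its own token, which is typically what a parser expects.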