Tokenization
The process of dividing a text into smaller units called tokens, which are typically words, subwords, or sentences.
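
A minimal sketch of the idea, using only Python's standard re module: one function splits text into word-level tokens (keeping punctuation as separate tokens) and another splits it into sentences on sentence-ending punctuation. Real tokenizers (e.g. in NLTK or spaCy) handle many more edge cases; this is purely illustrative.

```python
import re

def word_tokenize(text: str) -> list[str]:
    # Match runs of word characters, or single punctuation marks,
    # so punctuation becomes its own token.
    return re.findall(r"\w+|[^\w\s]", text)

def sent_tokenize(text: str) -> list[str]:
    # Naive split after sentence-ending punctuation followed by whitespace.
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]

sample = "Tokenization splits text into tokens. It is a common preprocessing step!"
print(word_tokenize(sample))
# ['Tokenization', 'splits', 'text', 'into', 'tokens', '.',
#  'It', 'is', 'a', 'common', 'preprocessing', 'step', '!']
print(sent_tokenize(sample))
# ['Tokenization splits text into tokens.', 'It is a common preprocessing step!']
```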