Tokenization
Splitting a text into words or smaller linguistic units (tokens).
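
A minimal sketch of a regex-based tokenizer in Python; the pattern and sample sentence are illustrative assumptions, and practical tokenizers additionally handle clitics, abbreviations, hyphenation, and language-specific conventions:

    import re

    def tokenize(text: str) -> list[str]:
        # Runs of word characters form tokens; each punctuation mark
        # becomes its own token; whitespace is discarded.
        return re.findall(r"\w+|[^\w\s]", text)

    print(tokenize("Tokenization splits text, doesn't it?"))
    # ['Tokenization', 'splits', 'text', ',', 'doesn', "'", 't', 'it', '?']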