Definitions for "Tokenisation"
the process or result of dividing a text or list of words into tokens.
the process of analysing a string into a sequence of contiguous smaller units: for example, word breaking, syllable breaking, or the creation of a sort key.
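The first sense, dividing a text into tokens by word breaking, can be sketched with a simple regular-expression tokeniser. This is a minimal illustration only: the pattern below is an assumption of this sketch, and real tokenisers follow much richer rules (for example, Unicode word segmentation).

```python
import re

def tokenise(text):
    """Split text into word and punctuation tokens.

    The regex is illustrative: \\w+ matches runs of word characters,
    and [^\\w\\s] matches single punctuation marks. Production
    tokenisers handle many cases this sketch ignores (contractions,
    hyphenation, scripts without spaces, etc.).
    """
    return re.findall(r"\w+|[^\w\s]", text)

print(tokenise("Tokenisation divides text into tokens."))
# → ['Tokenisation', 'divides', 'text', 'into', 'tokens', '.']
```

The same string-analysis idea underlies the other operations named above (syllable breaking, sort-key creation); only the unit of segmentation differs.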