The maximum token length. Default is 255. Tokens longer than the maximum length are split.
The name of the tokenizer. It must contain only letters, digits, spaces, dashes, or underscores; it must start and end with an alphanumeric character; and it is limited to 128 characters.
Polymorphic discriminator, which specifies the different types this object can be.
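Taken together, these properties describe a complete tokenizer definition. A minimal sketch, assuming the `LuceneStandardTokenizer` interface from the `@azure/search-documents` package; the name and discriminator value shown are illustrative:

```typescript
import { LuceneStandardTokenizer } from "@azure/search-documents";

// A standard tokenizer definition with the three documented
// properties set. All values here are illustrative.
const tokenizer: LuceneStandardTokenizer = {
  // Polymorphic discriminator selecting the tokenizer variant.
  odatatype: "#Microsoft.Azure.Search.StandardTokenizerV2",
  // Letters, digits, spaces, dashes, or underscores; must start and
  // end with an alphanumeric character; at most 128 characters.
  name: "my-standard-tokenizer",
  // Tokens longer than this are split; defaults to 255 if omitted.
  maxTokenLength: 255,
};
```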
Breaks text following the Unicode Text Segmentation rules. This tokenizer is implemented using Apache Lucene.
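As a usage sketch (again assuming `@azure/search-documents`; the endpoint, key, and all names are placeholders), the tokenizer is registered on an index and referenced by a custom analyzer, which a searchable field then selects:

```typescript
import {
  AzureKeyCredential,
  CustomAnalyzer,
  LuceneStandardTokenizer,
  SearchIndex,
  SearchIndexClient,
} from "@azure/search-documents";

// Placeholder endpoint and admin key for the search service.
const client = new SearchIndexClient(
  "https://<service-name>.search.windows.net",
  new AzureKeyCredential("<admin-api-key>")
);

// The tokenizer from the previous sketch.
const tokenizer: LuceneStandardTokenizer = {
  odatatype: "#Microsoft.Azure.Search.StandardTokenizerV2",
  name: "my-standard-tokenizer",
  maxTokenLength: 255,
};

// Custom analyzer that delegates tokenization to it, by name.
const analyzer: CustomAnalyzer = {
  odatatype: "#Microsoft.Azure.Search.CustomAnalyzer",
  name: "my-analyzer",
  tokenizerName: tokenizer.name,
};

// Index whose "content" field is analyzed with the custom analyzer.
const index: SearchIndex = {
  name: "docs-index",
  fields: [
    { name: "id", type: "Edm.String", key: true },
    {
      name: "content",
      type: "Edm.String",
      searchable: true,
      analyzerName: analyzer.name,
    },
  ],
  tokenizers: [tokenizer],
  analyzers: [analyzer],
};

async function main(): Promise<void> {
  await client.createIndex(index);
}

main().catch(console.error);
```

Once the index exists, text indexed into the `content` field is segmented by the standard tokenizer, and any token exceeding `maxTokenLength` is split at that boundary.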