The maximum token length. Default is 255. Tokens longer than the maximum length are split. The maximum token length that can be used is 300 characters.
The name of the analyzer. It must only contain letters, digits, spaces, dashes or underscores, can only start and end with alphanumeric characters, and is limited to 128 characters.
Polymorphic Discriminator
A list of stopwords.
Standard Apache Lucene analyzer; composed of the standard tokenizer, lowercase filter, and stop filter.
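The sketch below shows how the properties documented above (name, maxTokenLength, stopwords, and the polymorphic discriminator) might be combined into a single analyzer definition. It assumes the @azure/search-documents SDK's LuceneStandardAnalyzer type; the analyzer name and stopword list are hypothetical, and the exact field names and discriminator value should be checked against your SDK version.

import type { LuceneStandardAnalyzer } from "@azure/search-documents";

// Minimal sketch of a standard analyzer definition (assumed SDK shape).
const analyzer: LuceneStandardAnalyzer = {
  // Polymorphic discriminator identifying the analyzer kind.
  odatatype: "#Microsoft.Azure.Search.StandardAnalyzer",
  // Letters, digits, spaces, dashes, or underscores only; must start and
  // end with an alphanumeric character; limited to 128 characters.
  name: "my-standard-analyzer",
  // Tokens longer than this are split. Default is 255; maximum allowed is 300.
  maxTokenLength: 255,
  // Hypothetical stopword list.
  stopwords: ["the", "and", "a"],
};

An analyzer defined this way would typically be registered in the analyzers collection of the index definition before being referenced by name from a field.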