Interface EdgeNGramTokenizer

Tokenizes the input from an edge into n-grams of the given size(s). This tokenizer is implemented using Apache Lucene.
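For illustration only, here is a minimal TypeScript sketch of what edge n-gram tokenization produces for a single term. The actual tokenization is performed server-side by Apache Lucene; this sketch merely mirrors the effect of the minGram and maxGram settings documented below.

```typescript
// Conceptual sketch only: the real tokenizer runs server-side in Apache Lucene.
// Edge n-grams are the prefixes of the term whose lengths fall between
// minGram and maxGram (inclusive).
function edgeNGrams(term: string, minGram: number, maxGram: number): string[] {
  const grams: string[] = [];
  for (let length = minGram; length <= Math.min(maxGram, term.length); length++) {
    grams.push(term.slice(0, length)); // always anchored at the leading edge
  }
  return grams;
}

// With the default settings (minGram: 1, maxGram: 2), "flower" yields ["f", "fl"].
console.log(edgeNGrams("flower", 1, 2));
```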

Hierarchy

  • EdgeNGramTokenizer

Properties

Optional maxGram

maxGram: undefined | number

The maximum n-gram length. Maximum is 300. Default value: 2.

Optional minGram

minGram: undefined | number

The minimum n-gram length. Maximum is 300. Must be less than the value of maxGram. Default value: 1.

name

name: string

The name of the tokenizer. It must only contain letters, digits, spaces, dashes or underscores, can only start and end with alphanumeric characters, and is limited to 128 characters.

odatatype

odatatype: "#Microsoft.Azure.Search.EdgeNGramTokenizer"

Polymorphic Discriminator

Optional tokenChars

tokenChars: TokenCharacterKind[]

Character classes to keep in the tokens.
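As a usage sketch, the object below wires the properties above into a custom tokenizer definition. It assumes this interface is exported by the @azure/search-documents package and that a SearchIndex definition accepts custom tokenizers through its tokenizers array; treat the index wiring as an assumption rather than a verified recipe.

```typescript
import { EdgeNGramTokenizer, SearchIndex } from "@azure/search-documents";

// Custom tokenizer definition using the properties documented above.
const prefixTokenizer: EdgeNGramTokenizer = {
  odatatype: "#Microsoft.Azure.Search.EdgeNGramTokenizer",
  name: "prefix-edge-ngram-tokenizer", // letters, digits, dashes; starts/ends alphanumeric; <= 128 chars
  minGram: 2,                          // must be less than maxGram
  maxGram: 10,                         // maximum allowed value is 300
  tokenChars: ["letter", "digit"],     // keep only letters and digits in tokens
};

// Assumed wiring: the tokenizer is referenced from an index definition.
// Fields and the custom analyzer that uses the tokenizer are omitted here.
const index: SearchIndex = {
  name: "my-index",
  fields: [],
  tokenizers: [prefixTokenizer],
};
```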
