Interface AnalyzeRequest

Specifies some text and analysis components used to break that text into tokens.
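
The sketch below shows one way a request of this shape is typically used: it is passed to SearchIndexClient.analyzeText from @azure/search-documents to see how an analyzer breaks a sample string into tokens. The endpoint, API key, and index name are placeholders, and the exact options type accepted by the client may be named differently (for example AnalyzeTextOptions) depending on the package version, so treat this as an illustrative sketch rather than a canonical usage.

```typescript
import { SearchIndexClient, AzureKeyCredential } from "@azure/search-documents";

// Placeholder endpoint, key, and index name -- substitute your own values.
const client = new SearchIndexClient(
  "https://<service-name>.search.windows.net",
  new AzureKeyCredential("<admin-api-key>")
);

async function main(): Promise<void> {
  // Analyzer-based request: the text plus the name of a known analyzer.
  const result = await client.analyzeText("<index-name>", {
    text: "The quick brown fox jumped over the lazy dog",
    analyzerName: "standard.lucene", // a KnownAnalyzerNames value
  });

  // Each returned token carries its text, offsets, and position.
  for (const tokenInfo of result.tokens) {
    console.log(tokenInfo.token, tokenInfo.startOffset, tokenInfo.endOffset);
  }
}

main().catch(console.error);
```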

Hierarchy

  • AnalyzeRequest

Index

Properties

Optional analyzer

analyzer: undefined | string

The name of the analyzer to use to break the given text. KnownAnalyzerNames is an enum containing known values.

Optional analyzerName

analyzerName: undefined | string

The name of the analyzer to use to break the given text. If this parameter is not specified, you must specify a tokenizer instead. The tokenizer and analyzer parameters are mutually exclusive. KnownAnalyzerNames is an enum containing known values. NOTE: Either analyzerName or tokenizerName is required in an AnalyzeRequest.

Optional charFilters

charFilters: string[]

An optional list of character filters to use when breaking the given text. This parameter can only be set when using the tokenizer parameter.

Optional normalizer

normalizer: LexicalNormalizerName

The name of the normalizer to use to normalize the given text.

Optional normalizerName

normalizerName: LexicalNormalizerName

The name of the normalizer to use to normalize the given text.
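
As a hedged illustration of the normalizer path, the request below supplies only text and normalizerName, asking the service to normalize the string (for example, lowercasing it) rather than tokenize it. "lowercase" is one of the known lexical normalizer names; the field names mirror this interface rather than any particular client overload.

```typescript
// Normalizer-based request: the text is normalized (here, lowercased)
// instead of being broken into tokens by an analyzer or tokenizer.
const normalizerRequest = {
  text: "Côte d'Azur HOTELS",
  normalizerName: "lowercase", // a known LexicalNormalizerName value
};
```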

text

text: string

The text to break into tokens.

Optional tokenFilters

tokenFilters: string[]

An optional list of token filters to use when breaking the given text. This parameter can only be set when using the tokenizer parameter.

Optional tokenizer

tokenizer: undefined | string

The name of the tokenizer to use to break the given text. KnownTokenizerNames is an enum containing known values.

Optional tokenizerName

tokenizerName: undefined | string

The name of the tokenizer to use to break the given text. If this parameter is not specified, you must specify an analyzer instead. The tokenizer and analyzer parameters are mutually exclusive. KnownTokenizerNames is an enum containing known values. NOTE: Either analyzerName or tokenizerName is required in an AnalyzeRequest.
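
To make the mutual-exclusivity rule concrete, the sketch below takes the tokenizer path: tokenizerName is set, analyzerName is omitted, and tokenFilters and charFilters (which are only valid alongside a tokenizer) pre-process and post-process the token stream. The specific names are known service values, but treat the combination as illustrative rather than prescriptive; the object could be passed to the same analyzeText call shown earlier.

```typescript
// Tokenizer-based request: charFilters run before tokenization,
// tokenFilters run after it; neither may be combined with an analyzer.
const tokenizerRequest = {
  text: "<p>Café au lait</p>",
  tokenizerName: "standard_v2",                // a KnownTokenizerNames value
  charFilters: ["html_strip"],                 // strip the HTML markup first
  tokenFilters: ["lowercase", "asciifolding"], // then lowercase and fold accents
};
```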
