Tokenizer typedef

Tokenizer = Future<List<Token>> Function(String source, {NGramRange? nGramRange, String? zone, TokenFilter? tokenFilter})

Type definition of a function that returns a collection of Tokens extracted from the source text.

Extracts one or more tokens from source for use in full-text search queries and indexes.

  • source is a String that will be tokenized;
  • nGramRange is the range of N-gram lengths to generate. If nGramRange is null, only keyword phrases are generated;
  • tokenFilter is a filter function that returns a subset of a collection of Tokens; and
  • zone is the name of the zone to be appended to the Tokens.

Returns a Future<List<Token>>.
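A minimal sketch of a function matching the Tokenizer signature, assuming simplified stand-ins for the package's Token and TokenFilter types (the real types carry more state, and NGramRange handling is omitted here for brevity):

```dart
import 'dart:async';

/// Hypothetical minimal stand-in for the package's Token type,
/// included only to make the sketch self-contained.
class Token {
  final String term;
  final int termPosition;
  final String? zone;
  Token(this.term, this.termPosition, [this.zone]);
}

/// Stand-in for the package's TokenFilter typedef.
typedef TokenFilter = Future<List<Token>> Function(List<Token> tokens);

/// A Tokenizer-shaped function: lower-cases [source], splits it on
/// whitespace, tags each Token with [zone] if provided, and applies
/// [tokenFilter] to the result when one is passed in.
Future<List<Token>> simpleTokenizer(String source,
    {String? zone, TokenFilter? tokenFilter}) async {
  var position = 0;
  var tokens = source
      .toLowerCase()
      .split(RegExp(r'\s+'))
      .where((term) => term.isNotEmpty)
      .map((term) => Token(term, position++, zone))
      .toList();
  if (tokenFilter != null) {
    tokens = await tokenFilter(tokens);
  }
  return tokens;
}

void main() async {
  final tokens = await simpleTokenizer('The quick Fox', zone: 'title');
  print(tokens.map((t) => t.term).toList()); // [the, quick, fox]
}
```

Because the parameters are named and optional, callers can supply any subset, e.g. `simpleTokenizer(text, zone: 'body')`.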

Implementation

typedef Tokenizer = Future<List<Token>> Function(String source,
    {NGramRange? nGramRange, String? zone, TokenFilter? tokenFilter});