JsonTokenizer typedef

JsonTokenizer = Future<List<Token>> Function(Map<String, dynamic> document, {NGramRange? nGramRange, Iterable<String>? zones, TokenFilter? tokenFilter})

Type definition of a function that asynchronously returns a collection of Token objects extracted from the zones in a JSON document.

Extracts tokens from the zones in a JSON document for use in full-text search queries and indexes.

  • document is a JSON document containing the zones as keys;
  • nGramRange is the range of N-gram lengths to generate. If nGramRange is null, only keyword phrases are generated;
  • tokenFilter is a filter function that returns a subset of a collection of Tokens; and
  • zones is the collection of names of the zones in document that are to be tokenized.

Returns a Future that completes with a List<Token>.
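
Example

A minimal sketch of a function matching this signature, using simplified stand-in types for Token, NGramRange and TokenFilter (hypothetical placeholders for illustration only; in practice the package's own classes are used). The sketch splits the selected zones on whitespace and lower-cases the terms; N-gram generation and most analysis steps are omitted for brevity.

// Hypothetical stand-ins for the package's Token, NGramRange and
// TokenFilter types, so the sketch is self-contained.
class Token {
  final String term;
  final int position;
  final String? zone;
  const Token(this.term, this.position, [this.zone]);

  @override
  String toString() => 'Token($term, $position, ${zone ?? '-'})';
}

class NGramRange {
  final int min;
  final int max;
  const NGramRange(this.min, this.max);
}

typedef TokenFilter = Future<List<Token>> Function(List<Token> tokens);

typedef JsonTokenizer = Future<List<Token>> Function(
    Map<String, dynamic> document,
    {NGramRange? nGramRange,
    Iterable<String>? zones,
    TokenFilter? tokenFilter});

/// A naive tokenizer that conforms to [JsonTokenizer]: splits the
/// selected zones on whitespace and lower-cases each term.
/// [nGramRange] is ignored in this sketch (no n-grams are generated).
Future<List<Token>> simpleJsonTokenizer(Map<String, dynamic> document,
    {NGramRange? nGramRange,
    Iterable<String>? zones,
    TokenFilter? tokenFilter}) async {
  final tokens = <Token>[];
  // Tokenize only the requested zones, or every key if none are given.
  final fieldNames = zones ?? document.keys;
  for (final zone in fieldNames) {
    final value = document[zone];
    if (value is! String) continue;
    var position = 0;
    for (final term in value.split(RegExp(r'\s+'))) {
      if (term.isEmpty) continue;
      tokens.add(Token(term.toLowerCase(), position++, zone));
    }
  }
  // Apply the optional filter to the extracted tokens.
  return tokenFilter == null ? tokens : await tokenFilter(tokens);
}

void main() async {
  // The function tear-off is assignable to the typedef.
  final JsonTokenizer tokenizer = simpleJsonTokenizer;
  final doc = {'title': 'Hello World', 'body': 'Full text search in Dart'};
  final tokens = await tokenizer(doc, zones: ['title', 'body']);
  print(tokens);
}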

Implementation

typedef JsonTokenizer = Future<List<Token>> Function(
    Map<String, dynamic> document,
    {NGramRange? nGramRange,
    Iterable<String>? zones,
    TokenFilter? tokenFilter});