ChatTemplate class
Defines how to format chat messages for a specific model.
Each LLM family expects a different prompt structure. A ChatTemplate encodes the special tokens / delimiters that surround system, user, and assistant messages so the model generates coherent multi-turn replies.
The {text} placeholder in each template string is replaced with the
actual message content at formatting time.
For example, a Zephyr-style template:
const zephyr = ChatTemplate(
system: '<|system|>\n{text}</s>\n',
user: '<|user|>\n{text}</s>\n',
assistant: '<|assistant|>\n{text}</s>\n',
);
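The placeholder mechanism itself is simple string substitution. A minimal sketch of what wrapping one message with a template looks like (the `wrap` helper below is illustrative, not part of this package's API):

```dart
// Illustrative sketch: substitute the {text} placeholder in a
// template string with the actual message content.
String wrap(String template, String text) =>
    template.replaceAll('{text}', text);

void main() {
  const userTemplate = '<|user|>\n{text}</s>\n';
  print(wrap(userTemplate, 'Ahoy!'));
  // prints:
  // <|user|>
  // Ahoy!</s>
}
```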
Constructors
- ChatTemplate({required String system, required String user, required String assistant, String systemDefault = 'You are a helpful assistant.'})
  Creates a ChatTemplate with the given format strings.
  const
Properties
- assistant → String
  Template wrapping each assistant message. Must contain {text}.
  final
- hashCode → int
  The hash code for this object.
  no setter, inherited
- runtimeType → Type
  A representation of the runtime type of the object.
  no setter, inherited
- system → String
  Template wrapping the system prompt. Must contain {text}.
  final
- systemDefault → String
  Fallback system prompt used when none is provided by the caller.
  final
- user → String
  Template wrapping each user message. Must contain {text}.
  final
Methods
- format({String? systemPrompt, required List<({String role, String text})> messages}) → String
  Renders a full prompt from a list of conversation messages.
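This page doesn't spell out format's rendering rules, so the following is only a sketch under stated assumptions: that the system template is rendered first (falling back to systemDefault when systemPrompt is null) and that each message is wrapped by its role's template in order. The MiniTemplate class is a hypothetical stand-in, not the package's implementation.

```dart
// A sketch of how a format() method could render a prompt.
// Assumptions (not confirmed by the docs): system prompt comes first,
// messages use role names 'user' and 'assistant'.
class MiniTemplate {
  final String system, user, assistant, systemDefault;
  const MiniTemplate({
    required this.system,
    required this.user,
    required this.assistant,
    this.systemDefault = 'You are a helpful assistant.',
  });

  String format({
    String? systemPrompt,
    required List<({String role, String text})> messages,
  }) {
    // Render the system prompt, falling back to systemDefault.
    final buf = StringBuffer(
        system.replaceAll('{text}', systemPrompt ?? systemDefault));
    // Wrap each message in its role's template, in order.
    for (final m in messages) {
      final tpl = m.role == 'assistant' ? assistant : user;
      buf.write(tpl.replaceAll('{text}', m.text));
    }
    return buf.toString();
  }
}

void main() {
  const zephyr = MiniTemplate(
    system: '<|system|>\n{text}</s>\n',
    user: '<|user|>\n{text}</s>\n',
    assistant: '<|assistant|>\n{text}</s>\n',
  );
  print(zephyr.format(messages: [(role: 'user', text: 'Ahoy!')]));
  // prints:
  // <|system|>
  // You are a helpful assistant.</s>
  // <|user|>
  // Ahoy!</s>
}
```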
- noSuchMethod(Invocation invocation) → dynamic
  Invoked when a nonexistent method or property is accessed.
  inherited
- toString() → String
  A string representation of this object.
  inherited
Operators
- operator ==(Object other) → bool
  The equality operator.
  inherited