ChatPromptTemplate class final
A prompt template for chat models.
Whereas LLMs take a string as prompt, chat models take a list of messages. ChatPromptTemplate uses a list of template messages to generate the final prompt.
Each template message can be:
- SystemChatMessagePromptTemplate (for system messages)
- HumanChatMessagePromptTemplate (for human messages)
- AIChatMessagePromptTemplate (for AI messages)
- CustomChatMessagePromptTemplate (for custom role messages)
- MessagePlaceholder (for a single message placeholder)
- MessagesPlaceholder (for a list of messages placeholder)
Example:
final promptTemplate = ChatPromptTemplate.fromPromptMessages([
ChatMessagePromptTemplate.system("Here's some context: {context}"),
ChatMessagePromptTemplate.human("Hello {foo}, I'm {bar}. Thanks for the {context}"),
ChatMessagePromptTemplate.ai("I'm an AI. I'm {foo}. I'm {bar}."),
]);
final prompt = promptTemplate.formatPrompt({
'foo': 'GPT-4',
'bar': 'Gemini',
'context': 'competition',
});
final res = await chatModel.invoke(prompt);
If your prompt template only contains one message, you can use the convenient factory constructor ChatPromptTemplate.fromTemplate.
final promptTemplate = ChatPromptTemplate.fromTemplate("Hello {foo}, I'm {bar}. Thanks for the {context}");
If your prompt template contains multiple messages, you can use the convenient factory constructor ChatPromptTemplate.fromTemplates.
final promptTemplate = ChatPromptTemplate.fromTemplates([
(ChatMessageType.system, 'You are a helpful assistant that translates {input_language} to {output_language}.'),
(ChatMessageType.human, '{text}'),
]);
If you need a placeholder for a single message or a list of messages, you can use MessagePlaceholder or MessagesPlaceholder.
final promptTemplate = ChatPromptTemplate.fromTemplates([
(ChatMessageType.system, "You are a helpful AI assistant."),
(ChatMessageType.messagesPlaceholder, 'history'),
(ChatMessageType.messagePlaceholder, 'input'),
]);
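At format time, the MessagesPlaceholder variable is expected to resolve to a list of chat messages, and the MessagePlaceholder variable to a single chat message. A minimal sketch of filling the placeholders above (assuming the ChatMessage.humanText and ChatMessage.ai factory constructors from langchain_dart):

```dart
// Fill 'history' with a list of prior messages and 'input' with the
// current message. The resulting list preserves the template order:
// system message, history messages, then the final human message.
final messages = promptTemplate.formatMessages({
  'history': [
    ChatMessage.humanText('What is LangChain?'),
    ChatMessage.ai('LangChain is a framework for building LLM apps.'),
  ],
  'input': ChatMessage.humanText('Does it support Dart?'),
});
```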
In general, prefer using ChatPromptTemplate.fromTemplate and ChatPromptTemplate.fromTemplates to create a ChatPromptTemplate as the resulting code is more readable. Use the main ChatPromptTemplate constructor or ChatPromptTemplate.fromPromptMessages for advanced use cases.
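Because ChatPromptTemplate is a Runnable, it can also be piped directly into a chat model instead of calling formatPrompt and the model separately. A hedged sketch, assuming promptTemplate is the translation template from above and chatModel is an already-constructed chat model instance:

```dart
// Compose the prompt template and the chat model into a single chain.
final chain = promptTemplate.pipe(chatModel);

// Invoking the chain formats the prompt and calls the model in one step.
final res = await chain.invoke({
  'input_language': 'English',
  'output_language': 'French',
  'text': 'I love programming.',
});
```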
- Inheritance
-
- Object
- Runnable<InputValues, BaseLangChainOptions, PromptValue>
- BasePromptTemplate
- BaseChatPromptTemplate
- ChatPromptTemplate
- Annotations
-
- @immutable
Constructors
-
ChatPromptTemplate({required Set<String> inputVariables, PartialValues? partialVariables, required List<ChatMessagePromptTemplate> promptMessages}) -
A prompt template for chat models.
const
-
ChatPromptTemplate.fromPromptMessages(List<ChatMessagePromptTemplate> promptMessages, {bool validateTemplate = true}) -
Creates a ChatPromptTemplate with a list of template messages.
factory
- ChatPromptTemplate.fromTemplate(String template, {ChatMessageType type = ChatMessageType.human, String? customRole, PartialValues? partialVariables, bool validateTemplate = true})
-
Creates a chat prompt template with a single message from a string
template.
factory
-
ChatPromptTemplate.fromTemplates(List<(ChatMessageType, String)> messages, {String? customRole, PartialValues? partialVariables, bool validateTemplate = true}) -
Creates a ChatPromptTemplate from a list of pairs of chat message prompt template type and template.
factory
Properties
- defaultOptions → BaseLangChainOptions
-
The default options to use when invoking the Runnable.
final, inherited
- hashCode → int
-
The hash code for this object.
no setter, override
- inputVariables → Set<String>
-
A set of the names of the variables the prompt template expects.
final, inherited
- partialVariables → PartialValues?
-
Partial variables.
final, inherited
- promptMessages → List<ChatMessagePromptTemplate>
-
The list of messages that make up the prompt template.
final
- runtimeType → Type
-
A representation of the runtime type of the object.
no setter, inherited
- type → String
-
The type of the prompt template.
no setter, override
Methods
-
batch(List<InputValues> inputs, {List<BaseLangChainOptions>? options}) → Future<List<PromptValue>>
-
Batches the invocation of the Runnable on the given inputs.
inherited
-
bind(BaseLangChainOptions options) → RunnableBinding<InputValues, BaseLangChainOptions, PromptValue>
-
Binds the Runnable to the given options.
inherited
-
close() → void
-
Cleans up any resources associated with the Runnable.
inherited
-
copyWith({Set<String>? inputVariables, PartialValues? partialVariables, List<ChatMessagePromptTemplate>? promptMessages}) → ChatPromptTemplate
-
Creates a copy of this ChatPromptTemplate with the given fields.
override
-
format(InputValues values) → String
-
Format the prompt given the input values and return a formatted string.
inherited
-
formatMessages([InputValues values = const {}]) → List<ChatMessage>
-
Format input values into a list of messages.
override
-
formatPrompt(InputValues values) → PromptValue
-
Format the prompt given the input values and return a formatted prompt
value.
inherited
-
getCompatibleOptions(RunnableOptions? options) → BaseLangChainOptions?
-
Returns the given options if they are compatible with the Runnable, otherwise returns null.
inherited
-
invoke(InputValues input, {BaseLangChainOptions? options}) → Future<PromptValue>
-
Format the prompt given the input values and return a formatted prompt
value.
inherited
-
mergePartialAndUserVariables(Map<String, dynamic> userVariables) → Map<String, Object>
-
Merge the partial variables with the user variables.
inherited
-
noSuchMethod(Invocation invocation) → dynamic
-
Invoked when a nonexistent method or property is accessed.
inherited
-
partial(PartialValues values) → BasePromptTemplate
-
Return a partial of the prompt template.
override
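For example, partial can pre-fill a subset of the input variables so that later calls only need to supply the rest. A sketch assuming the translation template from the class description above:

```dart
// Pre-fill 'input_language' and 'output_language'; the returned
// template then only expects 'text' from callers.
final translateToFrench = promptTemplate.partial({
  'input_language': 'English',
  'output_language': 'French',
});
final prompt = translateToFrench.formatPrompt({'text': 'Hello!'});
```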
-
pipe<NewRunOutput extends Object?, NewCallOptions extends RunnableOptions>(Runnable<PromptValue, NewCallOptions, NewRunOutput> next) → RunnableSequence<InputValues, NewRunOutput>
-
Pipes the output of this Runnable into another Runnable using a
RunnableSequence.
inherited
-
stream(InputValues input, {BaseLangChainOptions? options}) → Stream<PromptValue>
-
Streams the output of invoking the Runnable on the given input.
inherited
-
streamFromInputStream(Stream<InputValues> inputStream, {BaseLangChainOptions? options}) → Stream<PromptValue>
-
Streams the output of invoking the Runnable on the given inputStream.
inherited
-
toString() → String
-
A string representation of this object.
override
-
validateTemplate() → void
-
Validate the integrity of the prompt template, checking that all the
variables are present and that the right format is used.
override
-
withFallbacks(List<Runnable<InputValues, RunnableOptions, PromptValue>> fallbacks) → RunnableWithFallback<InputValues, PromptValue>
-
Adds fallback runnables to be invoked if the primary runnable fails.
inherited
-
withRetry({int maxRetries = 3, FutureOr<bool> retryIf(Object e)?, List<Duration?>? delayDurations, bool addJitter = false}) → RunnableRetry<InputValues, PromptValue>
-
Adds retry logic to an existing runnable.
inherited
Operators
-
operator ==(covariant ChatPromptTemplate other) → bool
-
The equality operator.
override
Static Methods
-
fromTemplateFile(String templateFile, {ChatMessageType type = ChatMessageType.human, String? customRole, PartialValues? partialVariables, bool validateTemplate = true}) → Future<ChatPromptTemplate>
-
Creates a ChatPromptTemplate with a single message from a file.