sendMessageStream method

Continues the chat with a new message.

Sends message to the model as a continuation of the chat history and reads the response in a stream.
Prepends the history to the request and uses the provided model to generate new content.

When there are no candidates in any response in the stream, the message and responses are ignored and will not be recorded in the history.
Waits for any ongoing or pending requests to sendMessage or sendMessageStream to complete before generating new content. Successful messages and responses for ongoing or pending requests will be reflected in the history sent for this message.
Waits to read the entire streamed response before recording the message and response in the history and before allowing pending messages to be sent.
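For context, a minimal caller might look like the following sketch. The model name and the API-key environment variable are placeholder assumptions; `GenerativeModel`, `startChat`, and `Content.text` are the package:google_generative_ai entry points that lead to this method.

```dart
import 'dart:io';

import 'package:google_generative_ai/google_generative_ai.dart';

Future<void> main() async {
  // Placeholder model name and API-key source.
  final model = GenerativeModel(
    model: 'gemini-1.5-flash',
    apiKey: Platform.environment['API_KEY']!,
  );
  final chat = model.startChat();

  // Each chunk in the stream is a GenerateContentResponse;
  // print the partial text as it arrives.
  final responses = chat.sendMessageStream(Content.text('Tell me a story.'));
  await for (final response in responses) {
    stdout.write(response.text ?? '');
  }
}
```

Because the session serializes requests, a second `sendMessage` or `sendMessageStream` call made while this stream is being read will wait until the stream is fully consumed.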
Implementation
Stream<GenerateContentResponse> sendMessageStream(Content message) {
  final controller = StreamController<GenerateContentResponse>(sync: true);
  // Serialize with other sendMessage/sendMessageStream calls.
  _mutex.acquire().then((lock) async {
    try {
      final responses = _generateContentStream(_history.followedBy([message]),
          safetySettings: _safetySettings,
          generationConfig: _generationConfig);
      final content = <Content>[];
      await for (final response in responses) {
        // Collect the first candidate's content from each chunk.
        if (response.candidates case [final candidate, ...]) {
          content.add(candidate.content);
        }
        controller.add(response);
      }
      // Record the exchange only if some response had candidates.
      if (content.isNotEmpty) {
        _history.add(message);
        _history.add(_aggregate(content));
      }
    } catch (e, s) {
      controller.addError(e, s);
    }
    lock.release();
    unawaited(controller.close());
  });
  return controller.stream;
}
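The `_aggregate` step above merges the collected per-chunk contents into the single history entry for the model's turn. As an illustration only, a self-contained sketch of that idea follows; the `Part`, `TextPart`, and `Content` classes here are minimal stand-ins for the package's types, and the package's actual `_aggregate` may differ.

```dart
// Minimal stand-ins for package:google_generative_ai types, so this
// sketch is self-contained; the real classes carry more structure.
class Part {}

class TextPart extends Part {
  TextPart(this.text);
  final String text;
}

class Content {
  Content(this.role, this.parts);
  final String? role;
  final List<Part> parts;
}

/// Hypothetical aggregation: merge streamed chunks into one model
/// `Content`, folding consecutive text parts into a single part.
Content aggregate(List<Content> chunks) {
  final parts = <Part>[];
  for (final content in chunks) {
    for (final part in content.parts) {
      final last = parts.isNotEmpty ? parts.last : null;
      if (part is TextPart && last is TextPart) {
        // Concatenate adjacent text parts into one.
        parts.removeLast();
        parts.add(TextPart(last.text + part.text));
      } else {
        parts.add(part);
      }
    }
  }
  return Content('model', parts);
}

void main() {
  final merged = aggregate([
    Content('model', [TextPart('Once upon')]),
    Content('model', [TextPart(' a time.')]),
  ]);
  print((merged.parts.single as TextPart).text); // Once upon a time.
}
```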