ollama_dart library

Dart Client for the Ollama API (run Llama 3, Code Llama, and other models locally).
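A minimal usage sketch, assuming a local Ollama server running on the default port and that the model named here (mistral:latest, purely illustrative) has already been pulled:

```dart
import 'package:ollama_dart/ollama_dart.dart';

Future<void> main() async {
  // Defaults to the local Ollama endpoint; pass baseUrl to point elsewhere.
  final client = OllamaClient();

  final generated = await client.generateCompletion(
    request: GenerateCompletionRequest(
      model: 'mistral:latest', // illustrative; any locally pulled model works
      prompt: 'Why is the sky blue?',
    ),
  );
  print(generated.response);
}
```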

Classes

CopyModelRequest
Request class for copying a model.
CreateModelRequest
Request class for creating a model.
CreateModelResponse
Response class for creating a model. When finished, the status is success.
DeleteModelRequest
Request class for deleting a model.
GenerateChatCompletionRequest
Request class for the chat endpoint (see the chat example after this list).
GenerateChatCompletionResponse
The response class for the chat endpoint.
GenerateCompletionRequest
Request class for the generate endpoint.
GenerateCompletionResponse
The response class for the generate endpoint.
GenerateEmbeddingRequest
Request class for generating embeddings from a model (see the embeddings example after this list).
GenerateEmbeddingResponse
Response class containing the generated embedding.
Message
A message in the chat endpoint.
Model
A model available locally.
ModelDetails
Details about a model.
ModelInfo
Details about a model including modelfile, template, parameters, license, and system prompt.
ModelInfoRequest
Request class for the show model info endpoint.
ModelsResponse
Response class for the list models endpoint.
OllamaClient
Client for Ollama API.
PullModelRequest
Request class for pulling a model.
PullModelResponse
Response class for pulling a model.
PushModelRequest
Request class for pushing a model.
PushModelResponse
Response class for pushing a model.
RequestOptions
Additional model parameters listed in the Modelfile documentation, such as temperature.
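
As a sketch of how the classes above fit together (a local Ollama server is assumed; model names and prompts are illustrative), the chat and embeddings endpoints can be called like this:

```dart
import 'package:ollama_dart/ollama_dart.dart';

Future<void> main() async {
  final client = OllamaClient();

  // Chat: send a conversation built from Message objects and read the reply.
  final chat = await client.generateChatCompletion(
    request: GenerateChatCompletionRequest(
      model: 'llama3', // illustrative; any locally pulled chat model
      messages: [
        Message(role: MessageRole.system, content: 'You are a helpful assistant.'),
        Message(role: MessageRole.user, content: 'Hello!'),
      ],
      options: RequestOptions(temperature: 0.0), // optional Modelfile-style parameters
    ),
  );
  print(chat.message);

  // Embeddings: turn a prompt into a vector of doubles.
  final embedded = await client.generateEmbedding(
    request: GenerateEmbeddingRequest(
      model: 'mistral:latest', // illustrative
      prompt: 'Here is a sentence to embed.',
    ),
  );
  print(embedded.embedding);
}
```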

Enums

CreateModelStatus
Status of creating the model.
DoneReason
Reason why the model is done generating a response.
MessageRole
The role of the message.
PullModelStatus
Status of pulling the model.
PushModelStatus
Status of pushing the model.
ResponseFormat
The format to return a response in. Currently the only accepted value is json.
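
For example, ResponseFormat.json can be set on a generate request to ask the model for JSON output. A sketch under the same assumptions as above (local server, illustrative model name):

```dart
import 'package:ollama_dart/ollama_dart.dart';

Future<void> main() async {
  final client = OllamaClient();

  final res = await client.generateCompletion(
    request: GenerateCompletionRequest(
      model: 'mistral:latest', // illustrative
      prompt: 'List three primary colors as a JSON array of strings.',
      format: ResponseFormat.json, // constrain the response to JSON
    ),
  );
  print(res.response);
}
```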

Exceptions / Errors

OllamaClientException
HTTP exception thrown when a request to the Ollama API fails.
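
A sketch of handling a failed request (for example, a model that is not available locally); the exception is printed as-is, since its exact fields are not shown here:

```dart
import 'package:ollama_dart/ollama_dart.dart';

Future<void> main() async {
  final client = OllamaClient();
  try {
    await client.generateCompletion(
      request: GenerateCompletionRequest(
        model: 'model-that-does-not-exist', // deliberately invalid
        prompt: 'Hello',
      ),
    );
  } on OllamaClientException catch (e) {
    // Thrown when the Ollama API returns an HTTP error.
    print(e);
  }
}
```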