OllamaEmbeddings class
Wrapper around the Ollama Embeddings API.
Ollama allows you to run open-source large language models, such as Llama 3, locally.
For a complete list of supported models and model variants, see the Ollama model library.
Example:
final embeddings = OllamaEmbeddings(model: 'llama3.2');
final res = await embeddings.embedQuery('Hello world');
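You can also embed multiple documents in one call with embedDocuments (a minimal sketch, assuming the Document class from langchain_core; the texts are placeholders):
final embeddings = OllamaEmbeddings(model: 'llama3.2');
final docs = [
  Document(pageContent: 'Hello world'),
  Document(pageContent: 'Goodbye world'),
];
final vectors = await embeddings.embedDocuments(docs);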
Setup
- Download and install Ollama
- Fetch a model via ollama pull <model family>, e.g., for Llama 3.2: ollama pull llama3.2
Advanced
Custom HTTP client
You can always provide your own implementation of http.Client for further customization:
final embeddings = OllamaEmbeddings(
client: MyHttpClient(),
);
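For instance, MyHttpClient above could be an http.BaseClient subclass that adds a header to every request (a minimal sketch; the class name and header are illustrative, not part of the package):
import 'package:http/http.dart' as http;

class MyHttpClient extends http.BaseClient {
  MyHttpClient(this._inner);

  final http.Client _inner;

  @override
  Future<http.StreamedResponse> send(http.BaseRequest request) {
    // Attach a custom header to every outgoing request before delegating.
    request.headers['x-my-custom-header'] = 'value';
    return _inner.send(request);
  }
}

final embeddings = OllamaEmbeddings(
  client: MyHttpClient(http.Client()),
);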
Using a proxy
HTTP proxy
You can use your own HTTP proxy by overriding the baseUrl and providing your required headers:
final embeddings = OllamaEmbeddings(
baseUrl: 'https://my-proxy.com',
headers: {'x-my-proxy-header': 'value'},
queryParams: {'x-my-proxy-query-param': 'value'},
);
If you need further customization, you can always provide your own http.Client.
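For example, you can build a client that routes traffic through an HTTP proxy using dart:io's HttpClient and wrap it in an IOClient from package:http (a minimal sketch; the proxy address is a placeholder):
import 'dart:io';
import 'package:http/io_client.dart';

final ioClient = HttpClient()
  ..findProxy = (uri) => 'PROXY my-proxy.com:8080'; // placeholder proxy address

final embeddings = OllamaEmbeddings(
  client: IOClient(ioClient),
);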
SOCKS5 proxy
To use a SOCKS5 proxy, you can use the socks5_proxy package and a custom http.Client.
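A minimal sketch, assuming the socks5_proxy package's SocksTCPClient API and a SOCKS5 proxy listening on localhost:1080:
import 'dart:io';
import 'package:http/io_client.dart';
import 'package:socks5_proxy/socks_client.dart';

final baseHttpClient = HttpClient();
// Route all connections of this HttpClient through the SOCKS5 proxy.
SocksTCPClient.assignToHttpClient(baseHttpClient, [
  ProxySettings(InternetAddress.loopbackIPv4, 1080),
]);

final embeddings = OllamaEmbeddings(
  client: IOClient(baseHttpClient),
);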
Constructors
- OllamaEmbeddings({String model = 'llama3.2', int? keepAlive, String baseUrl = 'http://localhost:11434/api', Map<String, String>? headers, Map<String, dynamic>? queryParams, Client? client}) - Create a new OllamaEmbeddings instance.
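For example, to target a remote Ollama server behind an authenticated proxy and release the underlying HTTP client when you are done (a minimal sketch; the base URL and header are placeholders):
final embeddings = OllamaEmbeddings(
  model: 'llama3.2',
  baseUrl: 'http://my-ollama-host:11434/api', // placeholder remote server
  headers: {'Authorization': 'Bearer my-token'}, // placeholder auth header
);
final res = await embeddings.embedQuery('Hello world');
embeddings.close(); // closes the client and cleans up resources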
Methods
- close() → void - Closes the client and cleans up any resources associated with it.
- embedDocuments(List<Document> documents) → Future<List<List<double>>> - Embed search docs.
- embedQuery(String query) → Future<List<double>> - Embed query text.
- noSuchMethod(Invocation invocation) → dynamic - Invoked when a nonexistent method or property is accessed. (inherited)
- toString() → String - A string representation of this object. (inherited)
Operators
- operator ==(Object other) → bool - The equality operator. (inherited)