createBatchInferenceJob method
- required BatchInferenceJobInput jobInput,
- required String jobName,
- required BatchInferenceJobOutput jobOutput,
- required String roleArn,
- required String solutionVersionArn,
- BatchInferenceJobConfig? batchInferenceJobConfig,
- String? filterArn,
- int? numResults,
Creates a batch inference job. The operation can handle up to 50 million records, and the input file must be in JSON format. For more information, see recommendations-batch in the Amazon Personalize Developer Guide.
May throw InvalidInputException. May throw ResourceAlreadyExistsException. May throw LimitExceededException. May throw ResourceNotFoundException. May throw ResourceInUseException.
Parameter jobInput:
The Amazon S3 path that leads to the input file to base your
recommendations on. The input material must be in JSON format.
Parameter jobName:
The name of the batch inference job to create.
Parameter jobOutput:
The path to the Amazon S3 bucket where the job's output will be stored.
Parameter roleArn:
The ARN of the AWS Identity and Access Management (IAM) role that has
permissions to read and write to your input and output Amazon S3 buckets,
respectively.
Parameter solutionVersionArn:
The Amazon Resource Name (ARN) of the solution version that will be used
to generate the batch inference recommendations.
Parameter batchInferenceJobConfig:
The configuration details of a batch inference job.
Parameter filterArn:
The ARN of the filter to apply to the batch inference job. For more
information on using filters, see Using Filters with Amazon Personalize.
Parameter numResults:
The number of recommendations to retrieve.
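The sketch below shows one way this method might be called. It is only illustrative: the personalize client variable, the ARNs, and the S3 paths are placeholders, and the s3DataSource / s3DataDestination fields (of an S3DataConfig-style class) are an assumption based on the shape of the underlying Amazon Personalize API; check the package's generated model classes for the exact constructors.

// Hypothetical usage; `personalize` is an already-constructed service client
// from this package, and every ARN and S3 path below is a placeholder.
final response = await personalize.createBatchInferenceJob(
  jobName: 'movie-recs-batch',
  solutionVersionArn:
      'arn:aws:personalize:us-east-1:123456789012:solution/movie-recs/1',
  roleArn: 'arn:aws:iam::123456789012:role/PersonalizeBatchRole',
  // Assumed model shape: the S3 locations are wrapped in S3DataConfig-style
  // objects, mirroring the AWS API's s3DataSource / s3DataDestination fields.
  jobInput: BatchInferenceJobInput(
    s3DataSource: S3DataConfig(path: 's3://my-bucket/batch/users.json'),
  ),
  jobOutput: BatchInferenceJobOutput(
    s3DataDestination: S3DataConfig(path: 's3://my-bucket/batch/output/'),
  ),
  numResults: 25,
);
print(response.batchInferenceJobArn);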
Implementation
Future<CreateBatchInferenceJobResponse> createBatchInferenceJob({
required BatchInferenceJobInput jobInput,
required String jobName,
required BatchInferenceJobOutput jobOutput,
required String roleArn,
required String solutionVersionArn,
BatchInferenceJobConfig? batchInferenceJobConfig,
String? filterArn,
int? numResults,
}) async {
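    // Client-side checks of the length constraints published for the
    // Amazon Personalize API (jobName: 1-63 chars, ARNs: up to 256 chars).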
ArgumentError.checkNotNull(jobInput, 'jobInput');
ArgumentError.checkNotNull(jobName, 'jobName');
_s.validateStringLength(
'jobName',
jobName,
1,
63,
isRequired: true,
);
ArgumentError.checkNotNull(jobOutput, 'jobOutput');
ArgumentError.checkNotNull(roleArn, 'roleArn');
_s.validateStringLength(
'roleArn',
roleArn,
0,
256,
isRequired: true,
);
ArgumentError.checkNotNull(solutionVersionArn, 'solutionVersionArn');
_s.validateStringLength(
'solutionVersionArn',
solutionVersionArn,
0,
256,
isRequired: true,
);
_s.validateStringLength(
'filterArn',
filterArn,
0,
256,
);
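    // Amazon Personalize uses the AWS JSON 1.1 protocol; the X-Amz-Target
    // header selects the CreateBatchInferenceJob operation.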
final headers = <String, String>{
'Content-Type': 'application/x-amz-json-1.1',
'X-Amz-Target': 'AmazonPersonalize.CreateBatchInferenceJob'
};
final jsonResponse = await _protocol.send(
method: 'POST',
requestUri: '/',
exceptionFnMap: _exceptionFns,
// TODO queryParams
headers: headers,
payload: {
'jobInput': jobInput,
'jobName': jobName,
'jobOutput': jobOutput,
'roleArn': roleArn,
'solutionVersionArn': solutionVersionArn,
if (batchInferenceJobConfig != null)
'batchInferenceJobConfig': batchInferenceJobConfig,
if (filterArn != null) 'filterArn': filterArn,
if (numResults != null) 'numResults': numResults,
},
);
return CreateBatchInferenceJobResponse.fromJson(jsonResponse.body);
}