Support specifying batch_size when submitting batch predict jobs #525

@steve-marmalade

Description

The BatchPredictionJob REST API supports configuring the number of records sent to each machine replica per batch via ManualBatchTuningParameters.

While ManualBatchTuningParameters is implemented in this library, there is no way to specify it when submitting a batch predict job.
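For reference, this is roughly what the REST request body looks like when the tuning parameter is set (a minimal sketch; the field names follow the v1 BatchPredictionJob resource, and the value 64 is illustrative):

```json
{
  "displayName": "my-batch-job",
  "manualBatchTuningParameters": {
    "batchSize": 64
  }
}
```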

Describe the solution you'd like

Add a batch_size kwarg to models.Model.batch_predict and jobs.BatchPredictionJob.create, and set it on the gapic_batch_prediction_job instance.
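To make the proposal concrete, here is a minimal, self-contained sketch of how the kwarg could be threaded through to the gapic job proto. The dataclasses below are stand-ins for the real google.cloud.aiplatform_v1 types (ManualBatchTuningParameters does have a batch_size field; the helper name build_batch_prediction_job is hypothetical and only for illustration):

```python
from dataclasses import dataclass
from typing import Optional

# Stand-in for aiplatform_v1.types.ManualBatchTuningParameters.
@dataclass
class ManualBatchTuningParameters:
    batch_size: int = 0  # 0 lets the service pick a default


# Stand-in for the gapic BatchPredictionJob resource.
@dataclass
class GapicBatchPredictionJob:
    display_name: str
    manual_batch_tuning_parameters: Optional[ManualBatchTuningParameters] = None


def build_batch_prediction_job(
    display_name: str, batch_size: Optional[int] = None
) -> GapicBatchPredictionJob:
    """Hypothetical helper showing how BatchPredictionJob.create could
    accept batch_size and map it onto the proto field."""
    job = GapicBatchPredictionJob(display_name=display_name)
    if batch_size is not None:
        job.manual_batch_tuning_parameters = ManualBatchTuningParameters(
            batch_size=batch_size
        )
    return job


job = build_batch_prediction_job("my-job", batch_size=64)
print(job.manual_batch_tuning_parameters.batch_size)  # 64
```

The same default-to-None pattern would let existing callers that never pass batch_size keep their current behavior, since the proto field is simply left unset.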

Metadata

Labels

api: vertex-ai — Issues related to the googleapis/python-aiplatform API.

priority: p1 — Important issue which blocks shipping the next release. Will be fixed prior to next release.

type: feature request — 'Nice-to-have' improvement, new feature or different behavior or design.
