Support specifying batch_size when submitting batch predict jobs #525
Labels: api: vertex-ai, priority: p1, type: feature request
The BatchPredictionJob REST API supports configuring the number of records sent to each machine replica per batch via ManualBatchTuningParameters.
While ManualBatchTuningParameters is generated in this library, I don't see any way to specify it when submitting a batch predict job.
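For reference, the underlying GAPIC types already expose the field. A minimal sketch of constructing the proto directly, assuming the `google.cloud.aiplatform_v1` module paths (the release in use may expose them under `v1beta1` instead):

```python
from google.cloud.aiplatform_v1 import types as gca_types

# ManualBatchTuningParameters is already generated in the GAPIC layer;
# batch_size controls how many records are sent to each replica per batch.
tuning = gca_types.ManualBatchTuningParameters(batch_size=64)

# The BatchPredictionJob proto accepts it directly, but the high-level SDK
# currently offers no way to pass it through.
gapic_job = gca_types.BatchPredictionJob(
    display_name="my-batch-predict",
    manual_batch_tuning_parameters=tuning,
)
```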
Describe the solution you'd like
Add a batch_size kwarg to models.Model.batch_predict and jobs.BatchPredictionJob.create, and plumb it through to the gapic_batch_prediction_job instance.
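A hypothetical usage sketch of the requested kwarg on Model.batch_predict. Note that batch_size does not exist in the SDK yet; the idea is that it would simply map to ManualBatchTuningParameters.batch_size on the underlying job:

```python
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")

model = aiplatform.Model(
    "projects/my-project/locations/us-central1/models/1234567890"
)

# Hypothetical: batch_size is the kwarg requested in this issue and is not
# part of the current SDK surface.
job = model.batch_predict(
    job_display_name="my-batch-predict",
    gcs_source="gs://my-bucket/input.jsonl",
    gcs_destination_prefix="gs://my-bucket/output/",
    machine_type="n1-standard-4",
    batch_size=64,  # records sent to each replica per batch
)
```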