feat: Client libraries for the Dataform API #1221

Merged Jun 2, 2022 (28 commits). Changes shown are from 27 of the 28 commits.

Commits:
8a3f5c1
chore(ruby): Initial generation of google-iam-v1
gcf-owl-bot[bot] May 11, 2022
4adca58
🦉 Updates from OwlBot post-processor
gcf-owl-bot[bot] May 11, 2022
922ce56
feat: add display_name and metadata to ModelEvaluation in aiplatform …
gcf-owl-bot[bot] May 12, 2022
515588b
🦉 Updates from OwlBot post-processor
gcf-owl-bot[bot] May 12, 2022
d7081aa
feat: refreshes Bigtable Admin API(s) protos
gcf-owl-bot[bot] May 16, 2022
4677e77
🦉 Updates from OwlBot post-processor
gcf-owl-bot[bot] May 16, 2022
3b6d152
Synchronize new proto/yaml changes.
gcf-owl-bot[bot] May 16, 2022
9cfce6a
🦉 Updates from OwlBot post-processor
gcf-owl-bot[bot] May 16, 2022
c1dd35d
docs: fix docstring formatting
gcf-owl-bot[bot] May 18, 2022
49846e4
🦉 Updates from OwlBot post-processor
gcf-owl-bot[bot] May 18, 2022
196ce4d
feat: add Examples to Explanation related messages in aiplatform v1be…
gcf-owl-bot[bot] May 19, 2022
52a891c
🦉 Updates from OwlBot post-processor
gcf-owl-bot[bot] May 19, 2022
bb4211b
feat: update protos to include InvalidateApprovalRequest and GetAcces…
gcf-owl-bot[bot] May 19, 2022
1c67b7f
🦉 Updates from OwlBot post-processor
gcf-owl-bot[bot] May 19, 2022
93fd7d3
chore: remove unused imports
gcf-owl-bot[bot] May 23, 2022
306b82c
🦉 Updates from OwlBot post-processor
gcf-owl-bot[bot] May 23, 2022
da3f34b
feat: add latent_space_source to ExplanationMetadata in aiplatform v1…
gcf-owl-bot[bot] May 24, 2022
905fde5
🦉 Updates from OwlBot post-processor
gcf-owl-bot[bot] May 24, 2022
3f5e881
feat: add failure_policy to PipelineJob in aiplatform v1 & v1beta1 pi…
gcf-owl-bot[bot] May 24, 2022
40deccc
🦉 Updates from OwlBot post-processor
gcf-owl-bot[bot] May 24, 2022
659d399
feat: add IAM policy to aiplatform_v1beta1.yaml
gcf-owl-bot[bot] May 24, 2022
6c00143
🦉 Updates from OwlBot post-processor
gcf-owl-bot[bot] May 24, 2022
1e87acf
chore: use gapic-generator-python 1.0.0
gcf-owl-bot[bot] May 27, 2022
b5b67ed
🦉 Updates from OwlBot post-processor
gcf-owl-bot[bot] May 27, 2022
46ffd25
feat: Client libraries for the Dataform API
gcf-owl-bot[bot] May 30, 2022
e812666
🦉 Updates from OwlBot post-processor
gcf-owl-bot[bot] May 30, 2022
4b56808
Merge branch 'main' into owl-bot-copy
gcf-merge-on-green[bot] May 30, 2022
550977b
Merge branch 'main' into owl-bot-copy
dizcology Jun 1, 2022
7 changes: 7 additions & 0 deletions docs/definition_v1/types.rst
@@ -0,0 +1,7 @@
Types for Google Cloud Aiplatform V1 Schema Trainingjob Definition v1 API
=========================================================================

.. automodule:: google.cloud.aiplatform.v1.schema.trainingjob.definition_v1.types
:members:
:undoc-members:
:show-inheritance:
7 changes: 7 additions & 0 deletions docs/definition_v1beta1/types.rst
@@ -0,0 +1,7 @@
Types for Google Cloud Aiplatform V1beta1 Schema Trainingjob Definition v1beta1 API
===================================================================================

.. automodule:: google.cloud.aiplatform.v1beta1.schema.trainingjob.definition_v1beta1.types
:members:
:undoc-members:
:show-inheritance:
7 changes: 7 additions & 0 deletions docs/instance_v1/types.rst
@@ -0,0 +1,7 @@
Types for Google Cloud Aiplatform V1 Schema Predict Instance v1 API
===================================================================

.. automodule:: google.cloud.aiplatform.v1.schema.predict.instance_v1.types
:members:
:undoc-members:
:show-inheritance:
7 changes: 7 additions & 0 deletions docs/instance_v1beta1/types.rst
@@ -0,0 +1,7 @@
Types for Google Cloud Aiplatform V1beta1 Schema Predict Instance v1beta1 API
=============================================================================

.. automodule:: google.cloud.aiplatform.v1beta1.schema.predict.instance_v1beta1.types
:members:
:undoc-members:
:show-inheritance:
7 changes: 7 additions & 0 deletions docs/params_v1/types.rst
@@ -0,0 +1,7 @@
Types for Google Cloud Aiplatform V1 Schema Predict Params v1 API
=================================================================

.. automodule:: google.cloud.aiplatform.v1.schema.predict.params_v1.types
:members:
:undoc-members:
:show-inheritance:
7 changes: 7 additions & 0 deletions docs/params_v1beta1/types.rst
@@ -0,0 +1,7 @@
Types for Google Cloud Aiplatform V1beta1 Schema Predict Params v1beta1 API
===========================================================================

.. automodule:: google.cloud.aiplatform.v1beta1.schema.predict.params_v1beta1.types
:members:
:undoc-members:
:show-inheritance:
7 changes: 7 additions & 0 deletions docs/prediction_v1/types.rst
@@ -0,0 +1,7 @@
Types for Google Cloud Aiplatform V1 Schema Predict Prediction v1 API
=====================================================================

.. automodule:: google.cloud.aiplatform.v1.schema.predict.prediction_v1.types
:members:
:undoc-members:
:show-inheritance:
7 changes: 7 additions & 0 deletions docs/prediction_v1beta1/types.rst
@@ -0,0 +1,7 @@
Types for Google Cloud Aiplatform V1beta1 Schema Predict Prediction v1beta1 API
===============================================================================

.. automodule:: google.cloud.aiplatform.v1beta1.schema.predict.prediction_v1beta1.types
:members:
:undoc-members:
:show-inheritance:
4 changes: 4 additions & 0 deletions google/cloud/aiplatform_v1/__init__.py
@@ -362,10 +362,12 @@
from .types.model_service import UploadModelResponse
from .types.operation import DeleteOperationMetadata
from .types.operation import GenericOperationMetadata
from .types.pipeline_failure_policy import PipelineFailurePolicy
from .types.pipeline_job import PipelineJob
from .types.pipeline_job import PipelineJobDetail
from .types.pipeline_job import PipelineTaskDetail
from .types.pipeline_job import PipelineTaskExecutorDetail
from .types.pipeline_job import PipelineTemplateMetadata
from .types.pipeline_service import CancelPipelineJobRequest
from .types.pipeline_service import CancelTrainingPipelineRequest
from .types.pipeline_service import CreatePipelineJobRequest
@@ -829,12 +831,14 @@
"NearestNeighborSearchOperationMetadata",
"NfsMount",
"PauseModelDeploymentMonitoringJobRequest",
"PipelineFailurePolicy",
"PipelineJob",
"PipelineJobDetail",
"PipelineServiceClient",
"PipelineState",
"PipelineTaskDetail",
"PipelineTaskExecutorDetail",
"PipelineTemplateMetadata",
"Port",
"PredefinedSplit",
"PredictRequest",
18 changes: 9 additions & 9 deletions google/cloud/aiplatform_v1/services/migration_service/client.py
@@ -192,40 +192,40 @@ def parse_annotated_dataset_path(path: str) -> Dict[str, str]:
@staticmethod
def dataset_path(
project: str,
location: str,
dataset: str,
) -> str:
"""Returns a fully-qualified dataset string."""
return "projects/{project}/datasets/{dataset}".format(
return "projects/{project}/locations/{location}/datasets/{dataset}".format(
project=project,
location=location,
dataset=dataset,
)

@staticmethod
def parse_dataset_path(path: str) -> Dict[str, str]:
"""Parses a dataset path into its component segments."""
m = re.match(r"^projects/(?P<project>.+?)/datasets/(?P<dataset>.+?)$", path)
m = re.match(
r"^projects/(?P<project>.+?)/locations/(?P<location>.+?)/datasets/(?P<dataset>.+?)$",
path,
)
return m.groupdict() if m else {}

@staticmethod
def dataset_path(
project: str,
location: str,
dataset: str,
) -> str:
"""Returns a fully-qualified dataset string."""
return "projects/{project}/locations/{location}/datasets/{dataset}".format(
return "projects/{project}/datasets/{dataset}".format(
project=project,
location=location,
dataset=dataset,
)

@staticmethod
def parse_dataset_path(path: str) -> Dict[str, str]:
"""Parses a dataset path into its component segments."""
m = re.match(
r"^projects/(?P<project>.+?)/locations/(?P<location>.+?)/datasets/(?P<dataset>.+?)$",
path,
)
m = re.match(r"^projects/(?P<project>.+?)/datasets/(?P<dataset>.+?)$", path)
return m.groupdict() if m else {}

@staticmethod
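The migration-service hunk above swaps the order of two duplicate `dataset_path`/`parse_dataset_path` definitions; since Python keeps only the last definition of a name in a class body, reordering them changes which path template is effective (after this change, the un-located `projects/{project}/datasets/{dataset}` form wins). A minimal sketch of the location-qualified template and its inverse, as standalone functions rather than the generated client's static methods:

```python
import re


def dataset_path(project: str, location: str, dataset: str) -> str:
    """Build a fully-qualified, location-aware dataset resource name."""
    return "projects/{project}/locations/{location}/datasets/{dataset}".format(
        project=project, location=location, dataset=dataset
    )


def parse_dataset_path(path: str) -> dict:
    """Parse a dataset resource name into its component segments.

    Returns an empty dict when the path does not match the template.
    """
    m = re.match(
        r"^projects/(?P<project>.+?)/locations/(?P<location>.+?)/datasets/(?P<dataset>.+?)$",
        path,
    )
    return m.groupdict() if m else {}
```

For well-formed names the two are inverses: `parse_dataset_path(dataset_path("p", "l", "d"))` recovers the original segments, while an un-located `projects/p/datasets/d` path fails to match and yields `{}`.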
3 changes: 3 additions & 0 deletions google/cloud/aiplatform_v1/types/__init__.py
@@ -423,6 +423,7 @@
PipelineJobDetail,
PipelineTaskDetail,
PipelineTaskExecutorDetail,
PipelineTemplateMetadata,
)
from .pipeline_service import (
CancelPipelineJobRequest,
@@ -883,10 +884,12 @@
"UploadModelResponse",
"DeleteOperationMetadata",
"GenericOperationMetadata",
"PipelineFailurePolicy",
"PipelineJob",
"PipelineJobDetail",
"PipelineTaskDetail",
"PipelineTaskExecutorDetail",
"PipelineTemplateMetadata",
"CancelPipelineJobRequest",
"CancelTrainingPipelineRequest",
"CreatePipelineJobRequest",
8 changes: 4 additions & 4 deletions google/cloud/aiplatform_v1/types/endpoint.py
@@ -221,10 +221,10 @@ class DeployedModel(proto.Message):
This value should be 1-10 characters, and valid characters
are /[0-9]/.
model (str):
Required. The name of the Model that this is
the deployment of. Note that the Model may be in
a different location than the DeployedModel's
Endpoint.
Required. The resource name of the Model that
this is the deployment of. Note that the Model
may be in a different location than the
DeployedModel's Endpoint.
display_name (str):
The display name of the DeployedModel. If not provided upon
creation, the Model's display_name is used.
7 changes: 7 additions & 0 deletions google/cloud/aiplatform_v1/types/explanation_metadata.py
@@ -69,6 +69,9 @@ class ExplanationMetadata(proto.Message):
including the URI scheme, than the one given on input. The
output URI will point to a location where the user only has
a read access.
latent_space_source (str):
Name of the source to generate embeddings for
example based explanations.
"""

class InputMetadata(proto.Message):
@@ -457,6 +460,10 @@ class OutputMetadata(proto.Message):
proto.STRING,
number=3,
)
latent_space_source = proto.Field(
proto.STRING,
number=5,
)


__all__ = tuple(sorted(__protobuf__.manifest))
34 changes: 34 additions & 0 deletions google/cloud/aiplatform_v1/types/featurestore.py
@@ -92,12 +92,46 @@ class OnlineServingConfig(proto.Message):
set to 0, the featurestore will not have an
online store and cannot be used for online
serving.
scaling (google.cloud.aiplatform_v1.types.Featurestore.OnlineServingConfig.Scaling):
Online serving scaling configuration. Only one of
``fixed_node_count`` and ``scaling`` can be set. Setting one
will reset the other.
"""

class Scaling(proto.Message):
r"""Online serving scaling configuration. If min_node_count and
max_node_count are set to the same value, the cluster will be
configured with a fixed number of nodes (no auto-scaling).

Attributes:
min_node_count (int):
Required. The minimum number of nodes to
scale down to. Must be greater than or equal to
1.
max_node_count (int):
The maximum number of nodes to scale up to. Must be greater
than min_node_count, and less than or equal to 10 times
min_node_count.
"""

min_node_count = proto.Field(
proto.INT32,
number=1,
)
max_node_count = proto.Field(
proto.INT32,
number=2,
)

fixed_node_count = proto.Field(
proto.INT32,
number=2,
)
scaling = proto.Field(
proto.MESSAGE,
number=4,
message="Featurestore.OnlineServingConfig.Scaling",
)

name = proto.Field(
proto.STRING,
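The new `Scaling` message comes with two documented constraints: only one of `fixed_node_count` and `scaling` may be set, and when scaling is used, `min_node_count` must be at least 1 with `max_node_count` between `min_node_count` (exclusive) and 10 × `min_node_count` (inclusive). A hedged sketch of that validation as a plain helper function (hypothetical — the library enforces this server-side, not with this code):

```python
from typing import Optional


def validate_online_serving_config(
    fixed_node_count: Optional[int] = None,
    min_node_count: Optional[int] = None,
    max_node_count: Optional[int] = None,
) -> None:
    """Mirror the documented OnlineServingConfig constraints; raise ValueError on violation."""
    scaling_set = min_node_count is not None or max_node_count is not None
    if fixed_node_count is not None and scaling_set:
        # Setting one of fixed_node_count / scaling resets the other.
        raise ValueError("only one of fixed_node_count and scaling may be set")
    if scaling_set:
        if min_node_count is None or min_node_count < 1:
            raise ValueError("min_node_count is required and must be >= 1")
        if max_node_count is not None and not (
            min_node_count < max_node_count <= 10 * min_node_count
        ):
            raise ValueError(
                "max_node_count must exceed min_node_count "
                "and be at most 10 * min_node_count"
            )
```

Note the equal-min-and-max case from the docstring (fixed-size cluster, no auto-scaling) is rejected here because `max_node_count` is documented as strictly greater than `min_node_count`; a fixed size is expressed via `fixed_node_count` instead.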
@@ -236,7 +236,7 @@ class StreamingReadFeatureValuesRequest(proto.Message):

class FeatureValue(proto.Message):
r"""Value for a feature.
NEXT ID: 15
(-- NEXT ID: 15 --)

This message has `oneof`_ fields (mutually exclusive fields).
For each oneof, at most one member field can be set at the same time.
@@ -38,7 +38,7 @@ class ManualBatchTuningParameters(proto.Message):
value will result in a whole batch not fitting
in a machine's memory, and the whole operation
will fail.
The default value is 4.
The default value is 64.
"""

batch_size = proto.Field(
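The docstring fix above corrects the stated default `batch_size` from 4 to 64, which changes how many prediction batches a batch job should be expected to issue. A quick sketch of that estimate (assumption: simple ceiling division over the instance count, not the service's actual scheduling):

```python
import math

# Corrected default from the docstring fix above.
DEFAULT_BATCH_SIZE = 64


def num_batches(num_instances: int, batch_size: int = DEFAULT_BATCH_SIZE) -> int:
    """Estimate how many prediction batches cover num_instances inputs."""
    return math.ceil(num_instances / batch_size)
```

For example, 1000 instances need 16 batches at the corrected default of 64, versus 250 at the previously documented default of 4.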
16 changes: 16 additions & 0 deletions google/cloud/aiplatform_v1/types/model_evaluation.py
@@ -37,6 +37,8 @@ class ModelEvaluation(proto.Message):
name (str):
Output only. The resource name of the
ModelEvaluation.
display_name (str):
The display name of the ModelEvaluation.
metrics_schema_uri (str):
Points to a YAML file stored on Google Cloud Storage
describing the
@@ -92,6 +94,11 @@
[ExplanationSpec][google.cloud.aiplatform.v1.ExplanationSpec]
that are used for explaining the predicted values on the
evaluated data.
metadata (google.protobuf.struct_pb2.Value):
The metadata of the ModelEvaluation. For the ModelEvaluation
uploaded from Managed Pipeline, metadata contains a
structured value with keys of "pipeline_job_id",
"evaluation_dataset_type", "evaluation_dataset_path".
"""

class ModelEvaluationExplanationSpec(proto.Message):
@@ -123,6 +130,10 @@ class ModelEvaluationExplanationSpec(proto.Message):
proto.STRING,
number=1,
)
display_name = proto.Field(
proto.STRING,
number=10,
)
metrics_schema_uri = proto.Field(
proto.STRING,
number=2,
@@ -159,6 +170,11 @@ class ModelEvaluationExplanationSpec(proto.Message):
number=9,
message=ModelEvaluationExplanationSpec,
)
metadata = proto.Field(
proto.MESSAGE,
number=11,
message=struct_pb2.Value,
)


__all__ = tuple(sorted(__protobuf__.manifest))
2 changes: 2 additions & 0 deletions google/cloud/aiplatform_v1/types/model_monitoring.py
@@ -84,6 +84,8 @@ class TrainingDataset(proto.Message):

"csv"
The source file is a CSV file.
"jsonl"
The source file is a JSONL file.
target_field (str):
The target field name the model is to
predict. This field will be excluded when doing
41 changes: 41 additions & 0 deletions google/cloud/aiplatform_v1/types/pipeline_failure_policy.py
@@ -0,0 +1,41 @@
# -*- coding: utf-8 -*-
# Copyright 2022 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import proto # type: ignore


__protobuf__ = proto.module(
package="google.cloud.aiplatform.v1",
manifest={
"PipelineFailurePolicy",
},
)


class PipelineFailurePolicy(proto.Enum):
r"""Represents the failure policy of a pipeline. Currently, the default
of a pipeline is that the pipeline will continue to run until no
more tasks can be executed, also known as
PIPELINE_FAILURE_POLICY_FAIL_SLOW. However, if a pipeline is set to
PIPELINE_FAILURE_POLICY_FAIL_FAST, it will stop scheduling any new
tasks when a task has failed. Any scheduled tasks will continue to
completion.
"""
PIPELINE_FAILURE_POLICY_UNSPECIFIED = 0
PIPELINE_FAILURE_POLICY_FAIL_SLOW = 1
PIPELINE_FAILURE_POLICY_FAIL_FAST = 2


__all__ = tuple(sorted(__protobuf__.manifest))
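Per the enum's docstring, the two policies differ only in whether *new* tasks are scheduled after a failure; already-scheduled tasks run to completion either way. A small sketch of that scheduling difference, using a plain `enum.Enum` and a linear task list as a stand-in for the real pipeline scheduler (both the function and the task representation are hypothetical simplifications):

```python
from enum import Enum


class PipelineFailurePolicy(Enum):
    UNSPECIFIED = 0
    FAIL_SLOW = 1  # keep scheduling until no more tasks can be executed
    FAIL_FAST = 2  # stop scheduling new tasks once any task has failed


def run_pipeline(tasks, policy):
    """tasks: list of (name, succeeds) pairs in scheduling order.

    Returns the names of tasks that actually get scheduled.
    """
    scheduled = []
    failed = False
    for name, succeeds in tasks:
        if failed and policy is PipelineFailurePolicy.FAIL_FAST:
            break  # fail-fast: nothing new is scheduled after a failure
        scheduled.append(name)  # already-scheduled work always runs to completion
        if not succeeds:
            failed = True
    return scheduled
```

With `tasks = [("a", True), ("b", False), ("c", True)]`, fail-slow schedules all three while fail-fast stops after the failing `"b"`.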