Copybara import of the project:
--
7de2cb2 by Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>:

feat: add encryption_spec to index.proto and index_endpoint.proto
feat: add TrialContext to study.proto
feat: add contexts to SuggestTrialsRequest in vizier_service.proto
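
As a rough illustration of the first feature, a minimal sketch (not part of this commit) of how the new encryption_spec field on Index might be used from the generated Python client; the project, location, KMS key, and display name below are placeholders, and a real index would need additional metadata:

```python
from google.cloud import aiplatform_v1

client = aiplatform_v1.IndexServiceClient()

index = aiplatform_v1.Index(
    display_name="my-index",  # placeholder
    # New field from this change: customer-managed encryption key for the index.
    encryption_spec=aiplatform_v1.EncryptionSpec(
        kms_key_name=(
            "projects/my-project/locations/us-central1"
            "/keyRings/my-ring/cryptoKeys/my-key"
        ),
    ),
)

# create_index returns a long-running operation; block until it completes.
operation = client.create_index(
    parent="projects/my-project/locations/us-central1",
    index=index,
)
created_index = operation.result()
```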

PiperOrigin-RevId: 563249077

Source-Link: googleapis/googleapis@0e828f8

Source-Link: googleapis/googleapis-gen@e3d5760
Copy-Tag: eyJwIjoiLmdpdGh1Yi8uT3dsQm90LnlhbWwiLCJoIjoiZTNkNTc2MDRlOTU5ZDRkNjVjMWNlMGM2MTk2YTZiNjY5MzE2MmM5NiJ9

--
86c7445 by Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>:

🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

--
aebcb7e by Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>:

feat: add encryption_spec to index.proto and index_endpoint.proto
feat: add TrialContext to study.proto
feat: add contexts to SuggestTrialsRequest in vizier_service.proto

PiperOrigin-RevId: 563257278

Source-Link: googleapis/googleapis@7534629

Source-Link: googleapis/googleapis-gen@63658af
Copy-Tag: eyJwIjoiLmdpdGh1Yi8uT3dsQm90LnlhbWwiLCJoIjoiNjM2NThhZjZjMTY2NGFjZmFiZGI0ZjU1N2M3NDk3MTAxODk1NDkyYSJ9

--
22597ee by Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>:

🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

--
ae5040f by Yu-Han Liu <[email protected]>:

add .kokoro/samples and noxfile.py to owlbot exclude

--
b8c0a36 by Yu-Han Liu <[email protected]>:

add testing to owlbot exclude

--
a7aff2c by Yu-Han Liu <[email protected]>:

revert changes that should be excluded by owlbot

--
7a55a96 by Yu-Han Liu <[email protected]>:

add .pre-commit-config.yaml and docs/conf.py to owlbot excludes

--
0eb6287 by Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>:

docs: Minor formatting
chore: Update gapic-generator-python to v1.11.5
build: Update rules_python to 0.24.0

PiperOrigin-RevId: 563436317

Source-Link: googleapis/googleapis@42fd37b

Source-Link: googleapis/googleapis-gen@280264c
Copy-Tag: eyJwIjoiLmdpdGh1Yi8uT3dsQm90LnlhbWwiLCJoIjoiMjgwMjY0Y2EwMmZiOTMxNmI0MjM3YTk2ZDBhZjFhMjM0M2E4MWE1NiJ9

--
21f7775 by Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>:

🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

--
19d7edc by Yu-Han Liu <[email protected]>:

revert excluded files

--
5812314 by Yu-Han Liu <[email protected]>:

exclude templated files

COPYBARA_INTEGRATE_REVIEW=#2558 from googleapis:owl-bot-copy f0a6dc6
PiperOrigin-RevId: 563814466
gcf-owl-bot[bot] authored and copybara-github committed Sep 8, 2023
1 parent d516931 commit d76bceb
Showing 282 changed files with 582 additions and 414 deletions.
@@ -109,6 +109,7 @@ class AutoMlTablesInputs(proto.Message):
The supported optimization objectives depend on
the prediction type. If the field is not set, a
default objective function is used.
classification (binary):
"maximize-au-roc" (default) - Maximize the
@@ -122,10 +123,12 @@ class AutoMlTablesInputs(proto.Message):
recall value. "maximize-recall-at-precision" -
Maximize recall for a specified
precision value.
classification (multi-class):
"minimize-log-loss" (default) - Minimize log
loss.
regression:
"minimize-rmse" (default) - Minimize
@@ -137,13 +140,15 @@ class AutoMlTablesInputs(proto.Message):
Required. The train budget of creating this
model, expressed in milli node hours i.e. 1,000
value in this field means 1 node hour.
The training cost of the model will not exceed
this budget. The final cost will be attempted to
be close to the budget, though may end up being
(even) noticeably smaller - at the backend's
discretion. This especially may happen when
further model training ceases to provide any
improvements.
If the budget is set to a value known to be
insufficient to train a model for the given
dataset, the training won't be attempted and
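
As a side note, a hedged sketch of the fields this docstring describes; the import path and exact field names below are assumptions based on the published schema samples, not something this diff shows:

```python
# Import path follows the published samples; treat it as an assumption here.
from google.cloud.aiplatform.gapic.schema import trainingjob

inputs = trainingjob.definition.AutoMlTablesInputs(
    prediction_type="classification",
    target_column="label",                     # placeholder column name
    optimization_objective="maximize-au-prc",  # a binary-classification objective
    # Budget is in milli node hours: a value of 1,000 means one node hour.
    train_budget_milli_node_hours=1000,
)
```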
@@ -109,6 +109,7 @@ class AutoMlTablesInputs(proto.Message):
The supported optimization objectives depend on
the prediction type. If the field is not set, a
default objective function is used.
classification (binary):
"maximize-au-roc" (default) - Maximize the
@@ -122,10 +123,12 @@ class AutoMlTablesInputs(proto.Message):
recall value. "maximize-recall-at-precision" -
Maximize recall for a specified
precision value.
classification (multi-class):
"minimize-log-loss" (default) - Minimize log
loss.
regression:
"minimize-rmse" (default) - Minimize
@@ -137,13 +140,15 @@ class AutoMlTablesInputs(proto.Message):
Required. The train budget of creating this
model, expressed in milli node hours i.e. 1,000
value in this field means 1 node hour.
The training cost of the model will not exceed
this budget. The final cost will be attempted to
be close to the budget, though may end up being
(even) noticeably smaller - at the backend's
discretion. This especially may happen when
further model training ceases to provide any
improvements.
If the budget is set to a value known to be
insufficient to train a model for the given
dataset, the training won't be attempted and
@@ -105,13 +105,15 @@ class AutoMlForecastingInputs(proto.Message):
Required. The train budget of creating this
model, expressed in milli node hours i.e. 1,000
value in this field means 1 node hour.
The training cost of the model will not exceed
this budget. The final cost will be attempted to
be close to the budget, though may end up being
(even) noticeably smaller - at the backend's
discretion. This especially may happen when
further model training ceases to provide any
improvements.
If the budget is set to a value known to be
insufficient to train a model for the given
dataset, the training won't be attempted and
2 changes: 2 additions & 0 deletions google/cloud/aiplatform_v1/__init__.py
@@ -494,6 +494,7 @@
from .types.study import Study
from .types.study import StudySpec
from .types.study import Trial
from .types.study import TrialContext
from .types.tensorboard import Tensorboard
from .types.tensorboard_data import Scalar
from .types.tensorboard_data import TensorboardBlob
@@ -1095,6 +1096,7 @@
"TrainingConfig",
"TrainingPipeline",
"Trial",
"TrialContext",
"UndeployIndexOperationMetadata",
"UndeployIndexRequest",
"UndeployIndexResponse",
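
For context, a hedged sketch of how the newly exported TrialContext might be passed via the contexts field on SuggestTrialsRequest; the resource names are placeholders, and the TrialContext field names (description, parameters) are assumptions about the updated study.proto rather than anything shown in this diff:

```python
from google.cloud import aiplatform_v1
from google.protobuf import struct_pb2

client = aiplatform_v1.VizierServiceClient()

request = aiplatform_v1.SuggestTrialsRequest(
    parent="projects/my-project/locations/us-central1/studies/123",  # placeholder
    suggestion_count=2,
    client_id="worker-0",
    # New field from this change: optional per-trial contexts.
    contexts=[
        aiplatform_v1.TrialContext(
            description="suggest around a fixed learning rate",
            parameters=[
                aiplatform_v1.Trial.Parameter(
                    parameter_id="learning_rate",
                    value=struct_pb2.Value(number_value=0.01),
                ),
            ],
        ),
    ],
)

# suggest_trials is a long-running operation; the response holds the trials.
operation = client.suggest_trials(request=request)
response = operation.result()
```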
84 changes: 0 additions & 84 deletions google/cloud/aiplatform_v1/gapic_metadata.json
@@ -1925,90 +1925,6 @@
}
}
},
"ScheduleService": {
"clients": {
"grpc": {
"libraryClient": "ScheduleServiceClient",
"rpcs": {
"CreateSchedule": {
"methods": [
"create_schedule"
]
},
"DeleteSchedule": {
"methods": [
"delete_schedule"
]
},
"GetSchedule": {
"methods": [
"get_schedule"
]
},
"ListSchedules": {
"methods": [
"list_schedules"
]
},
"PauseSchedule": {
"methods": [
"pause_schedule"
]
},
"ResumeSchedule": {
"methods": [
"resume_schedule"
]
},
"UpdateSchedule": {
"methods": [
"update_schedule"
]
}
}
},
"grpc-async": {
"libraryClient": "ScheduleServiceAsyncClient",
"rpcs": {
"CreateSchedule": {
"methods": [
"create_schedule"
]
},
"DeleteSchedule": {
"methods": [
"delete_schedule"
]
},
"GetSchedule": {
"methods": [
"get_schedule"
]
},
"ListSchedules": {
"methods": [
"list_schedules"
]
},
"PauseSchedule": {
"methods": [
"pause_schedule"
]
},
"ResumeSchedule": {
"methods": [
"resume_schedule"
]
},
"UpdateSchedule": {
"methods": [
"update_schedule"
]
}
}
}
}
},
"SpecialistPoolService": {
"clients": {
"grpc": {
@@ -57,7 +57,7 @@
from google.cloud.location import locations_pb2 # type: ignore
from google.iam.v1 import iam_policy_pb2 # type: ignore
from google.iam.v1 import policy_pb2 # type: ignore
from google.longrunning import operations_pb2
from google.longrunning import operations_pb2 # type: ignore
from google.protobuf import empty_pb2 # type: ignore
from google.protobuf import field_mask_pb2 # type: ignore
from google.protobuf import struct_pb2 # type: ignore
@@ -61,7 +61,7 @@
from google.cloud.location import locations_pb2 # type: ignore
from google.iam.v1 import iam_policy_pb2 # type: ignore
from google.iam.v1 import policy_pb2 # type: ignore
from google.longrunning import operations_pb2
from google.longrunning import operations_pb2 # type: ignore
from google.protobuf import empty_pb2 # type: ignore
from google.protobuf import field_mask_pb2 # type: ignore
from google.protobuf import struct_pb2 # type: ignore
@@ -34,7 +34,6 @@
from google.cloud.location import locations_pb2 # type: ignore
from google.iam.v1 import iam_policy_pb2 # type: ignore
from google.iam.v1 import policy_pb2 # type: ignore
from google.longrunning import operations_pb2
from google.longrunning import operations_pb2 # type: ignore

DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
@@ -32,7 +32,6 @@
from google.cloud.location import locations_pb2 # type: ignore
from google.iam.v1 import iam_policy_pb2 # type: ignore
from google.iam.v1 import policy_pb2 # type: ignore
from google.longrunning import operations_pb2
from google.longrunning import operations_pb2 # type: ignore
from .base import DatasetServiceTransport, DEFAULT_CLIENT_INFO

@@ -32,7 +32,6 @@
from google.cloud.location import locations_pb2 # type: ignore
from google.iam.v1 import iam_policy_pb2 # type: ignore
from google.iam.v1 import policy_pb2 # type: ignore
from google.longrunning import operations_pb2
from google.longrunning import operations_pb2 # type: ignore
from .base import DatasetServiceTransport, DEFAULT_CLIENT_INFO
from .grpc import DatasetServiceGrpcTransport
@@ -53,7 +53,7 @@
from google.cloud.location import locations_pb2 # type: ignore
from google.iam.v1 import iam_policy_pb2 # type: ignore
from google.iam.v1 import policy_pb2 # type: ignore
from google.longrunning import operations_pb2
from google.longrunning import operations_pb2 # type: ignore
from google.protobuf import empty_pb2 # type: ignore
from google.protobuf import field_mask_pb2 # type: ignore
from google.protobuf import timestamp_pb2 # type: ignore
@@ -57,7 +57,7 @@
from google.cloud.location import locations_pb2 # type: ignore
from google.iam.v1 import iam_policy_pb2 # type: ignore
from google.iam.v1 import policy_pb2 # type: ignore
from google.longrunning import operations_pb2
from google.longrunning import operations_pb2 # type: ignore
from google.protobuf import empty_pb2 # type: ignore
from google.protobuf import field_mask_pb2 # type: ignore
from google.protobuf import timestamp_pb2 # type: ignore
@@ -33,7 +33,6 @@
from google.cloud.location import locations_pb2 # type: ignore
from google.iam.v1 import iam_policy_pb2 # type: ignore
from google.iam.v1 import policy_pb2 # type: ignore
from google.longrunning import operations_pb2
from google.longrunning import operations_pb2 # type: ignore

DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
@@ -31,7 +31,6 @@
from google.cloud.location import locations_pb2 # type: ignore
from google.iam.v1 import iam_policy_pb2 # type: ignore
from google.iam.v1 import policy_pb2 # type: ignore
from google.longrunning import operations_pb2
from google.longrunning import operations_pb2 # type: ignore
from .base import EndpointServiceTransport, DEFAULT_CLIENT_INFO

@@ -31,7 +31,6 @@
from google.cloud.location import locations_pb2 # type: ignore
from google.iam.v1 import iam_policy_pb2 # type: ignore
from google.iam.v1 import policy_pb2 # type: ignore
from google.longrunning import operations_pb2
from google.longrunning import operations_pb2 # type: ignore
from .base import EndpointServiceTransport, DEFAULT_CLIENT_INFO
from .grpc import EndpointServiceGrpcTransport
@@ -48,7 +48,7 @@
from google.cloud.location import locations_pb2 # type: ignore
from google.iam.v1 import iam_policy_pb2 # type: ignore
from google.iam.v1 import policy_pb2 # type: ignore
from google.longrunning import operations_pb2
from google.longrunning import operations_pb2 # type: ignore
from .transports.base import (
FeaturestoreOnlineServingServiceTransport,
DEFAULT_CLIENT_INFO,
@@ -51,7 +51,7 @@
from google.cloud.location import locations_pb2 # type: ignore
from google.iam.v1 import iam_policy_pb2 # type: ignore
from google.iam.v1 import policy_pb2 # type: ignore
from google.longrunning import operations_pb2
from google.longrunning import operations_pb2 # type: ignore
from .transports.base import (
FeaturestoreOnlineServingServiceTransport,
DEFAULT_CLIENT_INFO,
@@ -30,7 +30,7 @@
from google.cloud.location import locations_pb2 # type: ignore
from google.iam.v1 import iam_policy_pb2 # type: ignore
from google.iam.v1 import policy_pb2 # type: ignore
from google.longrunning import operations_pb2
from google.longrunning import operations_pb2 # type: ignore

DEFAULT_CLIENT_INFO = gapic_v1.client_info.ClientInfo(
gapic_version=package_version.__version__
@@ -28,7 +28,7 @@
from google.cloud.location import locations_pb2 # type: ignore
from google.iam.v1 import iam_policy_pb2 # type: ignore
from google.iam.v1 import policy_pb2 # type: ignore
from google.longrunning import operations_pb2
from google.longrunning import operations_pb2 # type: ignore
from .base import FeaturestoreOnlineServingServiceTransport, DEFAULT_CLIENT_INFO


@@ -28,7 +28,7 @@
from google.cloud.location import locations_pb2 # type: ignore
from google.iam.v1 import iam_policy_pb2 # type: ignore
from google.iam.v1 import policy_pb2 # type: ignore
from google.longrunning import operations_pb2
from google.longrunning import operations_pb2 # type: ignore
from .base import FeaturestoreOnlineServingServiceTransport, DEFAULT_CLIENT_INFO
from .grpc import FeaturestoreOnlineServingServiceGrpcTransport

@@ -58,7 +58,7 @@
from google.cloud.location import locations_pb2 # type: ignore
from google.iam.v1 import iam_policy_pb2 # type: ignore
from google.iam.v1 import policy_pb2 # type: ignore
from google.longrunning import operations_pb2
from google.longrunning import operations_pb2 # type: ignore
from google.protobuf import empty_pb2 # type: ignore
from google.protobuf import field_mask_pb2 # type: ignore
from google.protobuf import timestamp_pb2 # type: ignore
@@ -2347,13 +2347,16 @@ async def import_feature_values(
operation. The imported features are guaranteed to be
visible to subsequent read operations after the
operation is marked as successfully done.
If an import operation fails, the Feature values
returned from reads and exports may be inconsistent. If
consistency is required, the caller must retry the same
import request again and wait till the new operation
returned is marked as successfully done.
There are also scenarios where the caller can cause
inconsistency.
- Source data for import contains multiple distinct
Feature values for the same entity ID and
timestamp.
@@ -2494,6 +2497,7 @@ async def batch_read_feature_values(
metadata: Sequence[Tuple[str, str]] = (),
) -> operation_async.AsyncOperation:
r"""Batch reads Feature values from a Featurestore.
This API enables batch reading Feature values, where
each read instance in the batch may read Feature values
of entities from one or more EntityTypes. Point-in-time
@@ -2763,10 +2767,12 @@ async def delete_feature_values(
metadata: Sequence[Tuple[str, str]] = (),
) -> operation_async.AsyncOperation:
r"""Delete Feature values from Featurestore.
The progress of the deletion is tracked by the returned
operation. The deleted feature values are guaranteed to
be invisible to subsequent read operations after the
operation is marked as successfully done.
If a delete feature values operation fails, the feature
values returned from reads and exports may be
inconsistent. If consistency is required, the caller
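
The docstrings above spell out the long-running-operation and retry semantics for Featurestore value imports; below is a rough sketch of that retry pattern (request contents and resource names are placeholders, not part of this change):

```python
from google.api_core import exceptions
from google.cloud import aiplatform_v1

client = aiplatform_v1.FeaturestoreServiceClient()

request = aiplatform_v1.ImportFeatureValuesRequest(
    entity_type=(
        "projects/my-project/locations/us-central1"
        "/featurestores/my-featurestore/entityTypes/users"
    ),  # placeholder resource name
    csv_source=aiplatform_v1.CsvSource(
        gcs_source=aiplatform_v1.GcsSource(uris=["gs://my-bucket/values.csv"]),
    ),
    feature_specs=[
        aiplatform_v1.ImportFeatureValuesRequest.FeatureSpec(id="age"),
    ],
    entity_id_field="user_id",
    feature_time_field="event_time",
)

# If an import fails, reads and exports may be inconsistent; the documented
# remedy is to retry the identical request and wait for the new operation to
# finish successfully.
for attempt in range(3):
    operation = client.import_feature_values(request=request)
    try:
        operation.result()  # blocks until the operation is marked done
        break
    except exceptions.GoogleAPICallError:
        continue
```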