feat: adds function/method enhancements, demo samples #122

Merged: 31 commits from enhanced-lib2 into master, Dec 17, 2020.
Changes shown below are from 16 of the 31 commits.

Commits:
c835503 feat: adds function/method enhancements (telpirion, Dec 6, 2020)
4e78f6f fix: sample tests (telpirion, Dec 6, 2020)
fb81497 fix: sample tests (telpirion, Dec 6, 2020)
1bb3c51 fix: region tag (linter) (telpirion, Dec 6, 2020)
9d44556 fix: change to aiplatform.gapic (telpirion, Dec 6, 2020)
23c0dc9 fix: linter (telpirion, Dec 6, 2020)
e84c679 fix: linter (telpirion, Dec 6, 2020)
7a4cde5 fix: lint (telpirion, Dec 6, 2020)
ec89cd1 feat: add unit tests (telpirion, Dec 7, 2020)
7457ecd fix: linter (telpirion, Dec 7, 2020)
53e89e9 fix: changed name (telpirion, Dec 7, 2020)
4cb779f fix: docstring issues in generated files (telpirion, Dec 7, 2020)
5b162d0 fix: adds from_map test, small fix to generated enhanced type (telpirion, Dec 8, 2020)
25b8814 fix: lint (telpirion, Dec 8, 2020)
833c0fd fix: docstring breaks build :S (telpirion, Dec 9, 2020)
ae5c058 fix: more docstrings (telpirion, Dec 9, 2020)
2d70605 fix: lint (telpirion, Dec 14, 2020)
45d5c7a fix: lint; (telpirion, Dec 15, 2020)
fe05ee5 fix: blacken files (telpirion, Dec 15, 2020)
eb1bb2d fix: per reviewer (telpirion, Dec 15, 2020)
3b20252 Merge branch 'master' into enhanced-lib2 (telpirion, Dec 15, 2020)
bd0a9fd fix: per reviewers (telpirion, Dec 16, 2020)
6053297 Merge branch 'master' into enhanced-lib2 (telpirion, Dec 16, 2020)
390a58c fix: reblacken (telpirion, Dec 16, 2020)
890e749 fix: per reviewer (telpirion, Dec 16, 2020)
821adb4 fix: per reviewer (telpirion, Dec 16, 2020)
667b49f fix: per reviewer (telpirion, Dec 16, 2020)
de2c3dd chore: added CODEOWNERS file to enhanced library tests (telpirion, Dec 16, 2020)
8832966 fix: lint (telpirion, Dec 16, 2020)
c4a469d fix: per reviewer (telpirion, Dec 16, 2020)
2a269f8 fix: per reviewer (telpirion, Dec 16, 2020)

Files changed:
@@ -14,6 +14,8 @@
# See the License for the specific language governing permissions and
# limitations under the License.
#
from google.cloud.aiplatform_helpers import add_methods_to_classes_in_package

Contributor:

I would suggest using a namespace that does not imply a public API, since we don't expect users to use this, right? Perhaps something like _helpers instead of aiplatform_helpers?

Also I think the convention here is to import the module and not individual methods or classes.

Contributor:

> Also I think the convention here is to import the module and not individual methods or classes.

That's what the public Google Python style guide says (https://google.github.io/styleguide/pyguide.html#22-imports). I don't know if we've followed it consistently in the past, but it's probably best to adhere to it for new code.

Contributor (PR author):

The value_converter.py module is intended to be public. It will be helpful for tabular developers who need to format their prediction instances, for example.

I'll rename the methods intended to be private so that they have a leading underscore.

Contributor:

I see. Perhaps add_methods_to_classes_in_package should be in a private module, whereas value_converter should be a public module. In that case, a nested submodule aiplatform.helpers.value_converter would be preferable to aiplatform_helpers.value_converter.

Contributor (PR author):

Done.
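
As an aside (an editor's sketch, not part of the PR), the module-import convention referenced above looks like this; the example uses only the standard library so the point is independent of this package's final layout:

```python
# Preferred by the Google Python style guide: import the module and keep
# members qualified, so their origin is clear at the call site.
from os import path

print(path.join("gs", "bucket", "object"))

# Discouraged for new code: importing individual functions or classes,
# which is what this __init__.py currently does with
# add_methods_to_classes_in_package.
from os.path import join

print(join("gs", "bucket", "object"))
```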

import google.cloud.aiplatform.v1beta1.schema.predict.instance_v1beta1.types as pkg

from google.cloud.aiplatform.v1beta1.schema.predict.instance_v1beta1.types.image_classification import (
ImageClassificationPredictionInstance,
@@ -54,3 +56,4 @@
"VideoClassificationPredictionInstance",
"VideoObjectTrackingPredictionInstance",
)
add_methods_to_classes_in_package(pkg)
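
A hedged sketch of what the call above provides to callers once this __init__.py runs; the class and module names are taken from this PR, and availability assumes the enhanced library as of this revision:

```python
from google.cloud.aiplatform.v1beta1.schema.predict import instance
from google.protobuf.struct_pb2 import Value

# After import, the enhanced classes carry to_value(), from_value(), and from_map().
obj = instance.ImageClassificationPredictionInstance(content="<base64-encoded bytes>")

# Ready to pass as an online prediction instance.
value = obj.to_value()  # -> google.protobuf.struct_pb2.Value
assert isinstance(value, Value)
```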
@@ -14,6 +14,8 @@
# See the License for the specific language governing permissions and
# limitations under the License.
#
from google.cloud.aiplatform_helpers import add_methods_to_classes_in_package
import google.cloud.aiplatform.v1beta1.schema.predict.params_v1beta1.types as pkg

from google.cloud.aiplatform.v1beta1.schema.predict.params_v1beta1.types.image_classification import (
ImageClassificationPredictionParams,
@@ -42,3 +44,4 @@
"VideoClassificationPredictionParams",
"VideoObjectTrackingPredictionParams",
)
add_methods_to_classes_in_package(pkg)
@@ -14,6 +14,8 @@
# See the License for the specific language governing permissions and
# limitations under the License.
#
from google.cloud.aiplatform_helpers import add_methods_to_classes_in_package
import google.cloud.aiplatform.v1beta1.schema.predict.prediction_v1beta1.types as pkg

from google.cloud.aiplatform.v1beta1.schema.predict.prediction_v1beta1.types.classification import (
ClassificationPredictionResult,
@@ -62,3 +64,4 @@
"VideoClassificationPredictionResult",
"VideoObjectTrackingPredictionResult",
)
add_methods_to_classes_in_package(pkg)
@@ -17,8 +17,8 @@

import proto # type: ignore


from google.cloud.aiplatform.v1beta1.schema.predict.instance import text_sentiment_pb2 as gcaspi_text_sentiment # type: ignore
# DO NOT OVERWRITE FOLLOWING LINE: it was manually edited.
from google.cloud.aiplatform.v1beta1.schema.predict.instance import TextSentimentPredictionInstance

Contributor:

Wouldn't this be overwritten by the next regeneration? Why is it necessary to change the import here?

Contributor:

If this replacement needs to be made permanent, please do it in the synth.py, following the existing example of this pattern (a rough sketch also follows this file's diff).

Contributor (PR author):

Done.



__protobuf__ = proto.module(
@@ -59,7 +59,7 @@ class Prediction(proto.Message):
instance = proto.Field(
proto.MESSAGE,
number=1,
message=gcaspi_text_sentiment.TextSentimentPredictionInstance,
message=TextSentimentPredictionInstance,
)

prediction = proto.Field(proto.MESSAGE, number=2, message=Prediction,)
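For reference, a hedged sketch of how such a permanent replacement is typically expressed with synthtool in synth.py; the file glob and patterns below are illustrative assumptions, not the PR's actual synth.py change:

```python
import synthtool as s

# Hypothetical: re-apply the manual import swap after every regeneration so the
# hand edit in text_sentiment.py is not silently overwritten.
s.replace(
    "google/cloud/**/prediction_v1beta1/types/text_sentiment.py",  # assumed path glob
    r"from google\.cloud\.aiplatform\.v1beta1\.schema\.predict\.instance import "
    r"text_sentiment_pb2 as gcaspi_text_sentiment.*",
    "from google.cloud.aiplatform.v1beta1.schema.predict.instance import "
    "TextSentimentPredictionInstance",
)
```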
@@ -14,6 +14,8 @@
# See the License for the specific language governing permissions and
# limitations under the License.
#
from google.cloud.aiplatform_helpers import add_methods_to_classes_in_package
import google.cloud.aiplatform.v1beta1.schema.trainingjob.definition_v1beta1.types as pkg

from google.cloud.aiplatform.v1beta1.schema.trainingjob.definition_v1beta1.types.automl_forecasting import (
AutoMlForecasting,
@@ -130,3 +132,4 @@
"AutoMlVideoObjectTrackingInputs",
"ExportEvaluatedDataItemsConfig",
)
add_methods_to_classes_in_package(pkg)
@@ -78,14 +78,14 @@ class AutoMlForecastingInputs(proto.Message):
function over the validation set.

The supported optimization objectives:
"minimize-rmse" (default) - Minimize root-
"minimize-rmse" (default) - Minimize root-

Contributor:

PSA: If the proto comments are formatted correctly but the docstrings are getting generated in a weird state, please file bugs on the generator repo: https://github.com/googleapis/gapic-generator-python

mean-squared error (RMSE). "minimize-mae" -
Minimize mean-absolute error (MAE). "minimize-
rmsle" - Minimize root-mean-squared log error
(RMSLE). "minimize-rmspe" - Minimize root-
mean-squared percentage error (RMSPE).
"minimize-wape-mae" - Minimize the combination
of weighted absolute percentage error (WAPE)
of weighted absolute percentage error (WAPE)
and mean-absolute-error (MAE).
train_budget_milli_node_hours (int):
Required. The train budget of creating this
@@ -418,11 +418,11 @@ class Period(proto.Message):
unit (str):
The time granularity unit of this time
period. The supported unit are:
"hour"
"day"
"week"
"month"
"year".
"hour"
"day"
"week"
"month"
"year".
quantity (int):
The number of units per period, e.g. 3 weeks
or 2 months.
@@ -61,7 +61,7 @@ class AutoMlTablesInputs(proto.Message):
produce. "classification" - Predict one out of
multiple target values is
picked for each row.
"regression" - Predict a value based on its
"regression" - Predict a value based on its
relation to other values. This
type is available only to columns that contain
semantically numeric values, i.e. integers or
@@ -87,22 +87,22 @@ class AutoMlTablesInputs(proto.Message):
the prediction type. If the field is not set, a
default objective function is used.
classification (binary):
"maximize-au-roc" (default) - Maximize the
"maximize-au-roc" (default) - Maximize the
area under the receiver
operating characteristic (ROC) curve.
"minimize-log-loss" - Minimize log loss.
"maximize-au-prc" - Maximize the area under
"maximize-au-prc" - Maximize the area under
the precision-recall curve. "maximize-
precision-at-recall" - Maximize precision for a
specified
recall value. "maximize-recall-at-precision" -
Maximize recall for a specified
precision value.
classification (multi-class):
"minimize-log-loss" (default) - Minimize log
"minimize-log-loss" (default) - Minimize log
loss.
regression:
"minimize-rmse" (default) - Minimize root-
"minimize-rmse" (default) - Minimize root-
mean-squared error (RMSE). "minimize-mae" -
Minimize mean-absolute error (MAE). "minimize-
rmsle" - Minimize root-mean-squared log error
72 changes: 72 additions & 0 deletions google/cloud/aiplatform_helpers/__init__.py
@@ -0,0 +1,72 @@
# -*- coding: utf-8 -*-

# Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
from google.cloud.aiplatform_helpers.value_converter import to_value
from google.cloud.aiplatform_helpers.value_converter import from_value
from google.cloud.aiplatform_helpers.value_converter import from_map

from proto.marshal import Marshal
from proto.marshal.rules.struct import ValueRule
from google.protobuf.struct_pb2 import Value


class ConversionValueRule(ValueRule):
def to_python(self, value, *, absent: bool = None):
return super().to_python(value, absent=absent)

def to_proto(self, value):

# Need to check whether value is an instance
# of an enhanced type
if callable(getattr(value, 'to_value', None)):
return value.to_value()
else:
return super().to_proto(value)


def add_methods_to_classes_in_package(pkg):
classes = dict([(name, cls)
for name, cls in pkg.__dict__.items()
if isinstance(cls, type)])

for class_name, cls in classes.items():
# Add to_value() method to class with docstring
setattr(cls, 'to_value', to_value)
cls.to_value.__doc__ = to_value.__doc__

# Add from_value() method to class with docstring
cls.from_value = add_from_value_to_class(cls)

Contributor:

Why is this one not calling setattr like the other two methods?

Contributor (PR author):

Changed.

I was just trying different ways of assigning members dynamically and forgot to standardize on one technique. (A short sketch of the equivalence follows this file's diff.)

cls.from_value.__doc__ = from_value.__doc__

# Add from_map() method to class with docstring
setattr(cls, 'from_map', add_from_map_to_class(cls))
cls.from_map.__doc__ = from_map.__doc__


def add_from_value_to_class(cls):
def _from_value(value):
return from_value(cls, value)
return _from_value


def add_from_map_to_class(cls):
def _from_map(map_):
return from_map(cls, map_)
return _from_map


marshal = Marshal(name='google.cloud.aiplatform.v1beta1')
marshal.register(Value, ConversionValueRule(marshal=marshal))
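
On the setattr question discussed above, a minimal, self-contained illustration (editor's sketch in plain Python, not the PR code) showing that setattr and direct attribute assignment attach a method to a class in exactly the same way:

```python
def to_dict(self):
    """Return the instance's attributes as a plain dict."""
    return dict(vars(self))


class Sample:
    def __init__(self, x):
        self.x = x


# These two forms are equivalent; both leave the class with the same method.
setattr(Sample, "to_dict", to_dict)  # dynamic attribute name
Sample.to_dict = to_dict             # plain attribute assignment

print(Sample(42).to_dict())  # {'x': 42}
```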
74 changes: 74 additions & 0 deletions google/cloud/aiplatform_helpers/value_converter.py
@@ -0,0 +1,74 @@
# Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import absolute_import
from google.protobuf.struct_pb2 import Value
from proto.marshal.collections.maps import MapComposite
from proto.marshal import Marshal
from google.protobuf import json_format
from google.protobuf.struct_pb2 import Value
from proto import Message
from proto.message import MessageMeta


def to_value(self: Message) -> Value:
"""Converts a message type to a :class:`~google.protobuf.struct_pb2.Value` object.

Args:
message: the message to convert

Returns:
the message as a :class:`~google.protobuf.struct_pb2.Value` object
"""
def is_prop(prop):
if prop[0].isupper():
return False
if prop.startswith('_'):
return False
return True

props = list(filter(is_prop, dir(self._pb)))

Contributor:

Looks like the intention here is to collect all the field names. Is there a better way to do that than relying on the attribute name's first character?

Contributor:

@software-dov Do you have any suggestions?

Contributor (PR author):

Yeah, that was my hack for trying to get around the "int64s as strings" issue.

However, playing with the Java library the other day, I think that sending int64 values as strings might not be as big a deal as I originally thought. I'm going to switch this to a simple call to json_format.ParseDict() and make sure that I can still train a model.

Contributor (PR author):

This apparently isn't needed! I've removed this code.


props_dict = {}
for prop in props:
props_dict[prop] = getattr(self._pb, prop)

return json_format.ParseDict(props_dict, Value())

Contributor:

Does this work if some of the values of props_dict are nested proto messages?

Contributor (PR author):

Removed this bit.



def from_value(cls: MessageMeta, value: Value) -> Message:
"""Creates instance of class from a :class:`~google.protobuf.struct_pb2.Value` object.

Args:
value: a :class:`~google.protobuf.struct_pb2.Value` object

Returns:
Instance of class
"""
value_dict = json_format.MessageToDict(value)
return json_format.ParseDict(value_dict, cls()._pb)


def from_map(cls: MessageMeta, map_: MapComposite) -> Message:
"""Creates instance of class from a :class:`~proto.marshal.collections.maps.MapComposite` object.

Args:
map_: a :class:`~proto.marshal.collections.maps.MapComposite` object

Returns:
Instance of class
"""
map_dict = dict(map_)
marshal = Marshal(name='marshal')
pb = marshal.to_proto(Value, map_)
return from_value(cls, pb)
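
To make the conversions above concrete, a small runnable sketch (editor's illustration, requiring only the protobuf package) of the json_format round trip that to_value and from_value build on; it also shows DESCRIPTOR.fields, which is the usual way to enumerate a message's declared fields rather than filtering dir():

```python
from google.protobuf import json_format
from google.protobuf.struct_pb2 import Struct, Value

# dict -> Value: the core of to_value().
payload = {"confidence_threshold": 0.5, "max_predictions": 5}
value = json_format.ParseDict(payload, Value())

# Value -> dict: the core of from_value(). Integers come back as floats because
# struct_pb2.Value only has a double number_value field.
print(json_format.MessageToDict(value))  # e.g. {'confidence_threshold': 0.5, 'max_predictions': 5.0}

# Enumerating a message's declared fields without inspecting dir() output.
print([f.name for f in Struct.DESCRIPTOR.fields])  # ['fields']
```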
@@ -14,8 +14,8 @@

# [START aiplatform_create_training_pipeline_image_classification_sample]
from google.cloud import aiplatform
from google.protobuf import json_format
from google.protobuf.struct_pb2 import Value
from google.cloud.aiplatform.v1beta1.schema.trainingjob import definition

Contributor:

Is it possible to remove v1beta1 by exporting the schema under cloud.aiplatform (without moving all the files)?

Contributor (PR author):

Not very easily, no.

ModelType = definition.AutoMlImageClassificationInputs().ModelType

Contributor:

I think we don't have to create an instance of AutoMlImageClassificationInputs here.

Contributor (PR author):

Removed.



def create_training_pipeline_image_classification_sample(
@@ -30,18 +30,18 @@ def create_training_pipeline_image_classification_sample(
# Initialize client that will be used to create and send requests.
# This client only needs to be created once, and can be reused for multiple requests.
client = aiplatform.gapic.PipelineServiceClient(client_options=client_options)
training_task_inputs_dict = {
"multiLabel": True,
"modelType": "CLOUD",
"budgetMilliNodeHours": 8000,
"disableEarlyStopping": False,
}
training_task_inputs = json_format.ParseDict(training_task_inputs_dict, Value())

icn_training_inputs = definition.AutoMlImageClassificationInputs(

Contributor:

Note that here we introduce an inconsistent style of using an instance of a particular Python class, whereas the rest of the sample uses Python dicts. I think technically we could keep using a Python dict here too (with only the change of camelCase to snake_case for the dict keys). A short sketch contrasting the two styles follows this file's diff.

@leahecole, let us know if this is fine according to the sample style guidelines. Several more samples will be updated and will follow the same pattern as the two samples in this PR.

Contributor:

Instances are nicer because IDEs can offer more assistance with field names and types. Dicts are sometimes easier to pass around, though.

I don't think we currently mandate one style or the other in the style guide. There is a mix in the currently published samples.

Contributor:

General consensus among the owners was a preference for generated classes for API resources: having spell check and autocomplete, as well as knowing where to look in the reference docs, is helpful.

If you're using a user-defined object with arbitrary properties, a dict may be simpler.

Contributor:

This may be a place where we need to avoid imposing consistency across the board. For example, explicitly constructing instances can make certain samples (e.g. https://github.com/googleapis/python-aiplatform/blob/master/samples/snippets/create_hyperparameter_tuning_job_python_package_sample.py) much more difficult to read than dicts, and in some cases we are forced to use instances (e.g. https://github.com/googleapis/python-aiplatform/blob/master/samples/snippets/upload_model_explain_image_managed_container_sample.py) because of dependency issues.

Contributor (PR author):

Just to clarify: I'm keeping this sample as-is?

multi_label=True,
model_type=ModelType.CLOUD,
budget_milli_node_hours=8000,
disable_early_stopping=False
)

training_pipeline = {
"display_name": display_name,
"training_task_definition": "gs://google-cloud-aiplatform/schema/trainingjob/definition/automl_image_classification_1.0.0.yaml",
"training_task_inputs": training_task_inputs,
"training_task_inputs": icn_training_inputs.to_value(),

Contributor (@dizcology, Dec 9, 2020):

I'd prefer not to have additional method calls here.

(That is: define a new variable above so that the value of "training_task_inputs" is just that variable.)

Contributor (@dizcology, Dec 11, 2020):

Also note that this is simply a style preference with some hidden implications for sample generation. Please feel free to leave it as-is for the sample review.

Contributor (PR author):

Changed.

"input_data_config": {"dataset_id": dataset_id},
"model_to_upload": {"display_name": model_display_name},
}
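To summarize the style discussion in this sample, a hedged sketch of the two roughly equivalent ways to build training_task_inputs; the typed form assumes the to_value() helper added by this PR and the import path used in the sample:

```python
from google.cloud.aiplatform.v1beta1.schema.trainingjob import definition
from google.protobuf import json_format
from google.protobuf.struct_pb2 import Value

# Style 1: plain dict with camelCase keys, parsed into a protobuf Value.
training_task_inputs = json_format.ParseDict(
    {"multiLabel": True, "modelType": "CLOUD", "budgetMilliNodeHours": 8000},
    Value(),
)

# Style 2: generated schema class with snake_case fields, converted via the
# to_value() method attached by add_methods_to_classes_in_package.
icn_training_inputs = definition.AutoMlImageClassificationInputs(
    multi_label=True,
    model_type=definition.AutoMlImageClassificationInputs.ModelType.CLOUD,
    budget_milli_node_hours=8000,
)
training_task_inputs = icn_training_inputs.to_value()
```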
27 changes: 16 additions & 11 deletions samples/snippets/predict_image_classification_sample.py
@@ -16,8 +16,9 @@
import base64

from google.cloud import aiplatform
from google.protobuf import json_format
from google.protobuf.struct_pb2 import Value
from google.cloud.aiplatform.v1beta1.schema.predict import instance
from google.cloud.aiplatform.v1beta1.schema.predict import params
from google.cloud.aiplatform.v1beta1.schema.predict import prediction


def predict_image_classification_sample(
Expand All @@ -36,25 +37,29 @@ def predict_image_classification_sample(

# The format of each instance should conform to the deployed model's prediction input schema.
encoded_content = base64.b64encode(file_content).decode("utf-8")
instance_dict = {"content": encoded_content}

instance = json_format.ParseDict(instance_dict, Value())
instances = [instance]
# See gs://google-cloud-aiplatform/schema/predict/params/image_classification_1.0.0.yaml for the format of the parameters.
parameters_dict = {"confidence_threshold": 0.5, "max_predictions": 5}
parameters = json_format.ParseDict(parameters_dict, Value())
instance_obj = instance.ImageClassificationPredictionInstance({
"content": encoded_content})

instance_val = instance_obj.to_value()
instances = [instance_val]

params_obj = params.ImageClassificationPredictionParams({
"confidence_threshold": 0.5, "max_predictions": 5})

Contributor:

It seems more common to pass these in as parameters, as opposed to a dict, as is done in the other sample in this PR.

Contributor (PR author):

Done.


endpoint = client.endpoint_path(
project=project, location=location, endpoint=endpoint_id
)
response = client.predict(
endpoint=endpoint, instances=instances, parameters=parameters
endpoint=endpoint, instances=instances, parameters=params_obj
)
print("response")
print(" deployed_model_id:", response.deployed_model_id)

Contributor:

Nit: is there a reason for the extra space at the beginning of this print statement?

Contributor (PR author):

I had a \t character in there earlier, but it got dropped accidentally. Added it back.

# See gs://google-cloud-aiplatform/schema/predict/prediction/classification.yaml for the format of the predictions.
predictions = response.predictions
for prediction in predictions:
print(" prediction:", dict(prediction))
for prediction_ in predictions:
prediction_obj = prediction.ClassificationPredictionResult.from_map(prediction_)
print(prediction_obj)


# [END aiplatform_predict_image_classification_sample]
@@ -31,4 +31,4 @@ def test_ucaip_generated_predict_image_classification_sample(capsys):
)

out, _ = capsys.readouterr()
assert 'string_value: "daisy"' in out
assert 'deployed_model_id:' in out

Contributor:

Why did this test case change? Is there any chance this could lead to a false positive if no model ID is returned? Or will the sample straight up fail before it gets to this print statement?

Contributor (PR author, @telpirion, Dec 16, 2020):

A couple of reasons, but the biggest one: we want to avoid testing for the output of models, since retraining can cause the predictions to change.

No, a model ID must be returned as part of the online prediction--you can't have a prediction without a model! The sample will fail if you attempt to send a prediction request to an endpoint that has no model deployed to it.
