API call error: input error for model using TensorFlow custom estimators #67

@ita9naiwa

Description

Hi. I'm new to AWS SageMaker and built my own custom TensorFlow estimator based on your TensorFlow iris sample code.

I created my own estimator, like this:

    if mode == tf.estimator.ModeKeys.PREDICT:
        export_outputs = {
            "recommend": tf.estimator.export.PredictOutput(predictions),
            tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY:
                tf.estimator.export.PredictOutput(predictions),
        }
        return tf.estimator.EstimatorSpec(mode, predictions=predictions,
                                          export_outputs=export_outputs)

(Without export_outputs, classifier.export_savedmodel cannot export a saved model.)
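
For anyone reproducing this, the exported signatures can be checked with something like the following sketch (the timestamped directory is whatever export_savedmodel creates; with the export_outputs above it should list both 'recommend' and 'serving_default'):

import tensorflow as tf

# Load the exported SavedModel and list its signature keys.
# (Sketch only; replace <timestamp> with the actual directory name.)
with tf.Session(graph=tf.Graph()) as sess:
    meta_graph = tf.saved_model.loader.load(
        sess, [tf.saved_model.tag_constants.SERVING], 'export/Servo/<timestamp>')
    print(list(meta_graph.signature_def.keys()))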

I exported the trained model like this:

INPUT_TENSOR_NAME = 'items'

def serving_input_fn():
    feature_spec = {INPUT_TENSOR_NAME: tf.FixedLenFeature(dtype=tf.int64, shape=[100])}
    return tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)()

exported_model = classifier.export_savedmodel(export_dir_base='export/Servo/',
                                              serving_input_receiver_fn=serving_input_fn)
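
For context, build_parsing_serving_input_receiver_fn produces a receiver that parses serialized tf.Example protos, so the deployed endpoint's input signature expects that wire format. A minimal sketch of such a payload (illustrative only, not part of my original code):

import tensorflow as tf

# A serialized tf.Example carrying the 100-element int64 'items' feature;
# this is the format a parsing serving input receiver is built to decode.
example = tf.train.Example(features=tf.train.Features(feature={
    'items': tf.train.Feature(
        int64_list=tf.train.Int64List(value=list(range(100)))),
}))
serialized_payload = example.SerializeToString()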

Then I saved my model, created an endpoint, and sent a query to it:

sample = np.arange(100).astype(np.int64).tolist()
predictor.predict(sample)

I got the following error.

Error on Jupyter Notebook Console:
ModelError: An error occurred (ModelError) when calling the InvokeEndpoint operation: Received server error (500) from model with message "". See https://us-west-2.console.aws.amazon.com/cloudwatch/home?region=us-west-2#logEventViewer:group=/aws/sagemaker/Endpoints/sagemaker-tensorflow-py2-cpu-2018-02-01-17-06-45-306 in account 561830960602 for more information.

Error found in the CloudWatch Management Console:

[2018-02-01 17:21:08,384] ERROR in serving: Unsupported request data format: [1].
Valid formats: tensor_pb2.TensorProto, dict<string, tensor_pb2.TensorProto> and predict_pb2.PredictRequest
Traceback (most recent call last):
File "/opt/amazon/lib/python2.7/site-packages/container_support/serving.py", line 161, in _invoke
self.transformer.transform(content, input_content_type, requested_output_content_type)
File "/opt/amazon/lib/python2.7/site-packages/tf_container/serve.py", line 255, in transform
return self.transform_fn(data, content_type, accepts), accepts
File "/opt/amazon/lib/python2.7/site-packages/tf_container/serve.py", line 180, in f
prediction = self.predict_fn(input)
File "/opt/amazon/lib/python2.7/site-packages/tf_container/serve.py", line 195, in predict_fn
return self.proxy_client.request(data)
File "/opt/amazon/lib/python2.7/site-packages/tf_container/proxy_client.py", line 51, in request
return request_fn(data)
File "/opt/amazon/lib/python2.7/site-packages/tf_container/proxy_client.py", line 77, in predict
request = self._create_predict_request(data)
File "/opt/amazon/lib/python2.7/site-packages/tf_container/proxy_client.py", line 94, in _create_predict_request
input_map = self._create_input_map(data)
File "/opt/amazon/lib/python2.7/site-packages/tf_container/proxy_client.py", line 199, in _create_input_map
raise ValueError(msg.format(data))
ValueError: Unsupported request data format: [1].
Valid formats: tensor_pb2.TensorProto, dict<string, tensor_pb2.TensorProto> and predict_pb2.PredictRequest
10.32.0.2 - - [01/Feb/2018:17:21:08 +0000] "POST /invocations HTTP/1.1" 500 0 "-" "AHC/2.0"

I also tried to send a predict_pb2 object to the model, but it failed.
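
Roughly, that attempt looked like this sketch (the model and signature names here are my guesses, since I couldn't find them documented; tf.contrib.util.make_tensor_proto is the TF 1.x helper):

import numpy as np
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2

# Build a PredictRequest by hand. 'generic_model' is a guess at the name the
# SageMaker TF container registers with TF Serving, and 'serving_default'
# matches the DEFAULT_SERVING_SIGNATURE_DEF_KEY used in export_outputs above.
request = predict_pb2.PredictRequest()
request.model_spec.name = 'generic_model'
request.model_spec.signature_name = 'serving_default'
request.inputs['items'].CopyFrom(
    tf.contrib.util.make_tensor_proto(
        np.arange(100).astype(np.int64), shape=[100]))

predictor.predict(request)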
