
I created a training job in SageMaker with my own training and inference code using the MXNet framework. I was able to train the model successfully and created an endpoint as well. But when running inference against the model, I get the following error:

'ClientError: An error occurred (413) when calling the InvokeEndpoint operation: HTTP content length exceeded 5246976 bytes.'

From my research, I understand the error is due to the size of the image. The image shape is (480, 512, 3), and I trained the model with images of that same shape.

When I resized the image to (240, 256), that error went away, but it produced another error, 'shape inconsistent in convolution', since I trained the model with images of size (480, 512).

I don't understand why I am getting this error at inference time. Can't we use larger images for inference? Any suggestions would be helpful.
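For context, a (480, 512, 3) float32 array is only about 3 MB raw, but the SageMaker Python SDK's MXNet predictor JSON-serializes NumPy input by default, which inflates each float into roughly 20 text characters. A quick sketch of the size blow-up (the default-serializer behaviour is my assumption about the SDK, not something stated in the question):

```python
import json

import numpy as np

# An image with the shape from the question: (480, 512, 3), float32.
img = np.random.rand(480, 512, 3).astype("float32")

raw_bytes = img.nbytes  # 480 * 512 * 3 * 4 = 2,949,120 bytes -- under 5 MB
json_bytes = len(json.dumps(img.tolist()).encode("utf-8"))

print(f"raw array: {raw_bytes:,} bytes")
print(f"as JSON:   {json_bytes:,} bytes")
```

If that is the cause, sending the image as compressed bytes (e.g. JPEG) with a matching content type and deserializing it in the inference code, rather than resizing it, would keep the payload under the limit without changing the model's input shape.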

Thanks, Harathi

  • SageMaker invoke-endpoint has a 5MB limit on the body. Why do you exceed it with your input if you resize it to the above image shape? – Guy May 10 '18 at 16:27

1 Answer

0

I think this has to do with a size limit on the POST request body sent to the endpoint. I didn't see an endpoint configuration option for this in the documentation.

You should be able to set the size limit manually if you build a Docker image for your model and modify client_max_body_size, e.g. here in the decision_trees example: https://github.com/awslabs/amazon-sagemaker-examples/blob/master/advanced_functionality/scikit_bring_your_own/container/decision_trees/nginx.conf#L23
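For reference, the relevant directive in an nginx.conf like the one linked above would look roughly like this (the 20m value is just an illustration, not a recommendation):

```
http {
  server {
    listen 8080;

    # Raise the maximum accepted request body size (nginx default is 1m).
    # Size this to your largest expected inference payload.
    client_max_body_size 20m;
  }
}
```

Note that this only relaxes the limit inside your container; the InvokeEndpoint API may still enforce its own payload cap upstream, in which case shrinking the payload itself is the only option.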

Alex R.
  • I am not building a Docker image for my model. I am following the SageMaker Python SDK tutorials: https://github.com/awslabs/amazon-sagemaker-examples/tree/master/sagemaker-python-sdk/mxnet_gluon_cifar10. – Harathi May 08 '18 at 21:07