Error with Prediction from TensorFlow Object Detection ONNX #9
@ansarisam I think the model expects to receive the data as bytes, not floats. You can try replacing the line
by
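The two code lines referenced in this comment are not preserved in this thread. As a hedged illustration of the diagnosis: a float64 element occupies 8 bytes, so a 640x640x3 image sent as float64 is exactly the 9,830,400 bytes reported in the error, while the model's uint8 input needs only 1,228,800 (the array below is illustrative, not the reporter's actual data):

```python
import numpy as np

# Hypothetical image tensor loaded as float64 (8 bytes per element).
image = np.zeros((640, 640, 3), dtype=np.float64)
print(image.nbytes)  # 9830400 -- the buffer size in the error message

# Casting to uint8 yields one byte per element, as the model expects.
image_uint8 = image.astype(np.uint8)
print(image_uint8.nbytes)  # 1228800 -- the size the shape requires
```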
Thanks for the reply.
@ansarisam There is a thread (microsoft/onnxruntime#6261) currently discussing this issue. BTW, is there an ONNX model I can use to reproduce the error above, so that I can verify the fix once the update is done?
@ansarisam Could you try the latest code of ai-serving, which now supports uint8? Invoke the following code to convert to uint8:
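The snippet referenced in this comment is not preserved in the scrape. A minimal sketch of the conversion, assuming the input is a NumPy float array (the array name, shape, and value range are illustrative):

```python
import numpy as np

# Hypothetical float input tensor with pixel values in [0, 255].
tensor = np.random.rand(1, 640, 640, 3) * 255.0

# Clip to the valid byte range and cast to uint8 before serializing
# the request, so each element occupies one byte.
tensor_uint8 = np.clip(tensor, 0, 255).astype(np.uint8)
```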
NOTE: the new Docker images with the fix are not ready yet; we're working on them. You need to compile the code to try it. Please let me know if you have any problems.
I am getting an error when trying to predict from an ONNX model (TensorFlow-based object detection). When I call the AI-Serving REST API, I get the following error:
b'{"error":"Shape [640, 640, 3], requires 1228800 elements but the buffer has 9830400 elements."}'
Here is the code that creates the input and calls the REST API.
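The reporter's code is not preserved in this thread. Below is a hedged reconstruction of the kind of client code described: the input-field name, endpoint URL, and payload layout are assumptions for illustration only (check the AI-Serving documentation for the exact request schema), and the POST call is shown commented out:

```python
import json
import numpy as np


def build_payload(image: np.ndarray) -> dict:
    """Build a JSON-serializable prediction payload.

    The "X"/"name"/"value" field names here are assumptions about the
    request schema, not taken from the original report.
    """
    return {"X": [{"name": "input_tensor", "value": image.tolist()}]}


# Hypothetical 640x640 RGB input, already cast to uint8.
image = np.zeros((640, 640, 3), dtype=np.uint8)
payload = build_payload(image)
body = json.dumps(payload)

# The request itself would then be posted to the serving endpoint,
# e.g. (hypothetical URL, requires the third-party `requests` package):
# requests.post("http://localhost:8080/v1/models/object_detection",
#               data=body, headers={"Content-Type": "application/json"})
```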