I converted the MiDaS 2.1 ONNX model to TFLite float32 and float16 using onnx2tf.
I tried both the float32 and float16 models with the GPU (Metal Delegate) and the NPU (Core ML Delegate) on an iPhone 15 Pro Max. Everything works except the NPU (Core ML Delegate) with the float16 model, which produces strange results, as shown below:
The original image
Result using the Core ML Delegate with the float16 model
Result using the Metal Delegate with the float16 model
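For reference, the conversion itself was a plain onnx2tf run, roughly like the sketch below. The ONNX file name is a placeholder and the flags are from memory of the onnx2tf README, so please check `onnx2tf -h` for the exact options on your version:

```bash
# Install the converter (it also needs a recent TensorFlow and onnx installed).
pip install onnx2tf

# Convert the MiDaS 2.1 ONNX model; by default onnx2tf writes both
# *_float32.tflite and *_float16.tflite into the output folder.
onnx2tf -i midas_v21_384.onnx -o saved_model
```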
TimYao18 changed the title from "Is there MiDaS 2.1 TFLite fp16 model for sharing?" to "MiDaS 2.1 TFLite fp16 with Core ML Delegate gets wrong results" on Jan 15, 2024.
Since the official TensorFlow Core ML Delegate documentation says it supports float32 and float16 models, has anyone run an fp16 model on the iOS NPU (Core ML Delegate) with normal results, or hit the same issue? The model I converted to fp16 is here.
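In case it helps anyone reproduce this, here is roughly how I set up the interpreter with the TensorFlowLiteSwift API. This is a minimal sketch: the model resource name and the `makeInterpreter` helper are placeholders, and falling back to the Metal delegate is just the workaround I use while the Core ML path gives wrong output.

```swift
import TensorFlowLite  // TensorFlowLiteSwift pod; Core ML support needs the CoreML subspec

func makeInterpreter() throws -> Interpreter {
    // "midas_v21_float16" is a placeholder resource name for the converted model.
    guard let modelPath = Bundle.main.path(forResource: "midas_v21_float16",
                                           ofType: "tflite") else {
        fatalError("model not found in app bundle")
    }

    // CoreMLDelegate() is failable and returns nil if the delegate cannot be created;
    // fall back to the Metal (GPU) delegate, which produced correct depth maps for me.
    let delegate: Delegate
    if let coreML = CoreMLDelegate() {
        delegate = coreML
    } else {
        delegate = MetalDelegate()
    }

    let interpreter = try Interpreter(modelPath: modelPath, delegates: [delegate])
    try interpreter.allocateTensors()
    return interpreter
}
```

With the Metal delegate the fp16 depth maps look correct; it is only the Core ML Delegate path that produces the strange output shown above.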