
Releases: intel/onnxruntime

Custom Release OpenVINO™ Execution Provider for OnnxRuntime 1.15

03 Oct 14:26
3c47cf2

We are releasing the Custom OpenVINO™ Execution Provider for OnnxRuntime 1.15, deprecating the OpenVINO 1.0 API and increasing operator coverage. This release is based on OpenVINO™ 2023.1.

For all the latest information, refer to our official documentation:
https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html

Announcements:

  1. OpenVINO™ version upgraded to 2023.1.0. This provides functional bug fixes and capability changes from the previous 2023.0.0 release.
  2. Improved First Inference Latency (FIL) with a custom OpenVINO API for model loading across CPU and GPU accelerators.
  3. Added bug fixes for the model caching feature.
  4. Operator coverage compliant with OV 2023.1.

Please refer to the OpenVINO™ Execution Provider For ONNXRuntime build instructions for information on system prerequisites as well as instructions to build from source.
https://onnxruntime.ai/docs/build/eps.html#openvino

ONNXRuntime APIs usage:
Please refer to the link below for Python/C++ APIs:
https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html#configuration-options

OpenVINO Execution Provider for OnnxRuntime 5.0

21 Jun 02:28

Description:
OpenVINO™ Execution Provider For ONNXRuntime v5.0 release, based on the latest OpenVINO™ 2023.0 release and the ONNXRuntime 1.15 release.

For all the latest information, refer to our official documentation:
https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html

Announcements:

  1. OpenVINO™ version upgraded to 2023.0.0. This provides functional bug fixes and capability changes from the previous 2022.3.0 release.
  2. This release supports ONNXRuntime 1.15 with the latest OpenVINO™ 2023.0 release.
  3. Hassle-free experience for OVEP Python developers on the Windows platform: a single pip install is now all that is required.
  4. Full model support for Stable Diffusion with dynamic shapes on CPU/GPU.
  5. Improved First Inference Latency (FIL) with a custom OpenVINO API for model loading.
  6. Model caching is now generic across all accelerators. Kernel caching is enabled for partially supported models.

Please refer to the OpenVINO™ Execution Provider For ONNXRuntime build instructions for information on system prerequisites as well as instructions to build from source.
https://onnxruntime.ai/docs/build/eps.html#openvino

Samples:
https://github.com/microsoft/onnxruntime-inference-examples

Python Package:
https://pypi.org/project/onnxruntime-openvino/

Installation and usage instructions on Windows:

  pip install onnxruntime-openvino
  pip install openvino

Add these two lines to the application code:

  import onnxruntime.tools.add_openvino_win_libs as utils
  utils.add_openvino_libs_to_path()
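Once installed, the provider is selected at session creation. A minimal sketch, assuming a placeholder model path ("model.onnx") and illustrative provider-option values:

```python
# Sketch: run an ONNX model through the OpenVINO Execution Provider.
# "model.onnx" is a placeholder; the device_type value is illustrative.
providers = ["OpenVINOExecutionProvider"]
provider_options = [{"device_type": "CPU_FP32"}]

try:
    import onnxruntime as ort  # supplied by the onnxruntime-openvino wheel
    sess = ort.InferenceSession("model.onnx",
                                providers=providers,
                                provider_options=provider_options)
    print(sess.get_providers())
except Exception:
    pass  # requires the onnxruntime-openvino wheel and a real model file
```

If the OpenVINO EP cannot handle a subgraph, ONNX Runtime falls back to the default CPU provider, so listing only the OpenVINO provider is usually sufficient.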

ONNXRuntime APIs usage:
Please refer to the link below for Python/C++ APIs:
https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html#configuration-options

Custom Release Branch OVEP 1.14

03 Apr 14:06
3ebf123
Pre-release

We are releasing a custom release for 1.14 with specific changes for model caching and improved First Inference Latency.
This release is based on a custom OpenVINO™. Dependent OpenVINO™ libs are part of the zip file.

  • Added additional ONNX op support coverage.
  • Improved First Inference Latency (FIL) with a custom OpenVINO API for model loading.
  • Model caching along with Kernel caching is enabled.
  • Handled fallback at session creation time at the application level.

For all the latest information, refer to our official documentation:
https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html

Custom Release Branch OVEP 1.13

03 Apr 14:06
Pre-release

We are releasing a custom release for 1.13.1 with specific changes for model caching and improved First Inference Latency.
This release is based on a custom OpenVINO™. Dependent OpenVINO™ libs are part of the zip file.

  • Added additional ONNX op support coverage.
  • Improved First Inference Latency (FIL) with a custom OpenVINO API for model loading.
  • Model caching along with Kernel caching is enabled.
  • Handled fallback at session creation time at the application level.

For all the latest information, refer to our official documentation:
https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html

OpenVINO EP v4.3 Release for ONNX Runtime & OpenVINO 2022.3

03 Apr 11:58

Description:
OpenVINO™ Execution Provider For ONNXRuntime v4.3 Release based on the latest OpenVINO™ 2022.3 Release.

For all the latest information, refer to our official documentation:
https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html

Announcements:

  • OpenVINO™ version upgraded to 2022.3.0. This provides functional bug fixes and capability changes from the previous 2022.2.0 release.
  • This release supports ONNXRuntime with the latest OpenVINO™ 2022.3 release.
  • Improved First Inference Latency for the ONNX Runtime OpenVINO Execution Provider.
  • Model caching along with kernel caching is enabled for GPU.
  • Minor bug fixes and code refactoring.
  • Migrated to the OpenVINO™ 2.0 APIs. Removed support for OpenVINO™ 1.0 (v2021.3 and v2021.4).
  • Backward compatibility support for older OpenVINO™ versions (OV 2022.1, OV 2022.2) is available.
  • Replaced the model caching APIs use_compile_network and blob_dump_path with a single cache_dir option in the session creation API.
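The consolidated caching option is passed as a provider option at session creation. A minimal sketch, assuming a placeholder model path; the cache directory is created on the fly:

```python
import tempfile

# Sketch: enable OpenVINO EP model caching through the single cache_dir
# option that replaces use_compile_network and blob_dump_path.
# "model.onnx" and the device_type value are illustrative.
cache_dir = tempfile.mkdtemp()
provider_options = [{"device_type": "GPU_FP32", "cache_dir": cache_dir}]

try:
    import onnxruntime as ort
    sess = ort.InferenceSession("model.onnx",
                                providers=["OpenVINOExecutionProvider"],
                                provider_options=provider_options)
except Exception:
    pass  # requires the onnxruntime-openvino wheel and a real model file
```

On subsequent session creations with the same cache_dir, the compiled blob is reloaded instead of recompiled, which is what improves First Inference Latency.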

Build steps:
Please refer to the OpenVINO™ Execution Provider For ONNXRuntime build instructions for information on system prerequisites as well as instructions to build from source.
https://onnxruntime.ai/docs/build/eps.html#openvino

Samples:
https://github.com/microsoft/onnxruntime-inference-examples

Python Package:
https://pypi.org/project/onnxruntime-openvino/

ONNXRuntime APIs usage:
Please refer to the link below for Python/C++ APIs:
https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html#configuration-options

OpenVINO EP v4.2 Release for ONNX Runtime & OpenVINO 2022.2

04 Oct 03:07
6c63c1c

Description:
OpenVINO™ Execution Provider For ONNXRuntime v4.2 Release based on the latest OpenVINO™ 2022.2 Release.

For all the latest information, refer to our official documentation:
https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html

Announcements:

  • OpenVINO™ version upgraded to 2022.2.0. This provides functional bug fixes and capability changes from the previous 2022.1.0 release.

  • This release supports ONNXRuntime with the latest OpenVINO™ 2022.2 release.

  • NEW: Preview support for Intel's discrete graphics cards, Intel® Data Center GPU Flex Series, and Intel® Arc™ GPU, for DL inference in intelligent cloud, edge, and media analytics workloads.

  • NEW: Support for the Intel 13th Gen Core Processor for desktop (code-named Raptor Lake).

  • Exhaustive coverage of ONNX operator unit and Python tests for the GPU plugin.

  • Support for INT8 QDQ models from NNCF.

  • Backward compatibility support for older OpenVINO™ versions (OV 2022.1, OV 2021.4) is available.

New features added:
CPU FP16: Support for the CPU FP16 precision type
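The new precision type is selected through the EP's device_type option. A sketch assuming a "CPU_FP16" value and a placeholder model path (both are illustrative, not confirmed by this release note):

```python
# Sketch: select CPU FP16 precision via the OpenVINO EP device_type option.
# "CPU_FP16" and "model.onnx" are assumptions for illustration.
provider_options = [{"device_type": "CPU_FP16"}]

try:
    import onnxruntime as ort
    sess = ort.InferenceSession("model.onnx",
                                providers=["OpenVINOExecutionProvider"],
                                provider_options=provider_options)
except Exception:
    pass  # requires the onnxruntime-openvino wheel and a real model file
```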

Build steps:
Please refer to the OpenVINO™ Execution Provider For ONNXRuntime build instructions for information on system prerequisites as well as instructions to build from source.
https://onnxruntime.ai/docs/build/eps.html#openvino

Samples:
https://github.com/microsoft/onnxruntime-inference-examples

ONNXRuntime APIs usage:
Please refer to the link below for Python/C++ APIs:
https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html#configuration-options

OpenVINO EP v4.0 Release for ONNX Runtime 1.11.0 & OpenVINO 2022.1

10 Apr 06:04

Description:
OpenVINO™ Execution Provider For ONNXRuntime v4.0 Release based on the latest OpenVINO™ 2022.1 Release.

For all the latest information, refer to our official documentation:
https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html

Announcements:

  • OpenVINO™ version upgraded to 2022.1.0, the biggest OpenVINO™ upgrade in 3.5 years. This provides functional bug fixes, the 2.0 API change, and capability changes from the previous 2021.4.2 LTS release.
  • This release supports the ONNXRuntime v1.11.0 release with the latest OpenVINO™ 2022.1 release.
  • Performance optimizations of existing supported models.
  • Updated to use the new OpenVINO™ 2.0 APIs starting from this release.
  • Backward compatibility support for older OpenVINO™ versions (OV 2021.3, OV 2021.4) is available.
  • New code design changes introduced for a cleaner code structure across different OpenVINO™ versions.
  • Opset 13 compliance with respect to OpenVINO™ (more operators added).

New features added:
OpenCL Throttling: CPU optimization using cl_throttle, reducing CPU utilization when the iGPU path is enabled.
Device Type checks: checks whether the application's runtime device type is available on the host machine.

Build steps:
Please refer to the OpenVINO™ Execution Provider For ONNXRuntime build instructions for information on system prerequisites as well as instructions to build from source.
https://onnxruntime.ai/docs/build/eps.html#openvino

Samples:
https://github.com/microsoft/onnxruntime-inference-examples

Python Wheel Packages:
Please download onnxruntime-openvino python packages from PyPi.org.

pip install onnxruntime-openvino==1.11.0

Linux wheel packages come with prebuilt OpenVINO™ libs enabled with ABI 0.
For Windows samples, please install OpenVINO™ and add its libs to the path:

pip install openvino

import openvino.utils as utils
utils.add_openvino_libs_to_path()

Docker Image:
The latest OpenVINO™ Execution Provider For ONNXRuntime Docker image can be downloaded from DockerHub.
https://hub.docker.com/r/openvino/onnxruntime_ep_ubuntu18

Nuget Package
The latest OpenVINO™ Execution Provider For ONNXRuntime nuget packages for both Linux and Windows OS can be downloaded from the assets section below.

ONNXRuntime APIs usage:
Please refer to the link below for Python/C++ APIs:
https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html#configuration-options

Note:
This release has a few failing Python tests, which will be fixed in the next release.

OpenVINO EP V3.4 Release for ONNX Runtime 1.10.0 & OpenVINO 2021.4.2

24 Nov 17:22

Description:
Version update for OpenVINO Execution Provider based on the OpenVINO release 2021.4.2

Announcements:
This release:

  • Now supports the OpenVINO EP v3.4 release (compatible with the OpenVINO 2021.4.2 release).
  • This release points to the ONNXRuntime 1.10.0 release.
  • New features added:
      Support for weights saved in external files
      Support for IO buffer optimization
      Model caching feature for OpenVINO EP
      Auto-Device execution for OpenVINO EP
      New Linux and Windows samples added
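Auto-Device execution is requested through the EP's device_type option. A sketch assuming an "AUTO"-prefixed value and a placeholder model path (both illustrative):

```python
# Sketch: let OpenVINO's Auto-Device plugin pick the accelerator at runtime.
# "AUTO:GPU,CPU" (preferring GPU, falling back to CPU) and "model.onnx"
# are illustrative assumptions.
provider_options = [{"device_type": "AUTO:GPU,CPU"}]

try:
    import onnxruntime as ort
    sess = ort.InferenceSession("model.onnx",
                                providers=["OpenVINOExecutionProvider"],
                                provider_options=provider_options)
except Exception:
    pass  # requires the onnxruntime-openvino wheel and a real model file
```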

Build steps:
Please refer to the OpenVINO EP build instructions for information on system prerequisites as well as instructions to build from source.
https://onnxruntime.ai/docs/build/eps.html#openvino

OpenVINO-EP samples:
https://github.com/microsoft/onnxruntime-inference-examples

Python Pip Wheel packages:
Pre-built Python wheel packages (zipped) are available below; they can be installed on a dev machine with OpenVINO 2021.4.2 installed. Please refer to the OpenVINO EP build instructions for information on system prerequisites.

Docker Image:
The latest OpenVINO EP Docker image can be downloaded from Docker Hub.
https://hub.docker.com/r/openvino/onnxruntime_ep_ubuntu18

Nuget Package
The latest OpenVINO EP Nuget package for Windows can be downloaded from the repo below.

APIs:
Please refer to the link below for Python/C++ APIs:
https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html#configuration-options

For all the latest information, refer to our documentation:
https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html

OpenVINO EP V3.1 Release for ONNX Runtime 1.9 & OpenVINO 2021.4.1

24 Sep 11:06

Description:
Version update for OpenVINO Execution Provider based on the OpenVINO release 2021.4.1

Announcements:
This release:

  • Now supports the OpenVINO EP v3.1 release (compatible with the OpenVINO 2021.4.1 release).
  • This release points to the ONNXRuntime 1.9 release (v1.9.0 tag).
    Note: the upstream ONNXRuntime 1.9 release has a minor bug affecting the OpenVINO EP; this V3.1 release already integrates the fix.
  • Some minor bug fixes and code refactoring.
  • Removed support for the OpenVINO 2020.3 LTS and OpenVINO 2021.1 versions.
  • Fixed YOLO object detection models; most YOLO models are now fully supported on the OpenVINO EP backend.

Building from source
Please refer to the OpenVINO EP build instructions for information on system prerequisites as well as instructions to build from source.

Python Pip Wheel packages
Pre-built Python wheel packages (zipped) are available below; they can be installed on a dev machine with OpenVINO installed. Please refer to the OpenVINO EP build instructions for information on system prerequisites.

Docker Image
The latest OpenVINO EP Docker image can be downloaded from Docker Hub.

APIs:
Please refer to the link below for Python/C++ APIs:
https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html#configuration-options

Performance:
The performance of YOLO object detection models has been significantly improved across different hardware devices.

For all the latest information, refer to our documentation:
https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html

OpenVINO EP V3.0 Release for ONNX Runtime 1.8.1 & OpenVINO 2021.4

20 Jul 16:56

Description:
Version update for OpenVINO Execution Provider based on the OpenVINO release 2021.4.

Announcements:
This release:
  • Now supports the OpenVINO EP v3.0 release (compatible with the OpenVINO 2021.4 release).
  • Introduced the ReadNetwork() API starting from this release, replacing the ONNX Importer API.
  • Introduced INT8 quantization for CPU and GPU (beta stage).

Building from source
Please refer to the OpenVINO EP build instructions for information on system prerequisites as well as instructions to build from source.

Python Pip Wheel packages
Pre-built Python wheel packages (zipped) are available below; they can be installed on a dev machine with OpenVINO installed. Please refer to the OpenVINO EP build instructions for information on system prerequisites.

Docker Image
The latest OpenVINO EP Docker image can be downloaded from Docker Hub.

APIs
Please refer to the link below for Python/C++ APIs:
https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html#configuration-options

Performance
Supports the YOLOv5 model on CPU, GPU, and MyriadX

For all the latest information, refer to our documentation:
https://onnxruntime.ai/docs/execution-providers/OpenVINO-ExecutionProvider.html