Describe the problem the feature is intended to solve
TensorFlow's pluggable device architecture offers a plugin mechanism for registering devices with TensorFlow without needing to change TensorFlow code. It provides a set of C APIs as an ABI-stable way to register a custom device runtime, kernels/ops, a graph optimizer, and a profiler.
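Concretely, such a plugin works by exporting a handful of well-known entry points from its shared object, which TensorFlow resolves when the library is loaded. The sketch below is illustrative only: the stub typedefs stand in for the real definitions in TensorFlow's plugin C API headers (e.g. tensorflow/c/experimental/stream_executor/stream_executor.h), and the registration bodies are elided.

```c
#include <stddef.h>

/* Stand-in declarations: a real plugin includes TensorFlow's plugin
 * C API headers instead of declaring these opaque types itself. */
typedef struct SE_PlatformRegistrationParams SE_PlatformRegistrationParams;
typedef struct TF_Status TF_Status;

/* Entry point TensorFlow resolves when the plugin .so is loaded;
 * it registers the custom device runtime (platform name plus
 * device/stream/memory callbacks). Bodies elided in this sketch. */
void SE_InitPlugin(SE_PlatformRegistrationParams* params, TF_Status* status) {
  (void)params;
  (void)status;
  /* Fill in platform name and device/stream/memory callbacks here. */
}

/* Optional entry point registering the plugin's kernels; similar
 * entry points exist for the graph optimizer and profiler. */
void TF_InitKernel(void) {
  /* Kernel builder registration calls go here. */
}
```

Because registration happens entirely through these exported symbols, a plugin in principle only needs the hosting process to load its shared object, which is exactly the open question for TF Serving below.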
With this, developing support for third-party custom devices in TensorFlow is greatly simplified. However, it's not clear whether these plugins work with TF Serving. I can find documentation for serving TensorFlow models with custom ops, which involves copying the op's source into the Serving project and building a static library for it, but I couldn't find anything covering custom or pluggable devices for TF Serving.
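For comparison, the custom-op path mentioned above links the op into the model server at build time. A rough sketch of the kind of Bazel change involved (the target name is hypothetical and the exact BUILD file contents vary by Serving version):

```python
# tensorflow_serving/model_servers/BUILD (sketch)
# The custom op's cc_library target is appended to the list of
# TensorFlow ops statically linked into tensorflow_model_server:
SUPPORTED_TENSORFLOW_OPS = [
    "@org_tensorflow//tensorflow/core/user_ops:my_custom_op",  # hypothetical target
]
```

A pluggable device, by contrast, is a shared object loaded at runtime, so it is unclear whether any analogous build-time step exists or is needed for Serving.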
I would appreciate any documentation or instructions for serving with custom/pluggable third-party devices. If this is not currently supported, any information on plans for future support would be helpful.
Thanks
Describe the solution
Make pluggable devices compatible with TF Serving.
Describe alternatives you've considered
I considered custom ops, which can be used to define ops/kernels, but that approach lacks the graph optimization and memory management support that pluggable devices provide.
Additional context
Add any other context or screenshots about the feature request here.
Bug Report
If this is a bug report, please fill out the following form in full:
System information
OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Linux Ubuntu 22.04
TensorFlow Serving installed from (source or binary): Source
TensorFlow Serving version: 2.7
pksubbarao changed the title from "Support pluggable devices" to "Support for TensorFlow pluggable devices" on Aug 11, 2022.