diff --git a/doc/source/analytics/explainers.md b/doc/source/analytics/explainers.md
index a0eef802f3..2a3719d995 100644
--- a/doc/source/analytics/explainers.md
+++ b/doc/source/analytics/explainers.md
@@ -45,7 +45,7 @@ For an e2e example, please check AnchorTabular notebook [here](../examples/iris_
 
 ## Explain API
 
-**Note**: Seldon is no longer maintaining the Seldon and TensorFlow protocols. Instead, Seldon is adopting the industry-standard Open Inference Protocol (OIP). As part of this transition, you need to use [MLServer](https://github.com/SeldonIO/MLServer) for model serving in Seldon Core 1.
+**Note**: Seldon is no longer maintaining the Seldon and TensorFlow protocols. Instead, Seldon is adopting the industry-standard Open Inference Protocol (OIP). As part of this transition, you need to use [MLServer](https://docs.seldon.ai/mlserver) for model serving in Seldon Core 1.
 
 We strongly encourage you to adopt the OIP, which provides seamless integration across diverse model serving runtimes, supports the development of versatile client and benchmarking tools, and ensures a high-performance, consistent, and unified inference experience.
diff --git a/doc/source/graph/protocols.md b/doc/source/graph/protocols.md
index c30946a88b..9f7a72bcb2 100644
--- a/doc/source/graph/protocols.md
+++ b/doc/source/graph/protocols.md
@@ -1,6 +1,6 @@
 # Protocols
 
-**Note**: Seldon is no longer maintaining the Seldon and TensorFlow protocols. Instead, Seldon is adopting the industry-standard Open Inference Protocol (OIP). As part of this transition, you need to use [MLServer](https://github.com/SeldonIO/MLServer) for model serving in Seldon Core 1.
+**Note**: Seldon is no longer maintaining the Seldon and TensorFlow protocols. Instead, Seldon is adopting the industry-standard Open Inference Protocol (OIP). As part of this transition, you need to use [MLServer](https://docs.seldon.ai/mlserver) for model serving in Seldon Core 1.
 
 We strongly encourage you to adopt the OIP, which provides seamless integration across diverse model serving runtimes, supports the development of versatile client and benchmarking tools, and ensures a high-performance, consistent, and unified inference experience.
diff --git a/doc/source/production/optimization.md b/doc/source/production/optimization.md
index a2e402f7e4..5977907435 100644
--- a/doc/source/production/optimization.md
+++ b/doc/source/production/optimization.md
@@ -9,7 +9,7 @@ Using the Seldon python wrapper there are various optimization areas one needs t
 
 ### Seldon Protocol Payload Types with REST and gRPC
 
-**Note**: Seldon is no longer maintaining the Seldon and TensorFlow protocols. Instead, Seldon is adopting the industry-standard Open Inference Protocol (OIP). As part of this transition, you need to use [MLServer](https://github.com/SeldonIO/MLServer) for model serving in Seldon Core 1.
+**Note**: Seldon is no longer maintaining the Seldon and TensorFlow protocols. Instead, Seldon is adopting the industry-standard Open Inference Protocol (OIP). As part of this transition, you need to use [MLServer](https://docs.seldon.ai/mlserver) for model serving in Seldon Core 1.
 
 We strongly encourage you to adopt the OIP, which provides seamless integration across diverse model serving runtimes, supports the development of versatile client and benchmarking tools, and ensures a high-performance, consistent, and unified inference experience.
diff --git a/doc/source/reference/upgrading.md b/doc/source/reference/upgrading.md
index c504a755fd..513cfdd4d2 100644
--- a/doc/source/reference/upgrading.md
+++ b/doc/source/reference/upgrading.md
@@ -93,7 +93,7 @@ Only the v1 versions of the CRD will be supported moving forward. The v1beta1 ve
 ### Model Health Checks
 
-**Note**:Seldon is no longer maintaining the Seldon and TensorFlow protocols. Instead, Seldon is adopting the industry-standard Open Inference Protocol (OIP). As part of this transition, you need to use [MLServer](https://github.com/SeldonIO/MLServer) for model serving in Seldon Core 1.
+**Note**: Seldon is no longer maintaining the Seldon and TensorFlow protocols. Instead, Seldon is adopting the industry-standard Open Inference Protocol (OIP). As part of this transition, you need to use [MLServer](https://docs.seldon.ai/mlserver) for model serving in Seldon Core 1.
 
 We strongly encourage you to adopt the OIP, which provides seamless integration across diverse model serving runtimes, supports the development of versatile client and benchmarking tools, and ensures a high-performance, consistent, and unified inference experience.
 
 We have updated the health checks done by Seldon for the model nodes in your inference graph. If `executor.fullHealthChecks` is set to true then:
diff --git a/examples/models/lightgbm_custom_server/iris.ipynb b/examples/models/lightgbm_custom_server/iris.ipynb
index c0020fd453..b3fff88937 100644
--- a/examples/models/lightgbm_custom_server/iris.ipynb
+++ b/examples/models/lightgbm_custom_server/iris.ipynb
@@ -7,7 +7,7 @@
    "source": [
     "# Custom LightGBM Prepackaged Model Server\n",
     "\n",
-    "**Note**: Seldon is no longer maintaining the Seldon and TensorFlow protocols. Instead, Seldon is adopting the industry-standard Open Inference Protocol (OIP). As part of this transition, you need to use [MLServer](https://github.com/SeldonIO/MLServer) for model serving in Seldon Core 1.\n",
+    "**Note**: Seldon is no longer maintaining the Seldon and TensorFlow protocols. Instead, Seldon is adopting the industry-standard Open Inference Protocol (OIP). As part of this transition, you need to use [MLServer](https://docs.seldon.ai/mlserver) for model serving in Seldon Core 1.\n",
     "\n",
     "We strongly encourage you to adopt the OIP, which provides seamless integration across diverse model serving runtimes, supports the development of versatile client and benchmarking tools, and ensures a high-performance, consistent, and unified inference experience.\n",
     "\n",
diff --git a/notebooks/backwards_compatability.ipynb b/notebooks/backwards_compatability.ipynb
index b2f4139a48..0b4d62a074 100644
--- a/notebooks/backwards_compatability.ipynb
+++ b/notebooks/backwards_compatability.ipynb
@@ -13,7 +13,7 @@
     " * grpcurl\n",
     " * pygmentize\n",
     "\n",
-    "**Note**: Seldon is no longer maintaining the Seldon and TensorFlow protocols. Instead, Seldon is adopting the industry-standard Open Inference Protocol (OIP). As part of this transition, you need to use [MLServer](https://github.com/SeldonIO/MLServer) for model serving in Seldon Core 1.\n",
+    "**Note**: Seldon is no longer maintaining the Seldon and TensorFlow protocols. Instead, Seldon is adopting the industry-standard Open Inference Protocol (OIP). As part of this transition, you need to use [MLServer](https://docs.seldon.ai/mlserver) for model serving in Seldon Core 1.\n",
     "\n",
     "We strongly encourage you to adopt the OIP, which provides seamless integration across diverse model serving runtimes, supports the development of versatile client and benchmarking tools, and ensures a high-performance, consistent, and unified inference experience.\n",
     "\n",
diff --git a/notebooks/protocol_examples.ipynb b/notebooks/protocol_examples.ipynb
index a3a66d701d..dc188b8753 100644
--- a/notebooks/protocol_examples.ipynb
+++ b/notebooks/protocol_examples.ipynb
@@ -19,7 +19,7 @@
     " * [Seldon Protocol](#Seldon-Protocol-Model)\n",
     " * [Tensorflow Protocol](#Tensorflow-Protocol-Model)\n",
     "\n",
-    "**Note**:Seldon is no longer maintaining the Seldon and TensorFlow protocols. Instead, Seldon is adopting the industry-standard Open Inference Protocol (OIP). As part of this transition, you need to use [MLServer](https://github.com/SeldonIO/MLServer) for model serving in Seldon Core 1.\n",
+    "**Note**: Seldon is no longer maintaining the Seldon and TensorFlow protocols. Instead, Seldon is adopting the industry-standard Open Inference Protocol (OIP). As part of this transition, you need to use [MLServer](https://docs.seldon.ai/mlserver) for model serving in Seldon Core 1.\n",
     "\n",
     "We strongly encourage you to adopt the OIP, which provides seamless integration across diverse model serving runtimes, supports the development of versatile client and benchmarking tools, and ensures a high-performance, consistent, and unified inference experience.\n",
     "\n",
diff --git a/notebooks/server_examples.ipynb b/notebooks/server_examples.ipynb
index 98fb280b96..4716e109c8 100644
--- a/notebooks/server_examples.ipynb
+++ b/notebooks/server_examples.ipynb
@@ -65,7 +65,7 @@
    "source": [
     "## Serve SKLearn Iris Model\n",
     "\n",
-    "**Note**: Seldon is no longer maintaining the Seldon and TensorFlow protocols. Instead, Seldon is adopting the industry-standard Open Inference Protocol (OIP). As part of this transition, you need to use [MLServer](https://github.com/SeldonIO/MLServer) for model serving in Seldon Core 1.\n",
+    "**Note**: Seldon is no longer maintaining the Seldon and TensorFlow protocols. Instead, Seldon is adopting the industry-standard Open Inference Protocol (OIP). As part of this transition, you need to use [MLServer](https://docs.seldon.ai/mlserver) for model serving in Seldon Core 1.\n",
     "\n",
     "We strongly encourage you to adopt the OIP, which provides seamless integration across diverse model serving runtimes, supports the development of versatile client and benchmarking tools, and ensures a high-performance, consistent, and unified inference experience.\n",
     "\n",
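The notes updated throughout this changeset direct readers to MLServer and the Open Inference Protocol. As a rough illustration of what an OIP (V2) REST inference request looks like, here is a minimal sketch; the address, port, and model name `iris` are assumptions for illustration, not values taken from this changeset:

```python
# Build an Open Inference Protocol (V2) REST request against a served model.
# Endpoint host/port and the model name "iris" are hypothetical.
import json
import urllib.request

payload = {
    "inputs": [
        {
            "name": "predict",
            "shape": [1, 4],           # one row of four features
            "datatype": "FP64",
            "data": [5.1, 3.5, 1.4, 0.2],
        }
    ]
}

req = urllib.request.Request(
    "http://localhost:8080/v2/models/iris/infer",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# Sending the request requires a running OIP-compatible server such as MLServer:
# response = json.load(urllib.request.urlopen(req))
```

The V2 endpoint path (`/v2/models/{name}/infer`) and the `inputs` tensor layout are defined by the Open Inference Protocol rather than the legacy Seldon or TensorFlow protocols, which is the substance of the note being updated in each file.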