Merge branch 'update_hugectr_version_24.4.0' into 'main'
Update new version: 24.4.0

See merge request dl/hugectr/hugectr!1530
minseokl committed Apr 12, 2024
2 parents 19e6017 + c809630 commit e369846
Showing 30 changed files with 59 additions and 59 deletions.
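All 59 additions and 59 deletions below are the same mechanical substitution: the old release tag `23.12` replaced by `24.04` (plus the matching version macros and container names). As a rough illustration of how such a bump could be scripted, here is a hedged sketch; the file suffixes and tag strings are assumptions for illustration, not HugeCTR's actual release tooling.

```python
# Sketch: apply a mechanical version bump (old tag -> new tag) across a
# source tree, as this commit does by hand. Suffix list and tags are
# illustrative assumptions, not the project's real release process.
from pathlib import Path


def bump_text(text: str, replacements: dict[str, str]) -> str:
    """Apply each old -> new substitution to one file's contents."""
    for old, new in replacements.items():
        text = text.replace(old, new)
    return text


def bump_tree(root: Path, replacements: dict[str, str],
              suffixes: tuple[str, ...] = (".md", ".hpp", ".ipynb")) -> int:
    """Rewrite matching files under root in place; return how many changed."""
    changed = 0
    for path in root.rglob("*"):
        if path.is_file() and path.suffix in suffixes:
            before = path.read_text(encoding="utf-8")
            after = bump_text(before, replacements)
            if after != before:
                path.write_text(after, encoding="utf-8")
                changed += 1
    return changed
```

For example, `bump_tree(Path("."), {"23.12": "24.04"})` run at the repository root would touch README files and notebooks much like the diff below.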
4 changes: 2 additions & 2 deletions HugeCTR/include/common.hpp
@@ -58,8 +58,8 @@

namespace HugeCTR {

-#define HUGECTR_VERSION_MAJOR 23
-#define HUGECTR_VERSION_MINOR 12
+#define HUGECTR_VERSION_MAJOR 24
+#define HUGECTR_VERSION_MINOR 4
#define HUGECTR_VERSION_PATCH 0

#define WARP_SIZE 32
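The macros above now encode release 24.4.0, while the container tags elsewhere in this commit zero-pad the month (`24.04`). A small hedged sketch of that mapping, with the padding rule inferred from the tags in this diff rather than from any documented convention:

```python
# Sketch: relate the HUGECTR_VERSION_* macro values above to the dotted
# release string and the NGC container tag seen in this commit. The
# zero-padded month is an observation from the diff (24.4.0 -> "24.04"),
# not documented behavior.
HUGECTR_VERSION_MAJOR = 24
HUGECTR_VERSION_MINOR = 4
HUGECTR_VERSION_PATCH = 0


def version_string(major: int, minor: int, patch: int) -> str:
    """Dotted release form, e.g. '24.4.0' as in the commit message."""
    return f"{major}.{minor}.{patch}"


def container_tag(major: int, minor: int) -> str:
    """Calendar tag used on the NGC images, e.g. '24.04'."""
    return f"{major}.{minor:02d}"
```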
2 changes: 1 addition & 1 deletion README.md
@@ -44,7 +44,7 @@ If you'd like to quickly train a model using the Python interface, do the follow

1. Start a NGC container with your local host directory (/your/host/dir mounted) by running the following command:
```
-docker run --gpus=all --rm -it --cap-add SYS_NICE -v /your/host/dir:/your/container/dir -w /your/container/dir -it -u $(id -u):$(id -g) nvcr.io/nvidia/merlin/merlin-hugectr:23.12
+docker run --gpus=all --rm -it --cap-add SYS_NICE -v /your/host/dir:/your/container/dir -w /your/container/dir -it -u $(id -u):$(id -g) nvcr.io/nvidia/merlin/merlin-hugectr:24.04
```

**NOTE**: The **/your/host/dir** directory is just as visible as the **/your/container/dir** directory. The **/your/host/dir** directory is also your starting directory.
@@ -33,12 +33,12 @@ HPS is available within the Merlin Docker containers, which can be accessed thro

To utilize these Docker containers, you will need to install the [NVIDIA Container Toolkit](https://github.com/NVIDIA/nvidia-docker) to provide GPU support for Docker.

-The following sample commands pull and start the Merlin PyTorch container:
+The following sample commands pull and start the Merlin HugeCTR container:

-Merlin PyTorch
+Merlin HugeCTR
```shell
# Run the container in interactive mode
-$ docker run --gpus=all --rm -it --cap-add SYS_NICE nvcr.io/nvidia/merlin/merlin-pytorch:23.12
+$ docker run --gpus=all --rm -it --cap-add SYS_NICE nvcr.io/nvidia/merlin/merlin-hugectr:24.04
```

You can check the existence of the HPS plugin for Torch after launching the container by running the following Python statements:
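The actual check statements are collapsed in this view of the diff. A minimal, hedged way to probe for a plugin without crashing when it is absent is shown below; the module name `hps_torch` is an assumption for illustration, since the real name sits in the collapsed lines.

```python
# Sketch: test whether a plugin module is importable in the current
# environment. "hps_torch" is an assumed name for illustration only; the
# actual import is in the collapsed portion of this diff.
import importlib.util


def plugin_available(module_name: str) -> bool:
    """True if the named top-level module can be found on sys.path."""
    return importlib.util.find_spec(module_name) is not None


if __name__ == "__main__":
    name = "hps_torch"  # assumed plugin module name
    print(f"{name} available: {plugin_available(name)}")
```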
4 changes: 2 additions & 2 deletions docs/source/hierarchical_parameter_server/profiling_hps.md
@@ -67,13 +67,13 @@ To build HPS profiler from source, do the following:
Pull the container using the following command:

```shell
-docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.12
+docker pull nvcr.io/nvidia/merlin/merlin-hugectr:24.04
```

Launch the container in interactive mode (mount the HugeCTR root directory into the container for your convenience) by running this command:

```shell
-docker run --gpus all --rm -it --cap-add SYS_NICE --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 -u root -v $(pwd):/HugeCTR -w /HugeCTR -p 8888:8888 nvcr.io/nvidia/merlin/merlin-hugectr:23.12
+docker run --gpus all --rm -it --cap-add SYS_NICE --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 -u root -v $(pwd):/HugeCTR -w /HugeCTR -p 8888:8888 nvcr.io/nvidia/merlin/merlin-hugectr:24.04
```

3. Here is an example of how you can build HPS Profiler using the build options:
2 changes: 1 addition & 1 deletion docs/source/hugectr_user_guide.md
@@ -83,7 +83,7 @@ The following sample command pulls and starts the Merlin Training container:

```shell
# Run the container in interactive mode
-$ docker run --gpus=all --rm -it --cap-add SYS_NICE nvcr.io/nvidia/merlin/merlin-hugectr:23.12
+$ docker run --gpus=all --rm -it --cap-add SYS_NICE nvcr.io/nvidia/merlin/merlin-hugectr:24.04
```

### Building HugeCTR from Scratch
2 changes: 1 addition & 1 deletion hps_tf/hps_cc/config.hpp
@@ -15,7 +15,7 @@
*/
#pragma once

-// TODO: The configurations are not needed anymore in merlin-base:23.12
+// TODO: The configurations are not needed anymore in merlin-base:24.04
// #include <absl/base/options.h>
// #undef ABSL_OPTION_USE_STD_STRING_VIEW
// #define ABSL_OPTION_USE_STD_STRING_VIEW 0
2 changes: 1 addition & 1 deletion hps_tf/notebooks/hierarchical_parameter_server_demo.ipynb
@@ -58,7 +58,7 @@
"\n",
"### Get HPS from NGC\n",
"\n",
"The HPS Python module is preinstalled in the 23.12 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.12`.\n",
"The HPS Python module is preinstalled in the 24.04 and later [Merlin HugeCTR Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-hugectr:24.04`.\n",
"\n",
"You can check the existence of the required libraries by running the following Python code after launching this container.\n",
"\n",
@@ -58,7 +58,7 @@
"\n",
"### Get HPS from NGC\n",
"\n",
"The HPS Python module is preinstalled in the 23.12 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.12`.\n",
"The HPS Python module is preinstalled in the 24.04 and later [Merlin HugeCTR Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-hugectr:24.04`.\n",
"\n",
"You can check the existence of the required libraries by running the following Python code after launching this container.\n",
"\n",
@@ -58,7 +58,7 @@
"\n",
"### Get HPS from NGC\n",
"\n",
"The HPS Python module is preinstalled in the 23.12 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.12`.\n",
"The HPS Python module is preinstalled in the 24.04 and later [Merlin HugeCTR Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-hugectr:24.04`.\n",
"\n",
"You can check the existence of the required libraries by running the following Python code after launching this container.\n",
"\n",
@@ -57,7 +57,7 @@
"\n",
"### Get HPS from NGC\n",
"\n",
"The HPS Python module is preinstalled in the 23.12 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.12`.\n",
"The HPS Python module is preinstalled in the 24.04 and later [Merlin HugeCTR Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-hugectr:24.04`.\n",
"\n",
"You can check the existence of the required libraries by running the following Python code after launching this container.\n",
"\n",
@@ -58,7 +58,7 @@
"\n",
"### Get HPS from NGC\n",
"\n",
"The HPS Python module is preinstalled in the 23.12 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.12`.\n",
"The HPS Python module is preinstalled in the 24.04 and later [Merlin HugeCTR Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-hugectr:24.04`.\n",
"\n",
"You can check the existence of the required libraries by running the following Python code after launching this container.\n",
"\n",
@@ -854,9 +854,9 @@
"INFO:tensorflow:Automatic mixed precision has been deactivated.\n",
"2022-11-23 01:37:23.028482: I tensorflow/core/grappler/devices.cc:66] Number of eligible GPUs (core count >= 8, compute capability >= 0.0): 1\n",
"2022-11-23 01:37:23.028568: I tensorflow/core/grappler/clusters/single_machine.cc:358] Starting new session\n",
"2022-11-23 01:37:23.121909: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 30991 MB memory: -> device: 0, name: Tesla V100-SXM2-32GB, pci bus id: 0000:06:00.0, compute capability: 7.0\n",
"2022-11-23 01:37:23.128593: W tensorflow/compiler/tf2tensorrt/convert/trt_optimization_pass.cc:198] Calibration with FP32 or FP16 is not implemented. Falling back to use_calibration = False.Note that the default value of use_calibration is True.\n",
"2022-11-23 01:37:23.129761: W tensorflow/compiler/tf2tensorrt/segment/segment.cc:952] \n",
"2022-11-23 01:37:24.041909: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 30991 MB memory: -> device: 0, name: Tesla V100-SXM2-32GB, pci bus id: 0000:06:00.0, compute capability: 7.0\n",
"2022-11-23 01:37:24.048593: W tensorflow/compiler/tf2tensorrt/convert/trt_optimization_pass.cc:198] Calibration with FP32 or FP16 is not implemented. Falling back to use_calibration = False.Note that the default value of use_calibration is True.\n",
"2022-11-23 01:37:24.049761: W tensorflow/compiler/tf2tensorrt/segment/segment.cc:952] \n",
"\n",
"################################################################################\n",
"TensorRT unsupported/non-converted OP Report:\n",
@@ -872,9 +872,9 @@
"For more information see https://docs.nvidia.com/deeplearning/frameworks/tf-trt-user-guide/index.html#supported-ops.\n",
"################################################################################\n",
"\n",
"2022-11-23 01:37:23.129860: W tensorflow/compiler/tf2tensorrt/segment/segment.cc:1280] The environment variable TF_TRT_MAX_ALLOWED_ENGINES=20 has no effect since there are only 1 TRT Engines with at least minimum_segment_size=3 nodes.\n",
"2022-11-23 01:37:23.129893: I tensorflow/compiler/tf2tensorrt/convert/convert_graph.cc:799] Number of TensorRT candidate segments: 1\n",
"2022-11-23 01:37:23.120667: I tensorflow/compiler/tf2tensorrt/convert/convert_graph.cc:916] Replaced segment 0 consisting of 9 nodes by TRTEngineOp_000_000.\n"
"2022-11-23 01:37:24.049860: W tensorflow/compiler/tf2tensorrt/segment/segment.cc:1280] The environment variable TF_TRT_MAX_ALLOWED_ENGINES=20 has no effect since there are only 1 TRT Engines with at least minimum_segment_size=3 nodes.\n",
"2022-11-23 01:37:24.049893: I tensorflow/compiler/tf2tensorrt/convert/convert_graph.cc:799] Number of TensorRT candidate segments: 1\n",
"2022-11-23 01:37:24.040667: I tensorflow/compiler/tf2tensorrt/convert/convert_graph.cc:916] Replaced segment 0 consisting of 9 nodes by TRTEngineOp_000_000.\n"
]
},
{
@@ -58,7 +58,7 @@
"\n",
"### Get SOK from NGC\n",
"\n",
"Both SOK and HPS Python modules are preinstalled in the 23.12 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.12`.\n",
"Both SOK and HPS Python modules are preinstalled in the 24.04 and later [Merlin HugeCTR Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-hugectr:24.04`.\n",
"\n",
"You can check the existence of the required libraries by running the following Python code after launching this container.\n",
"\n",
@@ -13,7 +13,7 @@ If you prefer to build the HugeCTR Docker image on your own, refer to [Set Up th
Pull the container using the following command:

```shell
-docker pull nvcr.io/nvidia/merlin/merlin-pytorch:23.12
+docker pull nvcr.io/nvidia/merlin/merlin-hugectr:24.04
```

### Clone the HugeCTR Repository
@@ -28,7 +28,7 @@ git clone https://github.com/NVIDIA/HugeCTR
1. Launch the container in interactive mode (mount the HugeCTR root directory into the container for your convenience) by running this command:

```shell
-docker run --runtime=nvidia --rm -it --cap-add SYS_NICE -u $(id -u):$(id -g) -v $(pwd):/hugectr -w /hugectr -p 8888:8888 nvcr.io/nvidia/merlin/merlin-pytorch:23.12
+docker run --runtime=nvidia --rm -it --cap-add SYS_NICE -u $(id -u):$(id -g) -v $(pwd):/hugectr -w /hugectr -p 8888:8888 nvcr.io/nvidia/merlin/merlin-hugectr:24.04
```

2. Start Jupyter using these commands:
@@ -55,4 +55,4 @@ The specifications of the system on which each notebook can run successfully are

| Notebook | CPU | GPU | #GPUs | Author |
| -------- | --- | --- | ----- | ------ |
-| [hps_torch_demo.ipynb](hps_torch_demo.ipynb) | Intel(R) Xeon(R) CPU E5-2698 v4 @ 2.20GHz<br />512 GB Memory | Tesla V100-SXM2-32GB<br />32 GB Memory | 1 | Kingsley Liu |
+| [hps_torch_demo.ipynb](hps_torch_demo.ipynb) | Intel(R) Xeon(R) CPU E5-2698 v4 @ 2.20GHz<br />512 GB Memory | Tesla V100-SXM2-32GB<br />32 GB Memory | 1 | Kingsley Liu |
2 changes: 1 addition & 1 deletion hps_torch/notebooks/hps_torch_demo.ipynb
@@ -60,7 +60,7 @@
"\n",
"### Get HPS from NGC\n",
"\n",
"The HPS Python module is preinstalled in the 23.12 and later [Merlin PyTorch Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-pytorch): `nvcr.io/nvidia/merlin/merlin-pytorch:23.12`.\n",
"The HPS Python module is preinstalled in the 24.04 and later [Merlin HugeCTR Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-hugectr:24.04`.\n",
"\n",
"You can check the existence of the required libraries by running the following Python code after launching this container.\n",
"\n",
@@ -1279,17 +1279,17 @@
" ```shell\n",
" git clone https://github.com/NVIDIA-Merlin/Merlin.git\n",
" cd Merlin/docker\n",
" docker build -t nvcr.io/nvstaging/merlin/merlin-base:23.12 -f dockerfile.merlin .\n",
" docker build -t nvcr.io/nvstaging/merlin/merlin-tensorflow:23.12 -f dockerfile.tf .\n",
" docker build -t nvcr.io/nvstaging/merlin/merlin-base:24.04 -f dockerfile.merlin.ctr .\n",
" docker build -t nvcr.io/nvstaging/merlin/merlin-hugectr:24.04 -f dockerfile.ctr .\n",
" cd ../..\n",
" ```\n",
"- **Option B (G+H optimized HugeCTR)**:\n",
" ```shell\n",
" git clone https://github.com/NVIDIA-Merlin/Merlin.git\n",
" cd Merlin/docker\n",
" sed -i -e 's/\" -DENABLE_INFERENCE=ON/\" -DUSE_HUGE_PAGES=ON -DENABLE_INFERENCE=ON/g' dockerfile.merlin\n",
" docker build -t nvcr.io/nvstaging/merlin/merlin-base:23.12 -f dockerfile.merlin .\n",
" docker build -t nvcr.io/nvstaging/merlin/merlin-tensorflow:23.12 -f dockerfile.tf .\n",
" docker build -t nvcr.io/nvstaging/merlin/merlin-base:24.04 -f dockerfile.merlin.ctr .\n",
" docker build -t nvcr.io/nvstaging/merlin/merlin-hugectr:24.04 -f dockerfile.ctr .\n",
" cd ../..\n",
" ````"
]
@@ -1325,7 +1325,7 @@
"\n",
"Your filesystem or system environment might impose constraints. The following command just serves as an example. It assumes HugeCTR was downloaded from GitHub into the current working directory (`git clone https://github.com/NVIDIA-Merlin/HugeCTR.git`). To allow writing files, we first give root user (inside the docker image you are root) to access to the notebook folder (this folder), and then startup a suitable Jupyter server.\n",
"```shell\n",
"export HCTR_SRC=\"${PWD}/HugeCTR\" && chmod -R 777 \"${HCTR_SRC}/hps_trt/notebooks\" && docker run -it --rm --gpus all --network=host -v ${HCTR_SRC}:/hugectr nvcr.io/nvstaging/merlin/merlin-tensorflow:23.12 jupyter-lab --allow-root --ip 0.0.0.0 --port 8888 --no-browser --notebook-dir=/hugectr/hps_trt/notebooks\n",
"export HCTR_SRC=\"${PWD}/HugeCTR\" && chmod -R 777 \"${HCTR_SRC}/hps_trt/notebooks\" && docker run -it --rm --gpus all --network=host -v ${HCTR_SRC}:/hugectr nvcr.io/nvstaging/merlin/merlin-hugectr:24.04 jupyter-lab --allow-root --ip 0.0.0.0 --port 8888 --no-browser --notebook-dir=/hugectr/hps_trt/notebooks\n",
"``` "
]
},
@@ -31,7 +31,7 @@
"\n",
"### Use NGC\n",
"\n",
"The HPS TensorRT plugin is preinstalled in the 23.12 and later [Merlin HugeCTR Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-hugectr:23.12`.\n",
"The HPS TensorRT plugin is preinstalled in the 24.04 and later [Merlin HugeCTR Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-hugectr:24.04`.\n",
"\n",
"You can check the existence of the required libraries by running the following Python code after launching this container."
]
@@ -31,7 +31,7 @@
"\n",
"### Use NGC\n",
"\n",
"The HPS TensorRT plugin is preinstalled in the 23.12 and later [Merlin PyTorch Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-pytorch): `nvcr.io/nvidia/merlin/merlin-pytorch:23.12`.\n",
"The HPS TensorRT plugin is preinstalled in the 24.04 and later [Merlin HugeCTR Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-hugectr:24.04`.\n",
"\n",
"You can check the existence of the required libraries by running the following Python code after launching this container."
]
@@ -31,7 +31,7 @@
"\n",
"### Use NGC\n",
"\n",
"The HPS TensorRT plugin is preinstalled in the 23.12 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.12`.\n",
"The HPS TensorRT plugin is preinstalled in the 24.04 and later [Merlin HugeCTR Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-hugectr:24.04`.\n",
"\n",
"You can check the existence of the required libraries by running the following Python code after launching this container."
]
@@ -19,16 +19,16 @@ git clone https://github.com/NVIDIA/HugeCTR
Pull the container using the following command:

```shell
-docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.12
+docker pull nvcr.io/nvidia/merlin/merlin-hugectr:24.04
```

Launch the container in interactive mode (mount the HugeCTR root directory into the container for your convenience) by running this command:

```shell
-docker run --gpus all --rm -it --cap-add SYS_NICE --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 -u root -v $(pwd):/HugeCTR -w /HugeCTR --network=host --runtime=nvidia nvcr.io/nvidia/merlin/merlin-hugectr:23.12
+docker run --gpus all --rm -it --cap-add SYS_NICE --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 -u root -v $(pwd):/HugeCTR -w /HugeCTR --network=host --runtime=nvidia nvcr.io/nvidia/merlin/merlin-hugectr:24.04
```

-> To run the Sparse Operation Kit notebooks, specify the `nvcr.io/nvidia/merlin/merlin-tensorflow:23.12` container.
+> To run the Sparse Operation Kit notebooks, specify the `nvcr.io/nvidia/merlin/merlin-tensorflow:24.04` container.
## 3. Customized Building (Optional)

@@ -252,7 +252,7 @@ In this release, we have fixed issues and enhanced the code.
```{important}
In January 2023, the HugeCTR team plans to deprecate semantic versioning, such as `v4.3`.
-Afterward, the library will use calendar versioning only, such as `v23.12`.
+Afterward, the library will use calendar versioning only, such as `v23.01`.
```
+ **Support for BERT and Variants**:
@@ -334,7 +334,7 @@ The [HugeCTR Training and Inference with Remote File System Example](https://nvi
```{important}
In January 2023, the HugeCTR team plans to deprecate semantic versioning, such as `v4.2`.
-Afterward, the library will use calendar versioning only, such as `v23.12`.
+Afterward, the library will use calendar versioning only, such as `v23.01`.
```
+ **Change to HPS with Redis or Kafka**:
@@ -11,11 +11,11 @@ HugeCTR is available as buildable source code, but the easiest way to install an

1. Pull the HugeCTR NGC Docker by running the following command:
```bash
-$ docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.12
+$ docker pull nvcr.io/nvidia/merlin/merlin-hugectr:24.04
```
2. Launch the container in interactive mode with the HugeCTR root directory mounted into the container by running the following command:
```bash
-$ docker run --gpus=all --rm -it --cap-add SYS_NICE -u $(id -u):$(id -g) -v $(pwd):/hugectr -w /hugectr nvcr.io/nvidia/merlin/merlin-hugectr:23.12
+$ docker run --gpus=all --rm -it --cap-add SYS_NICE -u $(id -u):$(id -g) -v $(pwd):/hugectr -w /hugectr nvcr.io/nvidia/merlin/merlin-hugectr:24.04
```

### Build the HugeCTR Docker Container on Your Own ###