
Changes to TRT-LLM download tool for multigpu distributed case #3784


Draft: wants to merge 1 commit into base branch nccl_ops_trt_llm_installation

Conversation

@apbose (Collaborator) commented Aug 19, 2025

The download utility tool introduced previously had a functional issue in multi-GPU distributed cases. This PR addresses that.
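The PR body does not include the diff, so the following is an illustration only. A common way a shared download breaks under multi-GPU launches is that every rank races to fetch and write the same file. Below is a minimal sketch of the usual guard (rank 0 downloads, other ranks wait at a barrier); the names `download_trt_llm_plugin`, `plugin_path`, and the exact fix in this PR are assumptions, not taken from the change itself.

```python
# Hypothetical sketch: serialize a shared download across distributed ranks.
# Function and path names are illustrative and do not come from this PR.
import os
import urllib.request

import torch.distributed as dist


def download_trt_llm_plugin(url: str, plugin_path: str) -> str:
    """Download `url` to `plugin_path` at most once per node.

    Only the process with local rank 0 performs the download; the other
    ranks wait at a barrier, avoiding the race where several ranks write
    the same file concurrently.
    """
    # LOCAL_RANK is set by launchers such as torchrun; default to 0 otherwise.
    local_rank = int(os.environ.get("LOCAL_RANK", "0"))

    if local_rank == 0 and not os.path.exists(plugin_path):
        os.makedirs(os.path.dirname(plugin_path), exist_ok=True)
        tmp_path = plugin_path + ".tmp"
        urllib.request.urlretrieve(url, tmp_path)
        os.replace(tmp_path, plugin_path)  # atomically publish the finished file

    # All ranks wait here so non-zero ranks never observe a partial file.
    if dist.is_available() and dist.is_initialized():
        dist.barrier()

    return plugin_path
```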

@apbose apbose marked this pull request as draft August 19, 2025 06:39
@meta-cla meta-cla bot added the cla signed label Aug 19, 2025
@github-actions github-actions bot added the component: tests, component: conversion, component: api [Python], and component: dynamo labels Aug 19, 2025
@github-actions github-actions bot requested a review from gs-olive August 19, 2025 06:39
Labels
cla signed · component: api [Python] · component: conversion · component: dynamo · component: tests