
Adding rank based logging for torch distributed examples #714
Triggered via pull request: December 3, 2025 16:26
Status: Failure
Total duration: 1h 35m 18s
Artifacts: 1
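
For context, the PR being tested adds rank-based logging to the torch distributed examples. The sketch below is only an illustration of that general pattern, not the PR's actual code; the helper names (get_rank, setup_logger) are hypothetical. Each process tags its log records with its torch.distributed rank, and non-zero ranks log less verbosely so multi-process output stays readable.

```python
# Minimal sketch of rank-based logging in a torch.distributed script
# (illustrative only; helper names are hypothetical, not from this PR).
import logging
import os

import torch.distributed as dist


def get_rank() -> int:
    # Fall back to the RANK env var when the process group is not
    # (yet) initialized, e.g. before dist.init_process_group().
    if dist.is_available() and dist.is_initialized():
        return dist.get_rank()
    return int(os.environ.get("RANK", 0))


def setup_logger(name: str = "example") -> logging.Logger:
    rank = get_rank()
    logger = logging.getLogger(name)
    # Prefix every record with the rank so interleaved output from
    # multiple processes stays attributable.
    handler = logging.StreamHandler()
    handler.setFormatter(
        logging.Formatter(f"[rank {rank}] %(levelname)s %(message)s")
    )
    logger.addHandler(handler)
    # Keep full logs on rank 0, quiet the other ranks to warnings only.
    logger.setLevel(logging.INFO if rank == 0 else logging.WARNING)
    return logger


logger = setup_logger()
logger.info("visible on rank 0 only")
logger.warning("visible on every rank, tagged with its rank")
```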

build-test-windows_rtx.yml

on: pull_request
generate-matrix / generate (7s)
filter-matrix (8s)
substitute-runner (3s)
Matrix: RTX - Build Windows torch-tensorrt whl package
Matrix: L0 dynamo converter tests
Matrix: L0 dynamo core tests
Matrix: L0 core python tests
Matrix: L1 dynamo compile tests
Matrix: L1 dynamo core tests
Matrix: L1 torch compile tests
Matrix: L2 dynamo compile tests
Matrix: L2 dynamo core tests
Matrix: L2 dynamo plugin tests
Matrix: L2 torch compile tests

Annotations

21 errors
RTX - Build Windows torch-tensorrt whl package (pytorch/tensorrt, packaging/pre_build_script_wind... / build-wheel-py3_10-cuda13_0
Failed to CreateArtifact: Received non-retryable error: Failed request: (409) Conflict: an artifact with this name already exists on the workflow run
RTX - Build Windows torch-tensorrt whl package (pytorch/tensorrt, packaging/pre_build_script_wind... / build-wheel-py3_10-cuda13_0
Failed to CreateArtifact: Received non-retryable error: Failed request: (409) Conflict: an artifact with this name already exists on the workflow run
L2 dynamo compile tests (pytorch/tensorrt, torch_tensorrt) / L2-dynamo-compile-tests--3.10-cu130
The self-hosted runner lost communication with the server. Verify the machine is running and has a healthy network connection. Anything in your workflow that terminates the runner process, starves it for CPU/Memory, or blocks its network access can cause this error.
L2 dynamo compile tests (pytorch/tensorrt, torch_tensorrt) / L2-dynamo-compile-tests--3.10-cu130
The self-hosted runner lost communication with the server. Verify the machine is running and has a healthy network connection. Anything in your workflow that terminates the runner process, starves it for CPU/Memory, or blocks its network access can cause this error.
L2 dynamo plugin tests (pytorch/tensorrt, torch_tensorrt) / L2-dynamo-plugin-tests--3.10-cu130
The self-hosted runner lost communication with the server. Verify the machine is running and has a healthy network connection. Anything in your workflow that terminates the runner process, starves it for CPU/Memory, or blocks its network access can cause this error.
L2 torch compile tests (pytorch/tensorrt, torch_tensorrt) / L2-torch-compile-tests--3.10-cu130
No JUnit XML file was found. Set `fail-on-empty: false` if that is a valid use case
L2 dynamo core tests (pytorch/tensorrt, torch_tensorrt) / L2-dynamo-core-tests--3.10-cu130
No JUnit XML file was found. Set `fail-on-empty: false` if that is a valid use case
L2 dynamo core tests (pytorch/tensorrt, torch_tensorrt) / L2-dynamo-core-tests--3.10-cu130
No JUnit XML file was found. Set `fail-on-empty: false` if that is a valid use case
L2 torch compile tests (pytorch/tensorrt, torch_tensorrt) / L2-torch-compile-tests--3.10-cu130
No JUnit XML file was found. Set `fail-on-empty: false` if that is a valid use case
L2 dynamo compile tests (pytorch/tensorrt, torch_tensorrt) / L2-dynamo-compile-tests--3.10-cu130
No JUnit XML file was found. Set `fail-on-empty: false` if that is a valid use case
L2 torch compile tests (pytorch/tensorrt, torch_tensorrt) / L2-torch-compile-tests--3.10-cu130
No JUnit XML file was found. Set `fail-on-empty: false` if that is a valid use case
L2 dynamo core tests (pytorch/tensorrt, torch_tensorrt) / L2-dynamo-core-tests--3.10-cu130
No JUnit XML file was found. Set `fail-on-empty: false` if that is a valid use case

Artifacts

Produced during runtime
Name: pytorch_tensorrt__3.10_cu130_x64
Size: 1.56 MB
Digest: sha256:1bdea4cfaf4b73aae47730f6cd395b49cae8d8a8243899d7e3eb199edb27f28d