Add torchvision.transforms.Resize interpolation and antialias. #1441

Open
hiyuh wants to merge 3 commits into main from add-Resize-interpolation-antialias
Conversation

@hiyuh (Contributor) commented Feb 3, 2025

torchvision.transforms.Resize forced nearest interpolation and no antialiasing, but it shouldn't.
Based on my understanding, the original torchvision.transforms.Resize call chain looks like this:

  • torchvision.transforms.Resize
    • torchvision.transforms.functional.resize
      • torchvision.transforms._functional_pil.resize
        • PIL.Image.Image.resize
      • torchvision.transforms._functional_tensor.resize
        • torch.nn.functional.interpolate

Note that this PR still keeps nearest interpolation and no antialiasing by default for torchvision.transforms.Resize, to maximize compatibility with existing code using TorchSharp. This makes the default incompatible with the original torchvision.transforms.Resize default; whether to change it is up to the upstream maintainers.
See also:

  • https://pytorch.org/vision/main/generated/torchvision.transforms.Resize.html
  • https://pytorch.org/vision/main/generated/torchvision.transforms.functional.resize.html
  • https://pytorch.org/docs/stable/generated/torch.nn.functional.interpolate.html
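For illustration, a minimal usage sketch against the ITransform Resize signature proposed in this PR (the using directives are assumptions for illustration, not part of the diff):

    using TorchSharp;
    using static TorchSharp.torchvision;

    // Default keeps the current TorchSharp behaviour: nearest interpolation, no antialiasing.
    ITransform legacy = transforms.Resize(256);

    // Opt in explicitly to match Python torchvision's v0.17+ defaults: bilinear + antialias.
    ITransform modern = transforms.Resize(256, InterpolationMode.Bilinear, antialias: true);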
@hiyuh (Contributor Author) commented Feb 5, 2025

@dotnet-policy-service agree company="HACARUS"

@alinpahontu2912 (Member)
Hey @hiyuh, can you rebase and update releasenotes?

Masaru Kimura added 2 commits March 25, 2025 23:11
@hiyuh hiyuh force-pushed the add-Resize-interpolation-antialias branch from edfd4f5 to f6d6213 Compare March 25, 2025 14:16
@hiyuh (Contributor Author) commented Mar 25, 2025

@alinpahontu2912
Done, please go ahead.

@@ -727,9 +736,12 @@ public static Tensor resize(Tensor input, int height, int width, int? maxSize =
}
}

if (antialias && interpolation != InterpolationMode.Bilinear && interpolation != InterpolationMode.Bicubic)
antialias = false;
Contributor

Instead of silently changing the user's intended value for antialias, I would recommend throwing an informative ArgumentException to warn the user, such as "antialias is true, however ...".
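A minimal sketch of that alternative (the exact message text is only an illustration):

    if (antialias && interpolation != InterpolationMode.Bilinear && interpolation != InterpolationMode.Bicubic)
        throw new ArgumentException(
            "antialias is true, however it is only supported for InterpolationMode.Bilinear and InterpolationMode.Bicubic.",
            nameof(antialias));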

@hiyuh (Contributor Author)
@ozanMSFT

@ozanMSFT (Contributor) commented Apr 7, 2025
@hiyuh, thanks for the reference.

Yes, it seems torchvision has similar behavior, but they added a comment in the relevant part:

        # We manually set it to False to avoid an error downstream in interpolate()
        # This behaviour is documented: the parameter is irrelevant for modes
        # that are not bilinear or bicubic. We used to raise an error here, but
        # now we don't as True is the default.

So, if we're not going with the ArgumentException:

  • Could you please add a comment explaining the antialias = false change, with a reference to the relevant torchvision code?

  • Could you please add a similar explanation (e.g. like below) in the /// <param name="antialias"> section?

    /// antialias value will be automatically set to `false` silently in case interpolation is not InterpolationMode.Bilinear or InterpolationMode.Bicubic
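A sketch of what the commented fallback might look like (the wording and the choice of reference link are assumptions):

    // Mirror torchvision: antialias is only relevant for bilinear/bicubic modes, so silently
    // reset it to false for other modes to avoid an error downstream in interpolate().
    // See https://pytorch.org/vision/main/generated/torchvision.transforms.functional.resize.html
    if (antialias && interpolation != InterpolationMode.Bilinear && interpolation != InterpolationMode.Bicubic)
        antialias = false;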

/// * true: will apply antialiasing for bilinear or bicubic modes. Other modes aren't affected. This is probably what you want to use.
/// * false (default, incompatible with Python's torchvision v0.17 or later, for historical reasons): will not apply antialiasing in any mode.
/// </param>
static public ITransform Resize(int size, InterpolationMode interpolation = InterpolationMode.Nearest, int? maxSize = null, bool antialias = false)
Contributor

There might be some users who are calling this with the Resize(int size, int? maxSize = null) signature.

I suggest keeping the parameter position of maxSize and placing the new default parameters at the end, to avoid breaking existing users.
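A sketch of the suggested ordering (illustrative only; ResizeTransform is a hypothetical name, not the actual internal type):

    // Keeping maxSize in the second position preserves existing positional calls such as
    // Resize(size, maxSize); the new options are appended at the end with defaults.
    static public ITransform Resize(int size, int? maxSize = null,
        InterpolationMode interpolation = InterpolationMode.Nearest, bool antialias = false)
        => new ResizeTransform(size, maxSize, interpolation, antialias);  // hypothetical internal type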

@hiyuh (Contributor Author) commented Mar 31, 2025
@ozanMSFT

  • Tried to add compatibility handling for maxSize.
  • Reverted the test/TorchSharpTest/TestTorchVision.cs change, to demonstrate the compatibility handling.
  • This produces ambiguity errors from Resize(height, width) and resize(input, height, width).
    • Is it possible to resolve this?
    • Or does TorchSharp upstream not care about Python's argument order?
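One way such an ambiguity can arise (a standalone illustration of the C# overload-resolution issue, not necessarily the exact overload set in this branch):

    enum InterpolationMode { Nearest, Bilinear, Bicubic }

    static int Resize(int size, int? maxSize = null) => 0;
    static int Resize(int size, InterpolationMode interpolation = InterpolationMode.Nearest,
                      int? maxSize = null, bool antialias = false) => 1;

    // Resize(224);  // CS0121: the call is ambiguous between the two overloads above, because
    //               // both are applicable with only optional parameters omitted and neither
    //               // is a better match for the single int argument.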

Contributor

Thanks for the change @hiyuh, it looks good.

Regarding your question, we definitely consider PyTorch's signature and argument order, but we should also consider backward compatibility whenever it is possible/applicable.

    this produces ambiguity errors from Resize(height, width) and resize(input, height, width)

I couldn't see the ambiguity in this file; could you please give more details or an example of where this error is happening?

@@ -6668,6 +6668,18 @@ public void TestInterpolateTrilinear()
}
}

Contributor
Could you please add more tests for antialias behaviors? (both for true and false scenarios)
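A sketch of what such tests might look like in TestTorchVision.cs (assumptions: xUnit as in the existing test project, that functional.resize exposes interpolation and antialias as named arguments in this PR, and that Tensor.allclose is available):

    [Fact]
    public void TestResizeAntialias()
    {
        var input = torch.rand(1, 3, 64, 64);

        // Bilinear downscaling: antialiasing should change the result.
        var aa = torchvision.transforms.functional.resize(input, 32, 32,
            interpolation: InterpolationMode.Bilinear, antialias: true);
        var noAa = torchvision.transforms.functional.resize(input, 32, 32,
            interpolation: InterpolationMode.Bilinear, antialias: false);
        Assert.False(aa.allclose(noAa));

        // Nearest: antialias is silently ignored, so both calls should agree.
        var n1 = torchvision.transforms.functional.resize(input, 32, 32,
            interpolation: InterpolationMode.Nearest, antialias: true);
        var n2 = torchvision.transforms.functional.resize(input, 32, 32,
            interpolation: InterpolationMode.Nearest, antialias: false);
        Assert.True(n1.allclose(n2));
    }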

@hiyuh (Contributor Author)

@ozanMSFT

  • I don't see much benefit in maintaining tests for the antialias functionality in TorchSharp, since it is mostly offloaded to libtorch.
  • If you still consider tests for the antialias functionality in TorchSharp mandatory, I can add some by copying the current output as the expectation.

Contributor

@hiyuh, yes please, I would kindly ask you to add the test scenarios.

This is not only about the functionality itself, but also about preventing future accidental regressions and creating awareness of the antialias behavior.

@ozanMSFT (Contributor)
I added a few more comments.

@hiyuh I have some concerns regarding the compatibility between the current TorchSharp's TorchVision and your changes. Could you specify in which version of TorchVision the antialias parameter was introduced, and whether it is compatible with the current TorchSharp's native dependencies?

@ozanMSFT (Contributor)
@hiyuh, if you can give us permission on your branch via the following setting, we can also resolve some RELEASENOTES conflicts on your behalf:

[screenshot: allowing edits by maintainers on the PR branch]

NOTE: We're planning a solution for the releasenotes conflicts.

@hiyuh (Contributor Author) commented Mar 28, 2025

@ozanMSFT

  • Regarding your review comments on several source files, let me try to improve them, so please wait a moment.
  • About allowing maintainers' changes on a pull request: to the best of my knowledge, that option is unavailable when the forked branch belongs to an organization.
    • Next time, I can work under my personal account (not my organization) if you prefer.

@hiyuh (Contributor Author) commented Mar 31, 2025

@ozanMSFT

Could you specify in which version of TorchVision the antialias parameter was introduced, and whether it is compatible with the current TorchSharp's native dependencies?

@hiyuh hiyuh requested a review from ozanMSFT March 31, 2025 23:54