Decompose aten.channel_shuffle op (#4243) #4259
base: main
Conversation
…issue to handle this. For now filtering out failing tests.
…ffle operator was already included.
Hi all, this is a reminder to please provide feedback on this PR, which adds support for the channel shuffle operation in the torch-to-linalg lowering. Thank you @newling @silvasean @rsuderman @zjgarvey @penguin-wwy @rafaelubalmw @sahas3 @vinitdeodhar @alaa-ali @dixinzhou @ramiro050 @qedawkins
LGTM with a minor nit.
Please wait for other reviewers to review before merging. Thanks!
class ChannelShuffleDynamicDims(torch.nn.Module):
    # Basic test case for ChannelShuffle operation.
"Basic test case for ChannelShuffle operation." -> "Basic test case for ChannelShuffle operation with dynamic dimensions."
Support for the channel shuffle operator is added via a torch-dialect-level decomposition (similar to the existing pixel_shuffle decomposition).
The decomposition is based on this specification:
https://docs.pytorch.org/docs/stable/generated/torch.nn.ChannelShuffle.html
and implementation:
aten/src/ATen/native/ChanelShuffle.cpp
https://github.com/pytorch/pytorch/blob/23491519d288dedb2a54cfad5fef7fcb2ad8eade/aten/src/ATen/native/ChanelShuffle.cpp#L4
Note that the operator decomposes into an expansion of the channel dimension into (groups, channels_per_group), a permutation of the expanded channel dimensions, and a contraction of those dimensions back to the original channel size. For example, an input tensor of shape 1x8x4x4 with a group size of 2 generates the torch-dialect MLIR below.
module {
  func.func @channel_shuffle(%arg0: !torch.vtensor<[1,8,4,4],f32>) -> !torch.vtensor<[1,8,4,4],f32> {
    %c0 = torch.constant.int 0
    %c1 = torch.constant.int 1
    %c2 = torch.constant.int 2
    %c3 = torch.constant.int 3
    %c4 = torch.constant.int 4
    %c8 = torch.constant.int 8
    // Expand (1, 8, 4, 4) -> (1, groups=2, 4, 4, 4), permute dims [0, 2, 1, 3, 4], contract back.
    %expand_shape = torch.prim.ListConstruct %c1, %c2, %c4, %c4, %c4 : (!torch.int, !torch.int, !torch.int, !torch.int, !torch.int) -> !torch.list<int>
    %dims = torch.prim.ListConstruct %c0, %c2, %c1, %c3, %c4 : (!torch.int, !torch.int, !torch.int, !torch.int, !torch.int) -> !torch.list<int>
    %contract_shape = torch.prim.ListConstruct %c1, %c8, %c4, %c4 : (!torch.int, !torch.int, !torch.int, !torch.int) -> !torch.list<int>
    %expanded = torch.aten.view %arg0, %expand_shape : !torch.vtensor<[1,8,4,4],f32>, !torch.list<int> -> !torch.vtensor<[1,2,4,4,4],f32>
    %permuted = torch.aten.permute %expanded, %dims : !torch.vtensor<[1,2,4,4,4],f32>, !torch.list<int> -> !torch.vtensor<[1,4,2,4,4],f32>
    %out = torch.aten.view %permuted, %contract_shape : !torch.vtensor<[1,4,2,4,4],f32>, !torch.list<int> -> !torch.vtensor<[1,8,4,4],f32>
    return %out : !torch.vtensor<[1,8,4,4],f32>
  }
}
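The same three-step decomposition (expand, permute, contract) can be sketched as a plain NumPy reference, which may help when checking the lowering's numerics. This is an illustrative sketch, not the torch-mlir implementation; the helper name `channel_shuffle_ref` is made up here:

```python
import numpy as np

def channel_shuffle_ref(x, groups):
    # Hypothetical reference for aten.channel_shuffle on NCHW input.
    n, c, h, w = x.shape
    assert c % groups == 0
    # Expand: (N, C, H, W) -> (N, groups, C // groups, H, W)
    x = x.reshape(n, groups, c // groups, h, w)
    # Permute with dims [0, 2, 1, 3, 4]: swap the group axis and
    # the per-group channel axis.
    x = x.transpose(0, 2, 1, 3, 4)
    # Contract back to the original shape (N, C, H, W).
    return x.reshape(n, c, h, w)

x = np.arange(1 * 8 * 4 * 4, dtype=np.float32).reshape(1, 8, 4, 4)
y = channel_shuffle_ref(x, groups=2)
# With C=8 and groups=2, the new channel order is [0, 4, 1, 5, 2, 6, 3, 7].
```

The permutation `[0, 2, 1, 3, 4]` here matches the `%dims` list in the MLIR above.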
References:
- PyTorch ChannelShuffle definition: https://docs.pytorch.org/docs/stable/generated/torch.nn.ChannelShuffle.html
- ShuffleNet: An Extremely Efficient Convolutional Neural Network for Mobile Devices (2017): https://arxiv.org/pdf/1707.01083
- A Lightweight Dendritic ShuffleNet for Medical Image Classification (2025): https://www.jstage.jst.go.jp/article/transinf/advpub/0/advpub_2024EDP7059/_pdf
- PyTorch implementation (aten/src/ATen/native/ChanelShuffle.cpp): https://github.com/pytorch/pytorch/blob/23491519d288dedb2a54cfad5fef7fcb2ad8eade/aten/src/ATen/native/ChanelShuffle.cpp#L4
Resolves #4243