Update channels last python reference to not use memory_format=channels_last #14035
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/14035
Note: Links to docs will display an error until the docs builds have been completed.
❌ 1 New Failure, 38 Pending, 5 Unrelated Failures as of commit 8a1875e with merge base c3b842f.
NEW FAILURE - The following job has failed:
FLAKY - The following jobs failed but were likely due to flakiness present on trunk:
BROKEN TRUNK - The following jobs failed but were present on the merge base:
👉 Rebase onto the `viable/strict` branch to avoid these failures.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
This pull request was exported from Phabricator. Differential Revision: D81842686
Update channels last python reference to not use memory_format=channels_last (pytorch#14035)
Summary: The default overload of the custom channels last reference assumes that inputs and weights are permuted and contiguous in memory.
Differential Revision: D81842686
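For context, a minimal sketch (not from the PR; the tensor names and sizes are illustrative) of the two layouts in play: `memory_format=torch.channels_last` keeps an NCHW shape and only reorders strides, whereas the layout this reference assumes is an explicitly permuted, contiguous NHWC tensor.

```python
import torch

# Torch channels_last memory format: the shape stays NCHW; only the
# strides are reordered so that channels are innermost in memory.
x = torch.randn(1, 3, 4, 4).to(memory_format=torch.channels_last)
print(x.shape)                                             # torch.Size([1, 3, 4, 4])
print(x.stride())                                          # (48, 1, 12, 3)
print(x.is_contiguous())                                   # False
print(x.is_contiguous(memory_format=torch.channels_last))  # True

# The layout the reference assumes instead: explicitly permuted to an
# NHWC shape and made contiguous in memory.
y = torch.randn(1, 3, 4, 4).permute(0, 2, 3, 1).contiguous()
print(y.shape)                                             # torch.Size([1, 4, 4, 3])
print(y.is_contiguous())                                   # True
```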
Force-pushed from 434d54a to 21c59c6 (Compare)
Force-pushed from 21c59c6 to 7c4efe2 (Compare)
Force-pushed from 7c4efe2 to 2956f93 (Compare)
Force-pushed from 2956f93 to 1e2a0d1 (Compare)
Force-pushed from 1e2a0d1 to 538169e (Compare)
Force-pushed from 538169e to 823bb90 (Compare)
Force-pushed from 823bb90 to e1de305 (Compare)
Update channels last python reference to not use memory_format=channels_last (pytorch#14035)
Summary: Our implementation is supposed to assume that input shapes come in channels last, rather than relying on torch's channels_last memory format; the same applies to output shapes.
Differential Revision: D81842686
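A minimal sketch of what shape-level channels last means here, assuming a conv-style operator; `conv2d_nhwc_reference` and its weight layout are hypothetical, not the PR's actual code. Input and output use NHWC shapes directly, and no tensor is ever tagged with torch's channels_last memory format.

```python
import torch
import torch.nn.functional as F

def conv2d_nhwc_reference(x_nhwc, w_nhwc, bias=None, stride=1, padding=0):
    """Hypothetical channels-last reference working on *shapes*:
    x_nhwc is (N, H, W, C_in), w_nhwc is (C_out, kH, kW, C_in), and the
    result is (N, H_out, W_out, C_out). memory_format is never used."""
    x_nchw = x_nhwc.permute(0, 3, 1, 2)   # NHWC shape -> NCHW for the stock op
    w_nchw = w_nhwc.permute(0, 3, 1, 2)   # -> (C_out, C_in, kH, kW)
    out = F.conv2d(x_nchw, w_nchw, bias, stride=stride, padding=padding)
    return out.permute(0, 2, 3, 1).contiguous()  # back to an NHWC shape

x = torch.randn(1, 8, 8, 3)               # NHWC-shaped input
w = torch.randn(4, 3, 3, 3)               # (C_out=4, kH=3, kW=3, C_in=3)
print(conv2d_nhwc_reference(x, w).shape)  # torch.Size([1, 6, 6, 4])
```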
Force-pushed from e1de305 to 8a1875e (Compare)