Dynamic Shape Simple NN Model Test #4074


Status: Open (wants to merge 4 commits into base: master)
Conversation

@miladm (Collaborator) commented Oct 6, 2022

Testing dynamic shape functionality using a simple model. This PR is meant to identify the gaps in dynamic shape op support. We will keep track of the bugs we address as part of this investigation in this PR.


Blockers

```python
import torch  # needed for torch._C below
import torch_xla.core.xla_model as xm
import numpy

pd = torch._C._EnablePythonDispatcher()
```
A Collaborator commented:
I wonder what this line does.

A Contributor replied:

It enables us to run the Python implementations of CompositeImplicitAutograd ops. CompositeImplicitAutograd means we don't have an explicit backward formula for the op; instead, the op is composed of other ops that do have backward formulas, and combining those formulas is equivalent to differentiating the op explicitly.
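For intuition, here is a minimal pure-Python sketch (illustrative only, not PyTorch dispatcher code; the function names are hypothetical) of why composing the backward formulas of an op's constituent ops matches an explicit backward formula. It uses silu(x) = x * sigmoid(x) as the example composite op:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def silu(x):
    # A "composite" op: built from multiply and sigmoid, each of which
    # has a known derivative, so no explicit backward formula is needed.
    return x * sigmoid(x)

def silu_grad_composed(x):
    # Derivative assembled from the pieces via the product and chain rules:
    # d/dx [x * s(x)] = s(x) + x * s'(x), with s'(x) = s(x) * (1 - s(x)).
    s = sigmoid(x)
    return s + x * s * (1.0 - s)

def silu_grad_numeric(x, eps=1e-6):
    # Central finite difference of the forward composite, as a ground truth.
    return (silu(x + eps) - silu(x - eps)) / (2.0 * eps)

print(abs(silu_grad_composed(0.7) - silu_grad_numeric(0.7)) < 1e-6)  # True
```

The composed gradient agrees with the numerical derivative of the forward function, which is the property the dispatcher relies on when it decomposes an op instead of requiring a hand-written backward.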

@miladm miladm added this to the Dynamic Shape milestone Oct 6, 2022
@ysiraichi ysiraichi added DO_NOT_MERGE Not for merging. and removed DO_NOT_MERGE_YET labels Mar 5, 2025
Labels: DO_NOT_MERGE (Not for merging.), dynamism (Dynamic Shape Features)
4 participants