
failure of ShapeInference pass #224

Open
chentong319 opened this issue Jul 20, 2020 · 6 comments

@chentong319
Collaborator

I think the ShapeInference pass should not simply fail the pass manager when the shape of a tensor cannot be inferred.
ShapeInference may be helped, or made unnecessary, by other passes. Here are some examples:

  1. Constant folding may generate constant inputs for some operations, which then enables AttributePromotion and finally helps ShapeInference.
  2. ScalerOp will be decomposed into other operations by the canonicalization pass, which is invoked after the first ShapeInference pass.

My suggestions are:

  1. ShapeInference should fail only for semantic errors, such as an attribute with an incorrect value or tensor shapes that do not match. Ideally, such checking should be moved into the verifier.
  2. ShapeInference should simply give up (quietly or with some diagnostics) if some needed information is not available yet (see the sketch after this list).
  3. Add a separate pass that checks whether all tensors have known shapes, placed at the right point in the pipeline rather than mixed into ShapeInference.
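
To illustrate suggestion 2, here is a minimal C++ sketch of a best-effort shape-inference walk that skips ops it cannot handle yet instead of failing the whole pass manager. The `ShapeInference` interface, its `inferShapes()` method, the header paths, and the readiness check are assumptions for illustration only, not the exact onnx-mlir code.

```cpp
// Sketch only: interface/method names and headers depend on the MLIR and
// onnx-mlir versions; they are assumptions, not the actual implementation.
#include "mlir/IR/Operation.h"
#include "mlir/IR/StandardTypes.h" // RankedTensorType (older MLIR layout)
#include "llvm/ADT/STLExtras.h"

void inferShapesBestEffort(mlir::Operation *root) {
  root->walk([](mlir::Operation *op) {
    // Hypothetical op interface exposing inferShapes().
    auto shapeOp = llvm::dyn_cast<mlir::ShapeInference>(op);
    if (!shapeOp)
      return;
    // Give up quietly if any operand is still unranked; a later round
    // (after constant folding or decomposition) may succeed.
    bool inputsReady = llvm::all_of(op->getOperands(), [](mlir::Value v) {
      return v.getType().isa<mlir::RankedTensorType>();
    });
    if (!inputsReady)
      return;
    // Only genuine semantic errors reported by the op should fail here.
    if (mlir::failed(shapeOp.inferShapes()))
      op->emitError("shape inference failed due to a semantic error");
  });
}
```

The key point is that a skipped op is not an error: a later invocation of the pass, after other passes have run, gets another chance at it.
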
@tungld
Collaborator

tungld commented Jul 22, 2020

I agree with your suggestions. That would be more flexible.

Just a minor comment: verifying attributes should be done in another pass, outside of or before ShapeInference.
I feel that checking and setting attributes is not the job of ShapeInference.

@tungld
Collaborator

tungld commented Jul 22, 2020

> ScalerOp will be decomposed into other operations by the canonicalization pass, which is invoked after the first ShapeInference pass.

Could we put the rewriting rules for ScalerOp into the Decomposing Pass: https://github.com/onnx/onnx-mlir/blob/master/src/Transform/ONNX/Decompose.td? That way, we don't need to implement ShapeInference for ScalerOp, since the Decomposing Pass is called before ShapeInference.
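
For reference, here is a rough sketch of what that decomposition could look like, written as a C++ rewrite pattern (the real rule would be a declarative DRR pattern in Decompose.td). The accessor names, the helper `createConstantFromAttr`, and the use of unranked result types are assumptions for illustration; ONNX-ML defines Scaler as Y = (X - offset) * scale.

```cpp
// Sketch only: accessors, builders, and createConstantFromAttr are
// illustrative assumptions; the real rule belongs in Decompose.td.
#include "mlir/IR/PatternMatch.h"
// ONNX dialect op definitions come from onnx-mlir (not shown here).

struct DecomposeScalerPattern : public mlir::OpRewritePattern<mlir::ONNXScalerOp> {
  using mlir::OpRewritePattern<mlir::ONNXScalerOp>::OpRewritePattern;

  mlir::LogicalResult
  matchAndRewrite(mlir::ONNXScalerOp scalerOp,
                  mlir::PatternRewriter &rewriter) const override {
    mlir::Location loc = scalerOp.getLoc();
    mlir::Value x = scalerOp.X();

    // Materialize the 'offset' and 'scale' float attributes as constants
    // (createConstantFromAttr is a hypothetical helper).
    mlir::Value offset = createConstantFromAttr(rewriter, loc, scalerOp.offsetAttr());
    mlir::Value scale = createConstantFromAttr(rewriter, loc, scalerOp.scaleAttr());

    // ONNX-ML Scaler: Y = (X - offset) * scale. Use unranked result types;
    // a later ShapeInference round refines them.
    mlir::Type resTy = mlir::UnrankedTensorType::get(rewriter.getF32Type());
    mlir::Value sub = rewriter.create<mlir::ONNXSubOp>(loc, resTy, x, offset);
    mlir::Value mul = rewriter.create<mlir::ONNXMulOp>(loc, resTy, sub, scale);
    rewriter.replaceOp(scalerOp, mul);
    return mlir::success();
  }
};
```

Expressed in Decompose.td this would just be a `Pat<>` rule, and ShapeInference for ScalerOp becomes unnecessary because only Sub/Mul remain by the time it runs.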

@chentong319
Collaborator Author

Yes, we can try Decompose.td.

@doru1004
Collaborator

doru1004 commented Nov 6, 2020

Any updates on this issue? @chentong319 @tungld

@chentong319
Collaborator Author

We are taking a progressive approach to changing shape inference. Some constraints have to be removed when support for dynamic shapes is added.
For the passes, we will move to a fixed-point iterative approach.
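
A rough sketch of the fixed-point idea, with hypothetical pass entry points (the real pipeline, its header paths, and its return values will differ):

```cpp
#include "mlir/IR/Module.h" // ModuleOp (older MLIR layout)

// Illustrative only: these entry points are hypothetical stand-ins for the
// real passes; each is assumed to return true if it changed the module.
bool runConstantFolding(mlir::ModuleOp module);
bool runDecomposition(mlir::ModuleOp module);
bool runShapeInference(mlir::ModuleOp module);

bool runToFixedPoint(mlir::ModuleOp module) {
  const int maxIterations = 10; // guard against non-termination
  for (int i = 0; i < maxIterations; ++i) {
    bool changed = false;
    changed |= runConstantFolding(module); // may create constant inputs
    changed |= runDecomposition(module);   // e.g. ScalerOp -> Sub/Mul
    changed |= runShapeInference(module);  // refine what is now inferable
    if (!changed)
      return true; // fixed point reached
  }
  return false; // did not converge within the iteration budget
}
```

A final checking pass can then report any tensor whose shape is still unknown, as suggested above.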

@doru1004
Collaborator

doru1004 commented Nov 6, 2020

Awesome, sounds like this is still in progress! Thanks for the update.
