
feature/onnx-to-tosa #4

Draft
wants to merge 1,699 commits into main

Conversation

mgehre-amd
Collaborator

DO NOT merge; this PR exists only to look at our diff.

mgehre-amd changed the title from Feature/onnx_to_tosa to feature/onnx-to-tosa on Jun 26, 2023
jorickert and others added 29 commits December 6, 2024 07:28
[AutoBump] Merge with fixes of f5297b0 (Sep 04, requires LLVM bump) (6)
[AutoBump] Merge with c5d3e72 (Sep 06) (7)
[AutoBump] Merge with fixes of 0ac3dc9 (Sep 09) (8)
[AutoBump] Merge with c814ad0 (Sep 10) (9)
This reverts commit 94d2e0b, reversing
changes made to a0ace62.
…n because of unranked operand types (onnx#3023)

Signed-off-by: Jonas Rickert <[email protected]>
[AutoBump] Merge with fixes of 02f45b0 (Sep 11) (10)
[AutoBump] Merge with 97d497f (Sep 13) (11)
[AutoBump] Merge with fixes of 2f2ccc5 (Sep 13) and fix GroupNormV21 for ranks > 4 (12)
[AutoBump] Merge with 37b8393 (Sep 16) (13)
* Update Ops documentation for ONNX 1.16.2

* Fix format

---------

Co-authored-by: Megan Hampton <[email protected]>
[AutoBump] Merge with fixes of fd3eb99 (Sep 16) (14)
jimw567 and others added 30 commits February 11, 2025 14:45
Add options to specify directory with ONNX initializer files
…d_file

Remove accidentally committed file
[AutoBump] Merge with fixes of 86dbaf0 (Jan 08) (3)
[AutoBump] Merge with 0183ad9 (Jan 09) (5)
[AutoBump] Merge with fixes of 7b0dd65 (Jan 14) (6)
[AutoBump] Merge with fixes of b23daa9 (Jan 08) (4)
VAI-7388 ConvTranspose decomposition as four-phase conv2d (see the sketch after the commit list)
[AutoBump] Merge with 2a8b111 (Jan 29) (7)
[AutoBump] Merge with fixes of 2e4a46a (Jan 29) (8)
…295)

This PR adds support for recomposing RMSLayerNormalization from a pattern that uses Pow with an exponent of -0.5 instead of a Reciprocal of a Sqrt op (see the sketch after the commit list).
Support microsoft.Quantize/Dequantize linear
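
For context on the four-phase ConvTranspose commit above, here is a minimal NumPy sketch of the idea, restricted to the simplest case (stride 2, 2x2 kernel, no padding, single batch). This is illustrative only, not the actual decomposition implemented in the PR; the function names are hypothetical.

```python
import numpy as np

def conv_transpose2d(x, w):
    # Reference ConvTranspose: x is [Cin, H, W], w is [Cin, Cout, 2, 2]
    # (ONNX ConvTranspose weight layout), stride 2, no padding.
    cin, h, wd = x.shape
    _, cout, kh, kw = w.shape
    out = np.zeros((cout, h * 2, wd * 2), dtype=x.dtype)
    for y in range(h):
        for xx in range(wd):
            # Each input pixel scatters a 2x2 patch into the output.
            out[:, 2 * y:2 * y + kh, 2 * xx:2 * xx + kw] += np.einsum(
                'c,cokl->okl', x[:, y, xx], w)
    return out

def conv_transpose_as_four_convs(x, w):
    # Phase decomposition: output pixels at offset (dy, dx) within each
    # 2x2 block form a plain stride-1 convolution over the input. With a
    # 2x2 kernel each phase degenerates to a 1x1 conv using the single
    # kernel tap w[:, :, dy, dx]; the four phase results are then
    # interleaved back into the full-resolution output.
    cin, h, wd = x.shape
    _, cout, _, _ = w.shape
    out = np.empty((cout, h * 2, wd * 2), dtype=x.dtype)
    for dy in range(2):
        for dx in range(2):
            phase = np.einsum('chw,co->ohw', x, w[:, :, dy, dx])
            out[:, dy::2, dx::2] = phase
    return out

x = np.random.randn(3, 4, 5).astype(np.float32)
w = np.random.randn(3, 2, 2, 2).astype(np.float32)
assert np.allclose(conv_transpose2d(x, w),
                   conv_transpose_as_four_convs(x, w), atol=1e-5)
```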
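Similarly, for the RMSLayerNormalization recomposition: a minimal NumPy sketch (not the actual onnx-mlir rewrite pattern; function names are hypothetical) showing why the Pow(-0.5) variant computes the same result as the Reciprocal-of-Sqrt variant, which is what makes matching either pattern safe.

```python
import numpy as np

def rms_norm_reciprocal_sqrt(x, eps=1e-5):
    # Previously matched pattern: x * Reciprocal(Sqrt(ReduceMean(x^2) + eps))
    ms = np.mean(np.square(x), axis=-1, keepdims=True)
    return x * (1.0 / np.sqrt(ms + eps))

def rms_norm_pow(x, eps=1e-5):
    # Newly recognized pattern: x * Pow(ReduceMean(x^2) + eps, -0.5)
    ms = np.mean(np.square(x), axis=-1, keepdims=True)
    return x * np.power(ms + eps, -0.5)

x = np.random.randn(2, 8).astype(np.float32)
assert np.allclose(rms_norm_reciprocal_sqrt(x), rms_norm_pow(x), atol=1e-6)
```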