Issues: onnx/onnx-mlir

Issues list

Questions about the Krnl dialect (labels: documentation, question)
Translation from ONNX type to MLIR type (label: ONNX ingestion)
KRNL Lowering for CPUs with Optimizations (label: KRNL IR and lowering)
Questions about graph optimizations (labels: ONNX IR and lowering, optimizations)
Naive cuda support (label: enhancement)
#198 opened Jun 29, 2020 by feiyuvl
SIMD code generation (label: KRNL IR and lowering)
Test cases for the ONNX importer
#218 opened Jul 14, 2020 by tungld
failure of ShapeInference pass
#224 opened Jul 20, 2020 by chentong319
Make llvm a submodule?
#254 opened Aug 7, 2020 by awf
question about export support
#298 opened Sep 11, 2020 by daquexian