Commit 4c014fb

feat: update v5.2

byshiue committed Dec 2, 2022
1 parent b835743 commit 4c014fb
Showing 841 changed files with 1,892,441 additions and 257,080 deletions.
5 changes: 5 additions & 0 deletions .gitignore
@@ -6,3 +6,8 @@ __pycache__/
 .vscode
 ./translation
 .cache
+*.npy
+*.pth
+!tests/data/**/*.npy
+/models
+**/.ipynb_checkpoints/
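The new rules keep NumPy dumps (*.npy) and PyTorch checkpoints (*.pth) out of the repository, while !tests/data/**/*.npy re-includes the .npy fixtures under tests/data; /models and **/.ipynb_checkpoints/ exclude local model directories and Jupyter autosaves. A minimal sketch of the basic wildcard behavior, illustrative only (git itself handles the ! negation and ** recursion, which plain fnmatch does not model):

# Illustrative only: approximate the simple "*.npy" / "*.pth" wildcards
# with fnmatch. The "!tests/data/**/*.npy" re-include and "**" recursion
# are git-specific semantics not modeled here.
from fnmatch import fnmatch

ignore_globs = ["*.npy", "*.pth"]
for name in ["weights.npy", "model.pth", "train.py"]:
    ignored = any(fnmatch(name, g) for g in ignore_globs)
    print(f"{name}: {'ignored' if ignored else 'kept'}")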
3 changes: 3 additions & 0 deletions .gitmodules
@@ -11,3 +11,6 @@
 [submodule "examples/pytorch/vit/ViT-quantization/ViT-pytorch"]
 	path = examples/pytorch/vit/ViT-quantization/ViT-pytorch
 	url = https://github.com/jeonsworld/ViT-pytorch
+[submodule "3rdparty/cutlass"]
+	path = 3rdparty/cutlass
+	url = https://github.com/NVIDIA/cutlass.git
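With CUTLASS now vendored as a submodule, a fresh checkout also needs it fetched, e.g. with the standard git submodule update --init 3rdparty/cutlass (ordinary git usage, not part of this diff).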
2 changes: 1 addition & 1 deletion 3rdparty/trt_fused_multihead_attention/CMakeLists.txt
@@ -20,7 +20,7 @@ set(trt_fused_multi_head_attention_files
   qkvToContext.cu
 )
 
-file(GLOB trt_fused_multi_head_attention_files ${trt_fused_multi_head_attention_files} *.sm*.cpp)
+file(GLOB trt_fused_multi_head_attention_files ${trt_fused_multi_head_attention_files} *sm*.cpp)
 
 add_library(trt_fused_multi_head_attention STATIC ${trt_fused_multi_head_attention_files})
 target_link_libraries(trt_fused_multi_head_attention PUBLIC -lcublas -lcudart)
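The glob appears to have been loosened because the fmha_v2 kernel sources added in this commit put an underscore, not a dot, before the SM tag (e.g. fmha_v2_fp16_Causal_128_32_sm70.cubin.cpp), so the old pattern *.sm*.cpp, which requires a literal ".sm", would miss them. A quick sketch comparing the two patterns against one of the new filenames (illustrative, using Python's fnmatch, whose * behaves like the CMake glob here):

# Sketch: compare the old and new globs against a filename from this commit.
from fnmatch import fnmatch

new_file = "fmha_v2_fp16_Causal_128_32_sm70.cubin.cpp"
print(fnmatch(new_file, "*.sm*.cpp"))  # False: the name has "_sm70", no ".sm"
print(fnmatch(new_file, "*sm*.cpp"))   # True: matched by the loosened pattern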

Large diffs for many additional changed files are not rendered by default.

10,483 changes: 10,483 additions & 0 deletions 3rdparty/trt_fused_multihead_attention/fmha_v2_fp16_Causal_128_32_sm70.cubin.cpp
10,429 changes: 10,429 additions & 0 deletions 3rdparty/trt_fused_multihead_attention/fmha_v2_fp16_Causal_128_32_sm72.cubin.cpp
26,632 changes: 26,632 additions & 0 deletions 3rdparty/trt_fused_multihead_attention/fmha_v2_fp16_Causal_128_32_sm75.cubin.cpp
22,152 changes: 22,152 additions & 0 deletions 3rdparty/trt_fused_multihead_attention/fmha_v2_fp16_Causal_128_32_sm80.cubin.cpp
22,184 changes: 22,184 additions & 0 deletions 3rdparty/trt_fused_multihead_attention/fmha_v2_fp16_Causal_128_32_sm86.cubin.cpp
22,184 changes: 22,184 additions & 0 deletions 3rdparty/trt_fused_multihead_attention/fmha_v2_fp16_Causal_128_32_sm89.cubin.cpp
21,075 changes: 21,075 additions & 0 deletions 3rdparty/trt_fused_multihead_attention/fmha_v2_fp16_Causal_128_40_sm70.cubin.cpp
33,096 changes: 33,096 additions & 0 deletions 3rdparty/trt_fused_multihead_attention/fmha_v2_fp16_Causal_128_40_sm72.cubin.cpp
31,123 changes: 31,123 additions & 0 deletions 3rdparty/trt_fused_multihead_attention/fmha_v2_fp16_Causal_128_40_sm75.cubin.cpp
25,555 changes: 25,555 additions & 0 deletions 3rdparty/trt_fused_multihead_attention/fmha_v2_fp16_Causal_128_40_sm80.cubin.cpp
25,512 changes: 25,512 additions & 0 deletions 3rdparty/trt_fused_multihead_attention/fmha_v2_fp16_Causal_128_40_sm86.cubin.cpp
25,512 changes: 25,512 additions & 0 deletions 3rdparty/trt_fused_multihead_attention/fmha_v2_fp16_Causal_128_40_sm89.cubin.cpp
12,765 changes: 12,765 additions & 0 deletions 3rdparty/trt_fused_multihead_attention/fmha_v2_fp16_Causal_128_64_sm70.cubin.cpp
12,701 changes: 12,701 additions & 0 deletions 3rdparty/trt_fused_multihead_attention/fmha_v2_fp16_Causal_128_64_sm72.cubin.cpp

Large diffs for these generated cubin sources are not rendered by default; the remainder of the 841 changed files is truncated here.
