Commit 708d66c

fix
1 parent 335b350 commit 708d66c

File tree

1 file changed: +1 -0 lines changed

python/sgl_jax/srt/layers/attention/flash_attn_kernel/flash_attention.py

Lines changed: 1 addition & 0 deletions
@@ -332,6 +332,7 @@ def _async_copy(src, dst, sem, wait):
         cp.start()
 
     def _fetch_mask(seq_idx, bq_idx, bkvmask_idx, bkvmask_sem_idx, *, wait=False):
+        assert False, f"@@@@@@@@@@@ {sem.dtype=} {sem.shape=}"
         sem = sems.at[4, bkvmask_sem_idx]
         kvmask_fused_vmem_ref = bkvmask_ref.at[bkvmask_sem_idx]
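The added line is a deliberately failing assert that uses Python's self-documenting f-string specifier (`{expr=}`) to surface the dtype and shape of the semaphore ref at trace time. Below is a minimal standalone sketch of that debugging pattern only; the array is a hypothetical placeholder, not the kernel's actual `sems` ref:

    import jax.numpy as jnp

    # Placeholder array standing in for the kernel's `sems` semaphore ref
    # (hypothetical values; the real object only exists inside the Pallas kernel).
    sem = jnp.zeros((8, 2), dtype=jnp.int32)

    try:
        # `{expr=}` renders both the expression text and its value, so the
        # AssertionError message carries the dtype and shape being inspected.
        assert False, f"@@@@@@@@@@@ {sem.dtype=} {sem.shape=}"
    except AssertionError as err:
        print(err)  # @@@@@@@@@@@ sem.dtype=dtype('int32') sem.shape=(8, 2)

Note that in the committed placement the assert precedes the `sem = sems.at[4, bkvmask_sem_idx]` binding inside `_fetch_mask`, so as written it would likely raise an UnboundLocalError rather than print the formatted message when the function runs.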

0 comments