Simplify the `attention` function (#2609)
* Simplify the `attention` function
  - Use one definition rather than multiple.
  - Add `key`/`value` arguments, so that we don't need the `PREFILL_IN_KVCACHE` constant.
  - Make it kwargs-only (to avoid mixing up the various `Tensor` args).
* Fixup flashinfer support
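The shape of the change can be sketched as follows. This is not the actual text-generation-inference code; the function body, the `KVCache` container, and all parameter names other than `query`/`key`/`value` are illustrative assumptions. It shows the two mechanics the message describes: a single kwargs-only definition (the bare `*` forces callers to name every tensor argument) that takes explicit `key`/`value` arguments instead of selecting a path via a `PREFILL_IN_KVCACHE`-style constant.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class KVCache:
    # Hypothetical container for the cached key/value tensors.
    key: list
    value: list


def attention(
    *,  # kwargs-only: the Tensor args can never be mixed up positionally
    query,
    key,    # keys to attend over, passed explicitly by the caller
    value,  # values to attend over, passed explicitly by the caller
    kv_cache: Optional[KVCache] = None,
    softmax_scale: float = 1.0,
):
    # One definition serves both prefill and decode: the caller decides
    # which key/value tensors to pass in, instead of this function
    # branching on a global constant.
    del kv_cache, softmax_scale  # unused in this illustrative stub
    return {"query": query, "key": key, "value": value}


# A positional call is rejected, which is the point of the `*` marker:
try:
    attention([1.0], [2.0], [3.0])
except TypeError:
    print("positional call rejected")

out = attention(query=[1.0], key=[2.0], value=[3.0])
print(out["key"])
```

Making the function kwargs-only turns an easy silent bug (swapping two same-shaped tensors) into an immediate `TypeError` at the call site.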