
Conversation

timmoon10 (Collaborator)

Description

#2012 added support for attention with FP8 current scaling, with a minimum requirement of cuDNN 9.13.1. However, I have observed some correctness errors with that version, so this PR bumps the minimum version to cuDNN 9.14.0.

Type of change

  • Documentation change (change only to the documentation, either a fix or a new content)
  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • Infra/Build change
  • Code refactoring

Changes

  • Require cuDNN 9.14.0+ for fused attention with FP8 current scaling
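
For context, below is a minimal sketch of how such a version gate could look on the PyTorch side. It is illustrative only, not the actual Transformer Engine code: the helper name `fp8_current_scaling_attention_supported` is made up for this example, and the decoding assumes the cuDNN 9.x version scheme (major * 10000 + minor * 100 + patch, e.g. 91400 for 9.14.0).

```python
# Hypothetical sketch of a minimum-cuDNN-version gate for fused attention
# with FP8 current scaling. Not the actual Transformer Engine implementation.
import torch


def get_cudnn_version() -> tuple[int, int, int]:
    """Return the cuDNN version as a (major, minor, patch) tuple.

    Assumes the cuDNN 9.x encoding of torch.backends.cudnn.version(),
    e.g. 91400 -> (9, 14, 0).
    """
    raw = torch.backends.cudnn.version()
    if raw is None:  # cuDNN not available in this build
        return (0, 0, 0)
    return (raw // 10000, (raw % 10000) // 100, raw % 100)


def fp8_current_scaling_attention_supported() -> bool:
    """Fused attention with FP8 current scaling requires cuDNN 9.14.0+."""
    return get_cudnn_version() >= (9, 14, 0)
```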

Checklist:

  • I have read and followed the contributing guidelines
  • The functionality is complete
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes

@timmoon10 requested a review from cyanguwa on October 4, 2025 at 02:03.
@timmoon10 added the bug label on October 4, 2025.
@timmoon10 (Collaborator, Author) commented:

/te-ci pytorch

@timmoon10 changed the title from "[PyTorch] Bump minimum cuDNN version fused attention with FP8 current scaling" to "[PyTorch] Bump minimum cuDNN version for fused attention with FP8 current scaling" on October 4, 2025.
@ksivaman (Member) commented on October 9, 2025:

/te-ci pytorch

@ksivaman (Member) left a review comment:

LGTM, what is the numerical bug?

Labels: bug (Something isn't working)