Fix torch.nonzero inconsistency and n-gram range bug in evaluation.py #715
Summary
This PR addresses inconsistencies in PyTorch tensor operations and fixes a critical bug in the n-gram evaluation functionality that was causing incorrect predictions.
Issues Fixed
1. torch.nonzero() Inconsistency
The ngram() and update_ngram_scores() functions used different approaches to extract non-zero indices. Both approaches produce identical results, but the second is more direct and efficient, so this PR standardizes on the cleaner approach (sketched below).
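The original diff is not reproduced here; as a minimal sketch, assuming the two call sites looked roughly like the following (the tensor name mask is hypothetical), one variant builds an (N, 1) index tensor and flattens it manually, while the other asks torch.nonzero for tuple output and gets a 1-D index tensor directly:

```python
import torch

mask = torch.tensor([0, 3, 0, 7, 2])

# One approach: nonzero() returns an (N, 1) tensor of indices,
# which then has to be flattened with squeeze().
indices_a = torch.nonzero(mask != 0).squeeze(-1)

# The more direct approach: as_tuple=True yields one 1-D index
# tensor per dimension, so no reshaping step is needed.
indices_b = torch.nonzero(mask != 0, as_tuple=True)[0]

assert torch.equal(indices_a, indices_b)  # identical results
```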
2. Critical N-gram Range Bug
The ngram() function had an off-by-one error in the loop that iterates over n-gram sequences. The loop bound stopped one position early, so the function skipped the last possible n-gram in each sequence, potentially leading to incorrect classification predictions. A sketch of the fix follows.
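The PR body does not include the diff itself; the following is a minimal sketch of the classic form of this bug, assuming the loop slices n-grams out of a token sequence (the names iter_ngrams, tokens, and n are hypothetical):

```python
def iter_ngrams(tokens, n):
    """Yield every contiguous n-gram in tokens."""
    # Buggy bound: range(len(tokens) - n) stops one window short,
    # silently dropping the final n-gram tokens[len(tokens) - n:].
    # The correct bound includes the last starting position:
    for i in range(len(tokens) - n + 1):
        yield tuple(tokens[i:i + n])

# For a 5-token sequence and n=2 there are 4 bigrams, not 3.
assert len(list(iter_ngrams("a b c d e".split(), 2))) == 4
```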
3. Modernized Tensor Operations
Updated a deprecated tensor size access pattern to the current idiom, as sketched below.
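The exact lines are not shown in the PR body; as a hedged example, a common modernization of this kind (the tensor name scores is hypothetical) replaces indexing into the result of .size() with a direct dimension argument or the .shape attribute:

```python
import torch

scores = torch.zeros(4, 10)

# Older pattern: call size() with no arguments, then index the result.
batch_size = scores.size()[0]

# Modern equivalents: pass the dimension directly, or use .shape.
assert scores.size(0) == batch_size
assert scores.shape[0] == batch_size
```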
Testing
Impact
The changes are minimal and surgical, maintaining full backward compatibility while improving correctness and efficiency.