@22dimensions (Collaborator) commented Sep 5, 2025

What this PR does / why we need it?

The quantization patch is unused code.

Does this PR introduce any user-facing change?

No

How was this patch tested?

Tested by CI.

@gemini-code-assist (Contributor, bot) left a comment


Code Review

This pull request provides a good cleanup by removing the unused quantization patching mechanism. The refactoring simplifies the codebase by deleting the func_wrapper.py file, its corresponding tests, and the complex monkey-patching logic in utils.py. Moving the necessary functionality from the patch directly into the AscendVocabParallelEmbedding class is a solid improvement for maintainability. I have one suggestion to complete the cleanup.
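The refactor the review describes — deleting an import-time monkey patch (the `func_wrapper.py` mechanism) and putting the behavior directly on the `AscendVocabParallelEmbedding` subclass — can be sketched roughly as follows. The class and method bodies here are illustrative stand-ins, not vLLM's actual APIs:

```python
# Sketch of replacing a monkey patch with a subclass override.
# VocabParallelEmbedding and the method shown are simplified stand-ins
# for the real vLLM classes; the masking logic is purely illustrative.

class VocabParallelEmbedding:
    """Stand-in for the upstream embedding layer."""
    def get_masked_input(self, token_id: int) -> int:
        return token_id

# Before (the pattern this PR removes): a func_wrapper.py-style patch
# swapped the method on the upstream class at import time, e.g.
#   VocabParallelEmbedding.get_masked_input = patched_get_masked_input
# which is fragile and hard to trace.

# After: the Ascend-specific behavior lives directly on the subclass,
# so there is no global mutation of the upstream class.
class AscendVocabParallelEmbedding(VocabParallelEmbedding):
    def get_masked_input(self, token_id: int) -> int:
        # Illustrative platform-specific behavior: clamp negatives.
        return max(token_id, 0)

layer = AscendVocabParallelEmbedding()
print(layer.get_masked_input(-3))  # prints 0
```

The subclass approach keeps the override local to the class that needs it, which is the maintainability gain the review points out.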


github-actions bot commented Sep 5, 2025

👋 Hi! Thank you for contributing to the vLLM Ascend project. The following points will speed up your PR merge:

  • A PR should do only one thing; smaller PRs enable faster reviews.
  • Every PR should include unit tests and end-to-end tests to ensure it works and is not broken by future PRs.
  • Write a clear commit message and fill out the PR description to help reviewers and future developers understand the change.

If CI fails, you can run the linting and testing checks locally; see the Contributing and Testing guides.


codecov bot commented Sep 5, 2025

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 72.82%. Comparing base (dd087ef) to head (b052105).
⚠️ Report is 3 commits behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #2785      +/-   ##
==========================================
+ Coverage   72.61%   72.82%   +0.20%     
==========================================
  Files         154      152       -2     
  Lines       21318    21077     -241     
==========================================
- Hits        15481    15349     -132     
+ Misses       5837     5728     -109     
Flag        Coverage            Δ
unittests   72.82% <100.00%>    +0.20% ⬆️

Flags with carried forward coverage won't be shown. Click here to find out more.
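The coverage numbers in the diff above are internally consistent, and a quick computation shows why coverage rose: the PR deletes 241 lines, of which 132 were covered (hits) and 109 uncovered (misses), so the removed code was covered at a lower rate than the project average. A short sanity check:

```python
# Sanity-check the Codecov diff: base (dd087ef) vs head (b052105).
base_hits, base_lines = 15481, 21318
head_hits, head_lines = 15349, 21077

base_cov = base_hits / base_lines                                  # ≈ 72.6%
head_cov = head_hits / head_lines                                  # ≈ 72.8%
removed_cov = (base_hits - head_hits) / (base_lines - head_lines)  # ≈ 54.8%

# Removing code that is below-average covered raises overall coverage.
print(f"{base_cov:.2%} -> {head_cov:.2%}; deleted code was {removed_cov:.2%} covered")
```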

☔ View full report in Codecov by Sentry.

Signed-off-by: 22dimensions <[email protected]>
@wangxiyuan merged commit d51694a into vllm-project:main Sep 8, 2025
23 of 25 checks passed
1Fire4 pushed a commit to 1Fire4/vllm-ascend that referenced this pull request Sep 8, 2025
…2785)

### What this PR does / why we need it?
quantization patch is unused code

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
tested by CI

- vLLM version: v0.10.1.1
- vLLM main:
vllm-project/vllm@f4962a6

Signed-off-by: 22dimensions <[email protected]>
Signed-off-by: 1Fire4 <[email protected]>
1Fire4 pushed a commit to 1Fire4/vllm-ascend that referenced this pull request Sep 9, 2025
Angazenn pushed a commit to Angazenn/vllm-ascend that referenced this pull request Sep 10, 2025