Upgrade to Transformers v4.48 #1698

Draft · wants to merge 64 commits into main

Conversation

regisss (Collaborator) commented Jan 15, 2025

What does this PR do?

As per title.

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests?

regisss and others added 30 commits September 2, 2024 16:51
vidyasiv mentioned this pull request on Jan 28, 2025
regisss (Collaborator, Author) commented Jan 30, 2025

@regisss v4.48.1 is released. Should we switch to that one? It fixes multiple issues. I noticed the LLaMA-Factory upgrade PR also goes directly to this version and emits a warning for versions 4.46.0-4.48.0: https://github.com/hiyouga/LLaMA-Factory/pull/6628/files#diff-78c85f34fa653e9d750cad5a41efa8bda242c096dd06b2fad00e08c25cb48b85R97-R103

Yes, I'll add the changes brought by the patch in the next few days
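For illustration, the kind of version guard described in the linked LLaMA-Factory PR might look roughly like the sketch below; the actual check lives in that PR, and the warning message here is only a placeholder.

```python
# Illustrative version guard (not the actual LLaMA-Factory code): warn on
# transformers 4.46.0-4.48.0 and recommend 4.48.1 or later.
import warnings

import transformers
from packaging import version

current = version.parse(transformers.__version__)
if version.parse("4.46.0") <= current <= version.parse("4.48.0"):
    warnings.warn(
        "transformers versions 4.46.0-4.48.0 are discouraged here; "
        "please upgrade to 4.48.1 or later.",
        stacklevel=2,
    )
```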

regisss (Collaborator, Author) commented Jan 31, 2025

@yafshar @dsocek This branch is now aligned with Transformers v4.48.2, which was released a few hours ago.

yafshar (Contributor) commented Jan 31, 2025

Awesome, @regisss! Thanks, I will start doing some testing and integrating other work with this branch.

yafshar (Contributor) commented Feb 6, 2025

@regisss can you please cherry-pick huggingface/transformers#35438 to this branch? It is relevant to def training_step(, specifically the with self.compute_loss_context_manager(): block.
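For reference, here is a minimal, self-contained sketch of the training_step / compute_loss_context_manager pattern referenced above. This is not the upstream transformers.Trainer code; SketchTrainer, ToyModel, and the num_items_in_batch handling are illustrative placeholders only.

```python
import contextlib

import torch


class SketchTrainer:
    """Simplified stand-in for the relevant part of transformers.Trainer."""

    def compute_loss_context_manager(self):
        # The real Trainer may return an autocast or similar context here;
        # a no-op context keeps this sketch self-contained.
        return contextlib.nullcontext()

    def compute_loss(self, model, inputs, num_items_in_batch=None):
        loss = model(**inputs)
        if num_items_in_batch is not None:
            # Optional normalization across accumulated batches (illustrative).
            loss = loss / num_items_in_batch
        return loss

    def training_step(self, model, inputs, num_items_in_batch=None):
        model.train()
        # The loss computation is wrapped in compute_loss_context_manager(),
        # which is the code path the cherry-pick request refers to.
        with self.compute_loss_context_manager():
            loss = self.compute_loss(model, inputs, num_items_in_batch=num_items_in_batch)
        loss.backward()
        return loss.detach()


class ToyModel(torch.nn.Module):
    """Tiny module whose forward returns a scalar 'loss' for the sketch."""

    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(4, 1)

    def forward(self, x):
        return self.linear(x).pow(2).mean()


if __name__ == "__main__":
    trainer = SketchTrainer()
    print(trainer.training_step(ToyModel(), {"x": torch.randn(2, 4)}))
```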

regisss (Collaborator, Author) commented Feb 7, 2025

@yafshar huggingface/transformers#35438 was introduced in v4.48.0, and I later cherry-picked huggingface/transformers#35651, which fixes it. Is there anything specific to huggingface/transformers#35438 that you would like to have here?

yafshar (Contributor) commented Feb 7, 2025

Thanks, in that case no change is needed (my mistake). I was working on an older commit and ran into the issue, which is fixed by that PR and was changed again later.
