Correctly drop tokens in SwitchTransformer #37123

Open
wants to merge 2 commits into main
Conversation

@mario-aws mario-aws commented Mar 31, 2025

What does this PR do?

Previously, dropped tokens were passed through as an identity, yet still carried an expert weight that was never applied to the hidden states. This was misleading: dropping a token means its expert weight is zero. Instead of trying to fix the weight, we take a simpler approach and initialize the output with zeros.

Fixes #37017
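
In code terms, the change amounts to initializing the combine buffer with zeros rather than a clone of the input, so tokens that no expert accepts contribute nothing to the MoE output. Below is a minimal sketch of the idea; `combine_expert_outputs` is a hypothetical standalone function, and the real `SwitchTransformersSparseMLP.forward` differs in detail:

```python
import torch

def combine_expert_outputs(hidden_states, router_mask, router_probs, experts):
    """Sketch of top-1 MoE combining with token dropping.

    hidden_states: (batch, seq, d_model)
    router_mask:   (batch, seq, num_experts) one-hot; all-zero rows mark dropped tokens
    router_probs:  (batch, seq, 1) top-1 routing probability
    experts:       list of per-expert modules
    """
    # Before the fix: next_states = hidden_states.clone()
    # Dropped tokens then kept their input *and* were scaled by router_probs,
    # an identity path with a nonzero weight that was never meant to apply.
    next_states = torch.zeros_like(hidden_states)  # after the fix

    for idx, expert in enumerate(experts):
        token_indices = router_mask[:, :, idx].bool()
        if token_indices.any():
            next_states[token_indices] = expert(hidden_states[token_indices]).to(next_states.dtype)

    # Dropped tokens now contribute exactly zero here; the residual
    # connection around the MLP is what carries them through unchanged.
    return router_probs * next_states
```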

Related work

https://github.com/tensorflow/mesh/blob/e6798a2610a2c2f4c4cd236d8214422cb1ecc00a/mesh_tensorflow/transformer/moe.py#L1144 notes that the output for dropped tokens needs to be zeroed out.

https://github.com/tensorflow/mesh/blob/master/mesh_tensorflow/transformer/moe.py#L507C18-L507C31 combines the expert results without any clone initialization beforehand.
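
To make the difference concrete, here is a hypothetical worked example for a single dropped token, reusing the `combine_expert_outputs` sketch above:

```python
import torch

# One token of dimension 4 that both experts reject (capacity exceeded).
x = torch.tensor([[[1.0, 2.0, 3.0, 4.0]]])   # (batch=1, seq=1, d_model=4)
router_mask = torch.zeros(1, 1, 2)            # all-zero row: token is dropped
router_probs = torch.full((1, 1, 1), 0.7)     # top-1 prob is still nonzero
experts = [torch.nn.Linear(4, 4), torch.nn.Linear(4, 4)]

out = combine_expert_outputs(x, router_mask, router_probs, experts)
print(out)  # tensor([[[0., 0., 0., 0.]]]): the dropped token contributes nothing
# Before the fix (next_states = x.clone()), this would instead yield 0.7 * x,
# a spurious scaled-identity output for a token no expert ever processed.
```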

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

text models: @ArthurZucker
person who last made major changes: @zucchini-nlp
person who requested the PR: @Rocketknight1

@github-actions github-actions bot marked this pull request as draft March 31, 2025 03:54

Hi 👋, thank you for opening this pull request! The pull request is converted to draft by default. The CI will be paused while the PR is in draft mode. When it is ready for review, please click the Ready for review button (at the bottom of the PR page). This will assign reviewers and trigger CI.

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@marthos1

marthos1 commented Apr 2, 2025

Would love to be part of this project one day.
I'm not a coder, but I do have a vision... :)

@mario-aws
Author

@Rocketknight1

@Rocketknight1
Member

LGTM but I'm not an expert on MoE routing! @zucchini-nlp @ArthurZucker if you're happy with it feel free to merge
