
regression test #234


Merged: 1 commit merged into pytorch:main on Jul 15, 2025

Conversation

@tushar00jain (Contributor) commented Jul 14, 2025

Summary:

  • add a test that uses fixtures to validate against previous runs of the test
  • the fixtures can be written using `WRITE_FIXTURE=true pytest -vs ./torchft/test_diloco_mocked_updates.py`
  • when writing fixtures, the test also numerically validates the implementation of streaming DiLoCo
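The fixture workflow described above can be sketched roughly as follows. This is a minimal illustration of the pattern only; `check_against_fixture`, its signature, and the paths are hypothetical, not the actual torchft test code:

```python
# Hypothetical sketch of the WRITE_FIXTURE pattern; names are illustrative,
# not the actual torchft implementation.
import json
import os


def check_against_fixture(name: str, state: dict, fixture_dir: str = "test_fixtures") -> None:
    """Write `state` as a fixture when WRITE_FIXTURE=true, otherwise
    compare it against the previously recorded fixture."""
    path = os.path.join(fixture_dir, f"{name}.json")
    if os.environ.get("WRITE_FIXTURE", "").lower() == "true":
        # Record mode: persist the current state for future runs to check.
        os.makedirs(fixture_dir, exist_ok=True)
        with open(path, "w") as f:
            json.dump(state, f, indent=2, sort_keys=True)
    else:
        # Check mode: fail if the numerics drift from the recorded values.
        with open(path) as f:
            expected = json.load(f)
        assert state == expected, f"state diverged from fixture {path}"
```

Running the test once with `WRITE_FIXTURE=true` records the model state; subsequent runs without the flag fail if the computed values drift from what was recorded.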

@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Meta Open Source bot. label Jul 14, 2025
@tushar00jain tushar00jain requested review from d4l3k and H-Huang July 14, 2025 17:49
@tushar00jain tushar00jain force-pushed the pr234 branch 3 times, most recently from b58a3de to 5e73cc3 Compare July 14, 2025 18:42
Member:
Should we move these to a /test/* dir? It might get real annoying to grep through torchft/ if there's a bunch of testing specific autogenned files

tushar00jain (author):
done

Member:
ah I was thinking top level not under the torchft dir

I often find stuff by just running rg foo torchft so was hoping to have it excluded

tushar00jain (author):
  • moved to /test_fixtures
  • keep the test file with the implementation file
  • renamed test folder to _test

An excerpt of the generated fixture file (truncated in the diff view):

```json
"0": {
  "layers.0.weight": [
    [
      1.0
```
Member:

is it expected that all of these are whole numbers? I would expect the weights to be a lot more varied?

@tushar00jain (author) commented Jul 15, 2025:

Yeah, I'm optimizing for human readability. For example, I was able to run the numerics by hand and validate the output in these files myself; that helped in writing the validation logic and caught issues that I fixed in the previous diff. Also, using floats might force us to do a closeness check because of precision issues (different machines might do things differently). I think we do get full validation of the logic using just whole numbers, though.
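The precision concern mentioned above can be demonstrated with a small sketch in plain Python (illustrative only, not part of the PR):

```python
import math

# Accumulating 0.1 ten times is not exactly 1.0 in binary floating point,
# so exact equality against a stored fixture value can fail across machines.
total = sum([0.1] * 10)
print(total == 1.0)              # exact comparison: False
print(math.isclose(total, 1.0))  # closeness check: True

# Small whole numbers sidestep this: they are represented exactly in
# float32/float64, so fixtures can safely be compared with plain ==.
print(sum([1.0] * 10) == 10.0)   # True
```

This is why whole-number weights let the fixture comparison use exact equality instead of a tolerance-based check like `math.isclose`.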

@tushar00jain tushar00jain force-pushed the pr234 branch 6 times, most recently from 9378ece to d5f0b32 Compare July 15, 2025 17:38
@d4l3k (Member) left a comment:

LGTM

Summary:
- add a test that uses fixtures to validate against previous runs of the test
- the fixtures can be written using `WRITE_FIXTURE=true pytest -vs ./torchft/test_diloco_mocked_updates.py`
- when writing fixtures, the test also numerically validates the implementation of streaming DiLoCo
@tushar00jain tushar00jain merged commit 3edb84e into pytorch:main Jul 15, 2025
15 checks passed
@tushar00jain tushar00jain deleted the pr234 branch July 15, 2025 22:12