Bert training update #781
Conversation
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
LGTM, thanks!
I see that you generate the tutorial from the notebook. I think it is a great approach because it saves a lot of work. I have one small nit about it though: is it possible to not have the commands in the docs start with `!`? I know it's needed in the notebook to run bash commands, but it does not really make sense for the tutorial, wdyt?
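One way to address this nit would be a small post-processing step on the converted Markdown that strips the leading `!` Jupyter requires for shell commands. The function and file contents below are a hypothetical sketch, not part of this PR's actual conversion pipeline:

```python
import re

def strip_shell_prefix(markdown: str) -> str:
    """Remove the leading "!" (Jupyter shell-escape) from lines in
    converted Markdown, e.g. "!pip install ..." -> "pip install ...".
    Lines not starting with "!" are left untouched."""
    return re.sub(r"^!(?=\S)", "", markdown, flags=re.MULTILINE)

# Illustrative example of converted notebook output:
converted = "```bash\n!pip install optimum-neuron\n```\n"
print(strip_shell_prefix(converted))
```

In practice this could run after the notebook-to-Markdown conversion, so the notebook keeps the `!` it needs while the rendered docs show plain shell commands.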
@michaelbenayoun I do not have a strong opinion towards/against the `!` prefix.
Since the tutorial is available in both doc and notebook format, keep only the notebook, which can be converted to md.
Otherwise the tutorial fails.
- Some links were wrong; they have been updated
- Some rewording was necessary
- The training script has been updated and simplified
- Transformers' `Trainer` is used in this example instead of `NeuronTrainer`, because otherwise an error prevents training from succeeding
- Output is updated accordingly
This tutorial does pretty much the same as the SFT LoRA tutorial, so it seems redundant. It is hence removed.
force-pushed 9b65363 to aea15dc to resolve conflicts.
What does this PR do?
In this PR the BERT fine-tuning tutorial is updated to make it actually work and provide a reproducible result.
In the process, the text and script have been updated to reflect the necessary changes.
The notebook now produces the documentation.
Finally, the old Llama tutorial has been removed since it is redundant.
Before submitting