
Gemma2 - 27B model with JAX/Flax implementation #115

Open
mini-goel opened this issue Feb 18, 2025 · 2 comments

Comments

@mini-goel

Hello, is there a Gemma 2 27B model with a JAX/Flax implementation available that is fine-tunable? The Kaggle page has one, but it says it is not fine-tunable: https://www.kaggle.com/models/google/gemma-2/flax

@Gopi-Uppari

Hi @mini-goel,

There is no official JAX/Flax implementation for fine-tuning Gemma 2 27B yet. However, there are existing resources on fine-tuning smaller Gemma models with JAX and Flax. These are useful for understanding the fine-tuning process and could potentially be adapted for larger models in the future.

  1. Fine-tuning Gemma using JAX and Flax: this guide walks through fine-tuning the Gemma 2B Instruct model for tasks like English-French translation using the gemma library alongside JAX and Flax (a minimal training-step sketch follows after this list).

  2. Fine-tuning RecurrentGemma using JAX and Flax: this guide focuses on the RecurrentGemma 2B Instruct model, providing insights into fine-tuning with JAX and Flax for similar translation tasks.

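For reference, here is a minimal sketch of what a next-token fine-tuning step looks like in JAX/Flax. The `ToyLM` module below is a hypothetical stand-in for the actual Gemma Transformer (which the guides above load via the gemma library); the module, shapes, and hyperparameters are illustrative assumptions, not the gemma API:

```python
import jax
import jax.numpy as jnp
import optax
import flax.linen as nn

VOCAB = 256  # toy vocabulary size, for illustration only


class ToyLM(nn.Module):
    """Hypothetical stand-in for a Gemma-style causal LM."""
    features: int = 64

    @nn.compact
    def __call__(self, tokens):                      # tokens: [batch, seq] int32
        x = nn.Embed(num_embeddings=VOCAB, features=self.features)(tokens)
        x = nn.Dense(self.features)(nn.relu(x))
        return nn.Dense(VOCAB)(x)                    # logits: [batch, seq, vocab]


model = ToyLM()
rng = jax.random.PRNGKey(0)
dummy = jnp.zeros((2, 16), dtype=jnp.int32)
params = model.init(rng, dummy)["params"]

optimizer = optax.adamw(learning_rate=1e-5)
opt_state = optimizer.init(params)


def loss_fn(params, tokens, mask):
    logits = model.apply({"params": params}, tokens)
    # Shift so position t predicts token t+1; padded positions are masked out.
    losses = optax.softmax_cross_entropy_with_integer_labels(
        logits[:, :-1, :], tokens[:, 1:])
    mask = mask[:, 1:]
    return (losses * mask).sum() / mask.sum()


@jax.jit
def train_step(params, opt_state, tokens, mask):
    loss, grads = jax.value_and_grad(loss_fn)(params, tokens, mask)
    updates, opt_state = optimizer.update(grads, opt_state, params)
    params = optax.apply_updates(params, updates)
    return params, opt_state, loss


# One toy step on random data, just to show the call pattern.
tokens = jax.random.randint(rng, (2, 16), 0, VOCAB)
mask = jnp.ones_like(tokens, dtype=jnp.float32)
params, opt_state, loss = train_step(params, opt_state, tokens, mask)
print(float(loss))
```

In practice you would replace `ToyLM` with the Gemma model and checkpoint loading described in the guides above; the optimizer and train-step structure stay essentially the same.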
Thank you.

@Conchylicultor (Collaborator)

Hey @mini-goel, yes, this repository is the official implementation for fine-tuning Gemma 2. We have a few examples:

Let us know if something isn't working for you!
