Hello, is there a fine-tunable JAX/Flax implementation of the Gemma 2 27B model available? The Kaggle page has one, but it states that it is 'not' fine-tunable: https://www.kaggle.com/models/google/gemma-2/flax
There is no official JAX/Flax implementation for fine-tuning Gemma 2 27B yet. However, there are existing guides on fine-tuning smaller Gemma models with JAX and Flax. These are useful for understanding the fine-tuning process and could potentially be adapted for larger models in the future.
Fine-tuning Gemma using JAX and Flax: This guide shows how to fine-tune the Gemma 2B Instruct model for tasks such as English-French translation using the gemma library alongside JAX and Flax.
Fine-tuning RecurrentGemma using JAX and Flax: This guide focuses on the RecurrentGemma 2B Instruct model, providing insights into fine-tuning with JAX and Flax for similar translation tasks. A rough sketch of the training step those guides build is shown below.
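For a rough idea of what such a fine-tuning step looks like, here is a minimal sketch of a supervised training step in JAX with optax. It assumes a Flax-style `apply_fn(params, tokens) -> logits` and batches with `inputs`, `targets`, and `mask` arrays; these names are illustrative placeholders, not the gemma library's actual API, which the official guides document in full.

```python
# Minimal JAX/optax fine-tuning step sketch.
# Assumptions (not the gemma library's real API): apply_fn(params, tokens)
# returns per-token logits, and each batch dict has "inputs", "targets",
# and "mask" integer/float arrays of shape [batch, seq].
import jax
import jax.numpy as jnp
import optax


def make_train_step(apply_fn, optimizer):
    """Build a jitted training step for next-token cross-entropy fine-tuning."""

    def loss_fn(params, batch):
        logits = apply_fn(params, batch["inputs"])            # [batch, seq, vocab]
        log_probs = jax.nn.log_softmax(logits, axis=-1)
        # Log-likelihood of each target token.
        token_ll = jnp.take_along_axis(
            log_probs, batch["targets"][..., None], axis=-1
        )[..., 0]
        mask = batch["mask"]                                  # 1 = real token, 0 = padding
        return -(token_ll * mask).sum() / mask.sum()

    @jax.jit
    def train_step(params, opt_state, batch):
        loss, grads = jax.value_and_grad(loss_fn)(params, batch)
        updates, opt_state = optimizer.update(grads, opt_state, params)
        params = optax.apply_updates(params, updates)
        return params, opt_state, loss

    return train_step


# Example wiring (apply_fn and params would come from the model checkpoint):
# optimizer = optax.adamw(learning_rate=1e-5)
# train_step = make_train_step(model.apply, optimizer)
# opt_state = optimizer.init(params)
# params, opt_state, loss = train_step(params, opt_state, batch)
```

The same pattern applies to larger models; the practical limits for 27B are memory and sharding, which is part of why only the smaller Gemma checkpoints currently have fine-tuning guides.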