Describe the bug

Hello @lawrence-cj,

I am using Sana via diffusers. The issue applies to both of these repos, and possibly to the 512/1024 variants as well, though I have not tested those.

When I specify bf16 and use_safetensors, only the bf16 weights should be downloaded, not the fp32 ones. This works as expected for text_encoder and vae, but not for transformer.

Reproduction

Logs

Message printed during inference:

A mixture of bf16 and non-bf16 filenames will be loaded.
Loaded bf16 filenames:
[transformer/diffusion_pytorch_model.bf16.safetensors, text_encoder/model.bf16-00001-of-00002.safetensors, text_encoder/model.bf16-00002-of-00002.safetensors, vae/diffusion_pytorch_model.bf16.safetensors]
Loaded non-bf16 filenames:
[transformer/diffusion_pytorch_model-00001-of-00002.safetensors, transformer/diffusion_pytorch_model-00002-of-00002.safetensors]
If this behavior is not expected, please check your folder structure.

System Info

Not needed, as this is a Hugging Face repo setup issue.
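For illustration, the expected behavior can be sketched as a filename filter: given all weight files in the repo (the names below are taken verbatim from the log), selecting the bf16 variant should keep only the `.bf16.safetensors` / `.bf16-XXXXX-of-XXXXX.safetensors` files for every component, including transformer. Note that `select_bf16_files` is a hypothetical helper written for this sketch, not the actual diffusers implementation.

```python
import re

def select_bf16_files(filenames):
    # Match both single-file variants ("....bf16.safetensors") and
    # sharded variants ("....bf16-00001-of-00002.safetensors").
    variant_re = re.compile(r"\.bf16(-\d{5}-of-\d{5})?\.safetensors$")
    return [f for f in filenames if variant_re.search(f)]

# All weight files present in the repo, per the log above.
repo_files = [
    "transformer/diffusion_pytorch_model.bf16.safetensors",
    "transformer/diffusion_pytorch_model-00001-of-00002.safetensors",
    "transformer/diffusion_pytorch_model-00002-of-00002.safetensors",
    "text_encoder/model.bf16-00001-of-00002.safetensors",
    "text_encoder/model.bf16-00002-of-00002.safetensors",
    "vae/diffusion_pytorch_model.bf16.safetensors",
]

# Only the bf16 files should be selected; the fp32 transformer shards
# should be excluded rather than downloaded alongside them.
print(select_bf16_files(repo_files))
```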
Who can help?
@lawrence-cj