failed: checkpoint has shape (1, 1, 1, 512) which is incompatible with the model shape (512,) #99
Comments
Good day, we detected this same issue/bug today.
I'm not really sure where it comes from.
@borisdayma @Mahyar-Ali https://huggingface.co/transformers/_modules/transformers/modeling_flax_utils.html in this area:
Still getting the same issue here. Hoping for a fix.
There's a suggested fix in the linked issue.
This looks like an issue with Flax (huggingface/transformers#14215).
The issue is with the new Flax release. To fix this, we could set the min flax version to match it.
For people having this issue, a temporary fix is to install a previous version of flax:
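A sketch of that workaround, assuming 0.3.5 was the last Flax release before the normalization refactor (the exact version is not stated in the thread):

```
# Hypothetical pin: substitute the last Flax release that still used
# (1, 1, 1, C)-shaped normalization parameters.
pip install flax==0.3.5
```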
I didn't realize this when doing the refactor, but it's a feature, not a bug. The shape is now consistent with other normalization layers and avoids unnecessarily constraining the input rank for GroupNorm layers. I'm sorry for the checkpoint inconsistency, those are very annoying. This particular one can usually be resolved with a reshape of the checkpoint parameters.
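A minimal sketch of that kind of fix-up, assuming the checkpoint is a nested parameter dict; the helper below is hypothetical, not code from the thread:

```python
# Hypothetical helper: squeeze old (1, 1, 1, C)-shaped normalization
# params down to the (C,) shape that newer Flax expects.
import jax.numpy as jnp
from flax.traverse_util import flatten_dict, unflatten_dict

def squeeze_norm_params(params):
    # params is assumed to be a plain nested dict of arrays.
    flat = flatten_dict(params)
    for path, value in flat.items():
        # Old GroupNorm scale/bias carried leading singleton axes.
        if path[-1] in ("scale", "bias") and value.ndim > 1:
            flat[path] = jnp.squeeze(value)
    return unflatten_dict(flat)
```

After converting, the checkpoint would be saved again so it loads cleanly under the new Flax version.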
Thanks @jheek for the quick answer! No problem, we can update our checkpoint. @patil-suraj, do we need to update anything in your repo or can I just recreate a new checkpoint with the most recent flax version?
Yes, we need to update the flax version in the repo.
This has been fixed; you can use the new checkpoint. It's different from the original one we use in the inference notebook because that one was fine-tuned on other images.
Original issue description: When trying to load the VQModel using `from_pretrained`, it fails and generates the error message shown in the issue title.
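For context, a minimal reproduction along the lines of that description might look like this; the import path follows @patil-suraj's vqgan-jax repo and the checkpoint id is a placeholder, both assumptions on my part:

```python
# Hypothetical reproduction; the checkpoint id is a placeholder.
from vqgan_jax.modeling_flax_vqgan import VQModel

# With a Flax version newer than the normalization refactor, loading a
# checkpoint saved under an older Flax raises the shape error above:
#   checkpoint has shape (1, 1, 1, 512) which is incompatible with the
#   model shape (512,)
model = VQModel.from_pretrained("some-org/vqgan-checkpoint")
```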