I get the error RuntimeError: "LayerNormKernelImpl" not implemented for 'Half'. However, if I use model = ChatGLMForConditionalGeneration.from_pretrained("./model", torch_dtype=torch.float32), I get RuntimeError: mixed dtype (CPU): expect parameter to have scalar type of BFloat16, and if I use ChatGLMForConditionalGeneration.from_pretrained("./model", torch_dtype=torch.bfloat16), I get RuntimeError: mixed dtype (CPU): expect parameter to have scalar type of Float.
How can I run the code correctly?
What a coincidence! I encountered the same problem as you. I found a blog post that might be useful to you.
To summarize, the likely cause is that the model is not placed on the GPU: the code casts the model with half(), but half-precision kernels (such as LayerNorm) are only implemented on the GPU, not the CPU.
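If that is the cause, one common pattern (a sketch, not tested against ChatGLM itself; the `./model` path is taken from the messages above) is to pick the dtype from the available device:

```python
import torch

def pick_dtype_and_device():
    # Half-precision kernels such as LayerNorm are only implemented on the
    # GPU, so fall back to full float32 when no CUDA device is available.
    if torch.cuda.is_available():
        return torch.float16, "cuda"
    return torch.float32, "cpu"

dtype, device = pick_dtype_and_device()
# Note the argument name is `torch_dtype`, not `torch_type`:
# model = ChatGLMForConditionalGeneration.from_pretrained(
#     "./model", torch_dtype=dtype).to(device)
```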
I can only use the CPU, and I ran the code below, which still fails with the same errors:

import torch
from transformers import AutoTokenizer, AutoModel
from modeling_chatglm import ChatGLMForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("./model", trust_remote_code=True)
model = ChatGLMForConditionalGeneration.from_pretrained("./model").half()

while True:
    a = input("Enter your question (type q to quit): ")
    if a.strip() == 'q':
        break
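On a CPU-only machine, the usual fix is to replace .half() with .float() so every parameter stays in float32. A minimal sketch with a bare LayerNorm standing in for the model (I have not run the full ChatGLM checkpoint here):

```python
import torch

# model.half() puts every parameter in fp16; on the CPU the LayerNorm
# kernel has no fp16 implementation, hence the 'Half' error.
ln = torch.nn.LayerNorm(8).half()

# Casting back with .float() restores float32 everywhere, which both the
# LayerNorm kernel and the mixed-dtype check accept on the CPU.
ln = ln.float()
out = ln(torch.randn(2, 8))
```

Applied to the script above, that would mean loading with ChatGLMForConditionalGeneration.from_pretrained("./model").float() instead of .half().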