Runtime error #55
Comments
I'm running into the same thing. Have you solved it?
I hit the same problem as well.
Same problem here. Has anyone managed to solve it?
Take a look at the project's README.md, where ...
I rolled back to this repository and still get the same error, ValueError: 130001 is not in list. Full traceback:
/home/remotesense/anaconda3/envs/bencao/lib/python3.9/site-packages/transformers/generation/utils.py:1201: UserWarning: You have modified the pretrained model configuration to control generation. This is a deprecated strategy to control generation and will be removed soon, in a future version. Please use a generation configuration file (see https://huggingface.co/docs/transformers/main_classes/text_generation)
warnings.warn(
Traceback (most recent call last):
File "/media/remotesense/c076bdaf-88b9-4573-88f1-b4bdb3af3183/jack/chatglm-med/Med-ChatGLM-main/infer.py", line 12, in <module>
response, history = model.chat(tokenizer, "问题:" + a.strip() + '\n答案:', max_length=256, history=[])
File "/home/remotesense/anaconda3/envs/bencao/lib/python3.9/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "/media/remotesense/c076bdaf-88b9-4573-88f1-b4bdb3af3183/jack/chatglm-med/Med-ChatGLM-main/modeling_chatglm.py", line 1114, in chat
outputs = self.generate(**input_ids, **gen_kwargs)
File "/home/remotesense/anaconda3/envs/bencao/lib/python3.9/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "/home/remotesense/anaconda3/envs/bencao/lib/python3.9/site-packages/transformers/generation/utils.py", line 1452, in generate
return self.sample(
File "/home/remotesense/anaconda3/envs/bencao/lib/python3.9/site-packages/transformers/generation/utils.py", line 2465, in sample
model_inputs = self.prepare_inputs_for_generation(input_ids, **model_kwargs)
File "/media/remotesense/c076bdaf-88b9-4573-88f1-b4bdb3af3183/jack/chatglm-med/Med-ChatGLM-main/modeling_chatglm.py", line 979, in prepare_inputs_for_generation
mask_position = seq.index(mask_token)
ValueError: 130001 is not in list
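
The crash is in prepare_inputs_for_generation: the bundled modeling_chatglm.py searches the encoded prompt for special-token id 130001 (seq.index(mask_token)) and does not find it, which usually means the tokenizer files come from a different ChatGLM release than this modeling code. Below is a minimal diagnostic sketch, not taken from the repo, to check which ids the tokenizer actually emits; the "./model" path and the test prompt are placeholders.

from transformers import AutoTokenizer

# Load the tokenizer that ships with the checkpoint you are running
# (placeholder path; point it at the directory infer.py loads from).
tokenizer = AutoTokenizer.from_pretrained("./model", trust_remote_code=True)

# Same prompt format that infer.py builds: "问题:" + question + "\n答案:"
prompt = "问题:测试\n答案:"
ids = tokenizer(prompt)["input_ids"]
print(ids)

# modeling_chatglm.py expects to find mask token id 130001 in these ids.
print("contains 130001:", 130001 in ids)

If 130001 never appears in the output, the tokenizer files and modeling_chatglm.py are likely from mismatched ChatGLM snapshots; keeping the weights, tokenizer files, and modeling code all from the same source that this repository was built against is the usual way such a mismatch is resolved.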