Loading a fine-tuned ChatGLM-6B checkpoint fails with an AttributeError during model construction:

from transformers import AutoTokenizer, AutoModel
model_med = AutoModel.from_pretrained("./chatglm-6b-med/", trust_remote_code=True)
File ~/.cache/huggingface/modules/transformers_modules/modeling_chatglm.py:818, in ChatGLMModel.__init__(self, config, empty_init)
816 self.hidden_size_per_attention_head = self.hidden_size // self.num_attention_heads
817 self.position_encoding_2d = config.position_encoding_2d
--> 818 self.pre_seq_len = config.pre_seq_len
819 self.prefix_projection = config.prefix_projection
821 self.word_embeddings = init_method(
822 torch.nn.Embedding,
823 num_embeddings=self.vocab_size, embedding_dim=self.hidden_size,
824 dtype=self.params_dtype
825 )
File /opt/conda/envs/xs_llm/lib/python3.8/site-packages/transformers/configuration_utils.py:260, in PretrainedConfig.__getattribute__(self, key)
    258 if key != "attribute_map" and key in super().__getattribute__("attribute_map"):
    259     key = super().__getattribute__("attribute_map")[key]
--> 260 return super().__getattribute__(key)
AttributeError: 'ChatGLMConfig' object has no attribute 'pre_seq_len'
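
The traceback shows the cause: the cached modeling_chatglm.py reads config.pre_seq_len (and, two lines below, config.prefix_projection) unconditionally in ChatGLMModel.__init__, while the config.json in ./chatglm-6b-med/ predates these P-tuning v2 fields, so PretrainedConfig.__getattribute__ raises. Below is a minimal workaround sketch, assuming the checkpoint was not fine-tuned with P-tuning v2; the attribute names come straight from the traceback, but the None/False defaults are my assumption. It patches the config instance before loading the weights.

from transformers import AutoConfig, AutoModel

# Load the checkpoint's config first so the missing fields can be patched in.
config = AutoConfig.from_pretrained("./chatglm-6b-med/", trust_remote_code=True)

# Assumed defaults: in the official ChatGLM-6B code, pre_seq_len=None should
# bypass the prefix encoder. If the checkpoint *was* trained with P-tuning v2,
# set pre_seq_len to the prefix length used during training instead.
if not hasattr(config, "pre_seq_len"):
    config.pre_seq_len = None
if not hasattr(config, "prefix_projection"):
    config.prefix_projection = False

model_med = AutoModel.from_pretrained(
    "./chatglm-6b-med/", config=config, trust_remote_code=True
)

Equivalently, adding "pre_seq_len": null and "prefix_projection": false to the checkpoint's config.json should work, as should making sure the checkpoint's configuration_chatglm.py/modeling_chatglm.py pair comes from the same ChatGLM-6B revision, since the error indicates the cached modeling code is newer than the config class that produced the checkpoint.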