
Running on CPU: after applying the fix from #42, fails with "transformers_modules" not found #171

Closed
1 task done
gongjimin opened this issue Mar 20, 2023 · 2 comments

Comments

@gongjimin

Is there an existing issue for this?

  • I have searched the existing issues

Current Behavior

When execution reaches `tokenizer = AutoTokenizer.from_pretrained("../chatglm", trust_remote_code=True)`, the following error is raised:

Explicitly passing a `revision` is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Users\godspeed\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\models\auto\tokenization_auto.py", line 663, in from_pretrained
    tokenizer_class = get_class_from_dynamic_module(
  File "C:\Users\godspeed\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\dynamic_module_utils.py", line 399, in get_class_from_dynamic_module
    return get_class_in_module(class_name, final_module.replace(".py", ""))
  File "C:\Users\godspeed\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\dynamic_module_utils.py", line 177, in get_class_in_module
    module = importlib.import_module(module_path)
  File "C:\Users\godspeed\AppData\Local\Programs\Python\Python310\lib\importlib\__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 992, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 992, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 992, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1004, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'transformers_modules.'

Expected Behavior

No response

Steps To Reproduce

  1. Following #42 (cannot run without an NVIDIA GPU; the pure-CPU setup in the README does not work), cloned the model locally and modified part of the code
  2. Running it produces the error above

Environment

- OS: Windows 11 21H2
- Python: 3.10.8
- Transformers: 4.27.1
- PyTorch: 2.0.0
- CUDA Support (`python -c "import torch; print(torch.cuda.is_available())"`) : No

Anything else?

No response

@gongjimin
Author

I suspect this is caused by the Transformers version being too new.
After running `pip uninstall transformers` and then `pip install transformers==4.26.1`, the problem was solved.
To keep things compatible going forward, should the source code be updated?
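The pin described above can be captured as a small pre-flight check. This is only a sketch based on this report (not code from the repo, and `version_tuple`/`is_known_good` are made-up helper names): the custom-code import failed under transformers 4.27.1 here but worked under 4.26.1.

```python
# Sketch of a version guard reflecting this report: ChatGLM's remote code
# imported fine under transformers 4.26.1 but failed under 4.27.1.

def version_tuple(v: str) -> tuple:
    """'4.26.1' -> (4, 26, 1); keeps numeric components only, for comparison."""
    return tuple(int(p) for p in v.split(".") if p.isdigit())

def is_known_good(installed: str, known_good: str = "4.26.1") -> bool:
    """True when the installed version matches the version reported to work."""
    return version_tuple(installed) == version_tuple(known_good)
```

With transformers installed, `importlib.metadata.version("transformers")` supplies the installed version string to pass as `installed`.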

@duzx16
Member

duzx16 commented Mar 21, 2023

Thanks for the heads-up; this has been fixed.

@duzx16 duzx16 closed this as completed Mar 21, 2023