
Cannot run without an NVIDIA GPU; the CPU-only mode described in the README doesn't work #42

Closed
coderstory opened this issue Mar 15, 2023 · 3 comments

Comments

@coderstory

PS D:\ppp\ChatGLM-6B> python web_demo.py
Explicitly passing a `revision` is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.
Explicitly passing a `revision` is encouraged when loading a configuration with custom code to ensure no malicious code has been contributed in a newer revision.
Explicitly passing a `revision` is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.
Traceback (most recent call last):
  File "D:\ppp\ChatGLM-6B\web_demo.py", line 5, in <module>
    model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).float()
  File "C:\Program Files\Python310\lib\site-packages\transformers\models\auto\auto_factory.py", line 455, in from_pretrained
    model_class = get_class_from_dynamic_module(
  File "C:\Program Files\Python310\lib\site-packages\transformers\dynamic_module_utils.py", line 363, in get_class_from_dynamic_module
    final_module = get_cached_module_file(
  File "C:\Program Files\Python310\lib\site-packages\transformers\dynamic_module_utils.py", line 274, in get_cached_module_file
    get_cached_module_file(
  File "C:\Program Files\Python310\lib\site-packages\transformers\dynamic_module_utils.py", line 237, in get_cached_module_file
    modules_needed = check_imports(resolved_module_file)
  File "C:\Program Files\Python310\lib\site-packages\transformers\dynamic_module_utils.py", line 129, in check_imports
    importlib.import_module(imp)
  File "C:\Program Files\Python310\lib\importlib\__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "C:\Program Files\Python310\lib\site-packages\cpm_kernels\__init__.py", line 1, in <module>
    from . import library
  File "C:\Program Files\Python310\lib\site-packages\cpm_kernels\library\__init__.py", line 2, in <module>
    from . import cuda
  File "C:\Program Files\Python310\lib\site-packages\cpm_kernels\library\cuda.py", line 7, in <module>
    cuda = Lib.from_lib("cuda", ctypes.WinDLL("nvcuda.dll"))
  File "C:\Program Files\Python310\lib\ctypes\__init__.py", line 374, in __init__
    self._handle = _dlopen(self._name, mode)
FileNotFoundError: Could not find module 'nvcuda.dll' (or one of its dependencies). Try using the full path with constructor syntax.

Following the README, I changed the line to model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).float()
but I still get the same error.
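The traceback shows that cpm_kernels tries to load the NVIDIA driver library (ctypes.WinDLL("nvcuda.dll")) unconditionally at import time, which is what fails on a machine without an NVIDIA GPU. A minimal sketch of that same check (the helper name has_nvcuda is illustrative, not part of any library):

```python
import ctypes

def has_nvcuda() -> bool:
    """Return True if the NVIDIA driver DLL loads, i.e. whether the
    exact call that fails in the traceback above would succeed."""
    try:
        # cpm_kernels runs this at import time on Windows.
        ctypes.WinDLL("nvcuda.dll")
        return True
    except (OSError, AttributeError):
        # OSError: no NVIDIA driver installed.
        # AttributeError: non-Windows, where ctypes has no WinDLL.
        return False
```

On a CPU-only machine this returns False, matching the FileNotFoundError above.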

@coderstory
Author

Digging around a bit: this workaround was derived from #4, and judging by the replies there, the approach is problematic.

@coderstory coderstory changed the title from "Cannot run without an NVIDIA GPU" to "Cannot run without an NVIDIA GPU; the CPU-only mode described in the README doesn't work" Mar 15, 2023
@yaleimeng

A 6B-parameter model is huge; don't torture yourself.
You can run some tests on platforms like Hugging Face or Colab instead.

@duzx16
Member

duzx16 commented Mar 15, 2023

Similar to #6 (comment).
Cloning the repository to a local directory first and then running from there resolves this problem.
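Combining the suggestion above with the README's CPU fallback, a sketch of a loader that reads the model from a local clone and only uses .float() when no CUDA device is present (the local path ./chatglm-6b and the helper name load_chatglm are assumptions for illustration):

```python
def load_chatglm(path="./chatglm-6b"):
    """Sketch: load ChatGLM-6B from a local clone of the model repo.

    Uses half precision on a CUDA GPU, and falls back to full FP32
    precision (.float()) on CPU-only machines, as the README suggests.
    """
    # Imports are deferred so the helper can be defined even where
    # torch/transformers are not installed.
    import torch
    from transformers import AutoModel

    model = AutoModel.from_pretrained(path, trust_remote_code=True)
    if torch.cuda.is_available():
        return model.half().cuda().eval()  # GPU: FP16
    return model.float().eval()            # CPU-only: FP32
```

Note that FP32 inference for a 6B-parameter model needs roughly 24 GB of RAM for the weights alone, so the warning in the comment above still applies.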
