
LLaMAConfig missing #782

Closed
2-Ritual opened this issue Jul 25, 2024 · 3 comments
@2-Ritual

Issue Type

Support

Modules Involved

Documentation/Tutorial/Example

Have you reproduced the bug with SPU HEAD?

Yes

Have you searched existing issues?

Yes

SPU Version

spu 0.9.3

OS Platform and Distribution

openEuler 22.03

Python Version

3.9.9

Compiler Version

gcc 10.3.1

Current Behavior?

Following the Flax Llama-7B Example with Puma walkthrough, running:
cd examples/python/ml/flax_llama7b
python flax_llama7b_split.py --model_path dir-to-flax-llama7b-EasyLM --config ./3pc.json
fails with: No module named 'EasyLM.models.llama.llama_model_splited_transformer'
Trying flax_llama7b.py instead fails with: cannot import name 'LLaMAConfig' from 'EasyLM.models.llama.llama_model'

Standalone code to reproduce the issue

print("A bug")

Relevant log output

No response

@anakinxc
Collaborator

@Ye-D mind taking a look? Thanks

@Ye-D
Contributor

Ye-D commented Jul 25, 2024

EasyLM has updated their library; it seems LLaMAConfig has been renamed to LLaMAConfigurator (https://github.com/young-geng/EasyLM/blob/981a2ed9630f44258a94b6f44dff2b7bd203ae8d/EasyLM/models/llama/llama_model.py#L32C7-L32C24). You can try adapting to this new version, or use an older one, e.g., https://github.com/young-geng/EasyLM/tree/08_31_2023
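A minimal sketch of a compatibility probe, assuming only the two symbol names mentioned in this thread (LLaMAConfig in older EasyLM revisions, LLaMAConfigurator in newer ones):

```python
# Probe which config class the installed EasyLM revision exposes.
# Older trees (e.g., 08_31_2023) ship LLaMAConfig, which the
# flax_llama7b example expects; newer revisions rename it to
# LLaMAConfigurator.
try:
    from EasyLM.models.llama.llama_model import LLaMAConfig
    print("old-style EasyLM: LLaMAConfig found; flax_llama7b.py should import it")
except ImportError:
    from EasyLM.models.llama.llama_model import LLaMAConfigurator  # noqa: F401
    print("new-style EasyLM: LLaMAConfig was renamed to LLaMAConfigurator")
```

To pin the older revision, check out the tree linked above, e.g. git clone https://github.com/young-geng/EasyLM.git && cd EasyLM && git checkout 08_31_2023.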

@anakinxc
Collaborator

No activity; closing.
