Explain --max-gpu-memory parameter #2202

Merged 1 commit on Aug 13, 2023
6 changes: 4 additions & 2 deletions README.md
@@ -112,9 +112,11 @@ python3 -m fastchat.serve.cli --model-path lmsys/vicuna-7b-v1.3
```

#### Multiple GPUs
You can use model parallelism to aggregate GPU memory from multiple GPUs on the same machine.
When `--max-gpu-memory` is not specified, `kwargs['device_map']` is set to `sequential` instead of the desired `auto`. Passing the argument, e.g. `--max-gpu-memory "10GiB"`, can resolve out-of-memory errors when loading large models.
`Tips`: Remember to set the `--max-gpu-memory` parameter. For example, when loading 13B-16K and 33B models across five 32 GB GPUs, the loader by default fills the first GPU's memory completely, which repeatedly triggers out-of-memory errors. Setting this parameter to `20GiB` resolved the issue. Choose a value appropriate for the actual memory of a single GPU on your machine.
```
python3 -m fastchat.serve.cli --model-path lmsys/vicuna-7b-v1.3 --num-gpus 2
python3 -m fastchat.serve.cli --model-path lmsys/vicuna-7b-v1.3 --num-gpus 2 --max-gpu-memory "10GiB"
```
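The device-map behavior described above can be sketched in Python. This is a minimal illustration, not FastChat's actual source: `build_load_kwargs` is a hypothetical helper showing how a `--max-gpu-memory` value could be turned into the keyword arguments passed to a Hugging Face-style `from_pretrained` call.

```python
from typing import Optional


def build_load_kwargs(num_gpus: int, max_gpu_memory: Optional[str]) -> dict:
    """Hypothetical sketch of the device-map selection described above."""
    if num_gpus <= 1:
        return {}  # single GPU: no device map needed
    if max_gpu_memory is None:
        # Without the flag, weights are placed GPU by GPU,
        # filling each device before moving to the next.
        return {"device_map": "sequential"}
    # With the flag, cap every GPU at the given size and let the
    # loader balance layers across devices automatically.
    return {
        "device_map": "auto",
        "max_memory": {i: max_gpu_memory for i in range(num_gpus)},
    }


print(build_load_kwargs(2, "10GiB"))
# → {'device_map': 'auto', 'max_memory': {0: '10GiB', 1: '10GiB'}}
```

With the cap in place, no single GPU is filled to the brim, which matches the behavior reported above when `--max-gpu-memory` is set.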

#### CPU Only