
Refine readme and docker for vLLM and RayServe of LLM microservice #218

Merged

9 commits merged into opea-project:main on Jun 20, 2024

Conversation

XinyaoWa
Collaborator

Description

Refine readme and docker for vLLM and RayServe of LLM microservice

Issues

n/a.

Type of change

  • Bug fix (non-breaking change which fixes an issue)

Dependencies

n/a.

Tests

n/a.

@XinyaoWa
Collaborator Author

@lvliang-intel @yao531441 @xuechendi @Jian-Zhang Hi, I have updated the README and Dockerfile for vLLM and Ray Serve to fix some bugs. Please help review. Thanks a lot!
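For context on the kind of README content this PR refines: serving an LLM with vLLM is typically documented as a single docker command. The sketch below uses the upstream vLLM OpenAI-compatible image; the image tag, model name, and host port are illustrative assumptions, not values taken from this PR's READMEs.

```shell
# Sketch only: image tag, model, and port mapping are illustrative,
# not taken from this PR. Adjust to match the microservice's README.
docker run --rm --gpus all \
  -p 8000:8000 \
  vllm/vllm-openai:latest \
  --model meta-llama/Llama-2-7b-chat-hf
```

Once the container is up, the service exposes an OpenAI-compatible HTTP API on the mapped host port, which is the port number the commits below adjust in the READMEs.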

@lvliang-intel lvliang-intel merged commit 23e6ed0 into opea-project:main Jun 20, 2024
8 checks passed
sharanshirodkar7 pushed a commit to sharanshirodkar7/GenAIComps that referenced this pull request Jul 9, 2024
…pea-project#218)

* refine readme and docker for vllm and rayserve

Signed-off-by: Xinyao Wang <[email protected]>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* add vllm gaudi performance

Signed-off-by: Xinyao Wang <[email protected]>

* update ray serve port in readme

Signed-off-by: Xinyao Wang <[email protected]>

* update port in vllm readme

Signed-off-by: Xinyao Wang <[email protected]>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

---------

Signed-off-by: Xinyao Wang <[email protected]>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Signed-off-by: sharanshirodkar7 <[email protected]>
yogeshmpandey pushed a commit to yogeshmpandey/GenAIComps that referenced this pull request Jul 10, 2024
…pea-project#218)

Signed-off-by: Yogesh Pandey <[email protected]>
dwhitena pushed a commit to predictionguard/GenAIComps that referenced this pull request Jul 24, 2024
…pea-project#218)

Signed-off-by: Daniel Whitenack <[email protected]>
lkk12014402 pushed a commit that referenced this pull request Aug 8, 2024
@XinyaoWa XinyaoWa deleted the llm branch November 7, 2024 03:29
4 participants