add deepseek doc #99

Merged · 1 commit · Jun 21, 2024
35 changes: 35 additions & 0 deletions docs/guidebook/en/3_1_2_DeepSeek_LLM_Use.md
@@ -0,0 +1,35 @@
# DeepSeek Usage
## 1. Create the relevant file
Create a YAML file, for example user_deepseek_llm.yaml, and paste the following content into it:
```yaml
name: 'user_deepseek_llm'
description: 'default default_deepseek_llm llm with spi'
model_name: 'deepseek-chat'
max_tokens: 1000
metadata:
type: 'LLM'
module: 'agentuniverse.llm.default.deep_seek_openai_style_llm'
class: 'DefaultDeepSeekLLM'
```
## 2. Environment Setup
Required: DEEPSEEK_API_KEY
Optional: DEEPSEEK_API_BASE
### 2.1 Configure through Python code
```python
import os
os.environ['DEEPSEEK_API_KEY'] = 'sk-***'
os.environ['DEEPSEEK_API_BASE'] = 'https://xxxxxx'
```
### 2.2 Configure through the configuration file
In the custom_key.toml file under the project's config directory, add the following configuration:
```toml
DEEPSEEK_API_KEY="sk-******"
DEEPSEEK_API_BASE="https://xxxxxx"
```
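For illustration only, the sketch below shows the effect this TOML file is expected to have: each key becomes an environment variable the framework can read. agentuniverse loads custom_key.toml itself, so you do not need this code; the file path and the loading logic here are assumptions made for the example.
```python
# Illustrative only: approximate the effect of config/custom_key.toml by
# exporting each key/value pair as an environment variable.
# The path and loading logic are assumptions; agentuniverse performs the
# actual loading itself at startup.
import os
import tomllib  # Python 3.11+; use the third-party "tomli" package on older versions

with open("config/custom_key.toml", "rb") as f:
    custom_keys = tomllib.load(f)

for key, value in custom_keys.items():
    os.environ.setdefault(key, str(value))

print(os.environ.get("DEEPSEEK_API_KEY"))  # e.g. "sk-******"
```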
## 3. Obtaining the DeepSeek API Key
Please refer to the official DeepSeek documentation: https://platform.deepseek.com/api_keys

## 4. Tips
agentuniverse already provides an LLM named default_deepseek_llm; after configuring DEEPSEEK_API_KEY, you can use it directly.
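Since DeepSeek exposes an OpenAI-compatible API, a quick, framework-independent way to verify your key is to call it directly. This is a minimal sketch that assumes the openai Python package is installed; the fallback base URL is DeepSeek's public endpoint and is not part of agentuniverse.
```python
# Standalone sanity check for the DeepSeek credentials (not part of agentuniverse).
# Assumes the "openai" package (v1+) is installed.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DEEPSEEK_API_KEY"],
    base_url=os.environ.get("DEEPSEEK_API_BASE", "https://api.deepseek.com"),
)
response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[{"role": "user", "content": "Say hello in one word."}],
)
print(response.choices[0].message.content)
```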

37 changes: 37 additions & 0 deletions docs/guidebook/en/3_1_2_OpenAIStyleLLM_Use.md
@@ -0,0 +1,37 @@
# OpenAIStyleLLM Usage
## 1. Create the relevant file
Create a YAML file, for example user_openai_style_llm.yaml, and paste the following content into it:
```yaml
name: 'deep_seek_llm'
description: 'demo deep_seek llm with spi'
model_name: 'deepseek-chat'
max_tokens: 4000
max_context_length: 32000
api_key_env: 'DEEPSEEK_API_KEY'
api_base_env: 'DEEPSEEK_API_BASE'
metadata:
type: 'LLM'
module: 'agentuniverse.llm.openai_style_llm'
class: 'OpenAIStyleLLM'
```
Parameter description:
- max_context_length: the maximum context length the model can handle.
- api_key_env: the name of the environment variable holding the API key.
- api_base_env: the name of the environment variable holding the API base URL.

These parameters are required; on service startup, the corresponding values are loaded from the environment variables, as illustrated in the sketch below.
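To make the indirection concrete, here is a minimal sketch of how such a configuration resolves to real values at startup. It is not the framework's actual implementation; the dictionary simply mirrors the YAML fields above.
```python
# Sketch of the env-name indirection (illustrative, not agentuniverse's code):
# the YAML stores the *names* of environment variables, and the real values
# are looked up in os.environ when the service starts.
import os

llm_config = {
    "api_key_env": "DEEPSEEK_API_KEY",
    "api_base_env": "DEEPSEEK_API_BASE",
}

api_key = os.environ[llm_config["api_key_env"]]    # e.g. "sk-***"
api_base = os.environ[llm_config["api_base_env"]]  # e.g. "https://xxxxxx"
```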
## 2. Environment Setup
Required: the environment variables named by api_key_env and api_base_env.
In the configuration above, api_key_env and api_base_env are set to DEEPSEEK_API_KEY and DEEPSEEK_API_BASE respectively, so the configuration is as follows:
### 2.1 Configure through Python code
```python
import os
os.environ['DEEPSEEK_API_KEY'] = 'sk-***'
os.environ['DEEPSEEK_API_BASE'] = 'https://xxxxxx'
```
### 2.2 Configure through the configuration file
In the custom_key.toml file under the project's config directory, add the following configuration (the key names must match the api_key_env and api_base_env values configured above):
```toml
DEEPSEEK_API_KEY="DEEPSEEK_API_KEY"
DEEPSEEK_API_BASE="https://xxxxxx"
```
35 changes: 35 additions & 0 deletions docs/guidebook/zh/3_1_2_DeepSeek使用.md
@@ -0,0 +1,35 @@
# DeepSeek Usage
## 1. Create the relevant file
Create a YAML file, for example user_deepseek_llm.yaml, and paste the following content into it:
```yaml
name: 'user_deepseek_llm'
description: 'default default_deepseek_llm llm with spi'
model_name: 'deepseek-chat'
max_tokens: 1000
metadata:
type: 'LLM'
module: 'agentuniverse.llm.default.deep_seek_openai_style_llm'
class: 'DefaultDeepSeekLLM'
```
## 2. Environment Setup
Required: DEEPSEEK_API_KEY
Optional: DEEPSEEK_API_BASE
### 2.1 Configure through Python code
```python
import os
os.environ['DEEPSEEK_API_KEY'] = 'sk-***'
os.environ['DEEPSEEK_API_BASE'] = 'https://xxxxxx'
```
### 2.2 Configure through the configuration file
In the custom_key.toml file under the project's config directory, add the following configuration:
```toml
DEEPSEEK_API_KEY="sk-******"
DEEPSEEK_API_BASE="https://xxxxxx"
```
## 3. Obtaining the DeepSeek API Key
Please refer to the official DeepSeek documentation: https://platform.deepseek.com/api_keys

## 4. Tips
agentuniverse already provides an LLM named default_deepseek_llm; after configuring DEEPSEEK_API_KEY, you can use it directly.

39 changes: 39 additions & 0 deletions docs/guidebook/zh/3_1_2_OpenAIStyleLLM使用.md
@@ -0,0 +1,39 @@
# OpenAIStyleLLM Usage
With this configuration, you can connect to any model service that follows the OpenAI API standard.
## 1. Create the relevant file
Create a YAML file, for example user_openai_style_llm.yaml, and paste the following content into it:
```yaml
name: 'deep_seek_llm'
description: 'demo deep_seek llm with spi'
model_name: 'deepseek-chat'
max_tokens: 4000
max_context_length: 32000
api_key_env: 'DEEPSEEK_API_KEY'
api_base_env: 'DEEPSEEK_API_BASE'
metadata:
type: 'LLM'
module: 'agentuniverse.llm.openai_style_llm'
class: 'OpenAIStyleLLM'
```
Parameter description:
- max_context_length: the maximum context length the model can handle.
- api_key_env: the name of the environment variable holding the API key.
- api_base_env: the name of the environment variable holding the API base URL.

These parameters are required; on service startup, the corresponding values are loaded from the environment variables.
## 2. Environment Setup
Required: the environment variables named by api_key_env and api_base_env.
In the configuration above, api_key_env and api_base_env are set to DEEPSEEK_API_KEY and DEEPSEEK_API_BASE respectively, so the configuration is as follows:
### 2.1 Configure through Python code
```python
import os
os.environ['DEEPSEEK_API_KEY'] = 'sk-***'
os.environ['DEEPSEEK_API_BASE'] = 'https://xxxxxx'
```
### 2.2 Configure through the configuration file
In the custom_key.toml file under the project's config directory, add the following configuration:
```toml
DEEPSEEK_API_KEY="DEEPSEEK_API_KEY"
DEEPSEEK_API_BASE="https://xxxxxx"
```
Note: the key names here must match the api_key_env and api_base_env values configured in the YAML configuration file.
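A mismatch between these key names and api_key_env / api_base_env is easy to miss, so a small startup check can catch it early. This is a hypothetical helper, not part of agentuniverse; the variable names are the ones used in this example.
```python
# Hypothetical startup check (not part of agentuniverse): fail fast if the
# environment variables named by api_key_env / api_base_env are not set,
# e.g. because the key names in custom_key.toml do not match.
import os

required_env = ["DEEPSEEK_API_KEY", "DEEPSEEK_API_BASE"]
missing = [name for name in required_env if not os.environ.get(name)]
if missing:
    raise RuntimeError(f"Missing environment variables: {missing}")
```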