diff --git a/docs/guidebook/en/3_1_2_DeepSeek_LLM_Use.md b/docs/guidebook/en/3_1_2_DeepSeek_LLM_Use.md
new file mode 100644
index 00000000..99622433
--- /dev/null
+++ b/docs/guidebook/en/3_1_2_DeepSeek_LLM_Use.md
@@ -0,0 +1,35 @@
# DeepSeek Usage
## 1. Create the relevant file
Create a YAML file, for example user_deepseek_llm.yaml, and paste the following content into it:
```yaml
name: 'user_deepseek_llm'
description: 'default default_deepseek_llm llm with spi'
model_name: 'deepseek-chat'
max_tokens: 1000
metadata:
  type: 'LLM'
  module: 'agentuniverse.llm.default.deep_seek_openai_style_llm'
  class: 'DefaultDeepSeekLLM'
```
## 2. Environment Setup
Required: DEEPSEEK_API_KEY
Optional: DEEPSEEK_API_BASE
### 2.1 Configure through Python code
```python
import os
os.environ['DEEPSEEK_API_KEY'] = 'sk-***'
os.environ['DEEPSEEK_API_BASE'] = 'https://xxxxxx'
```
### 2.2 Configure through the configuration file
In the custom_key.toml file under the config directory of the project, add the configuration:
```toml
DEEPSEEK_API_KEY="sk-******"
DEEPSEEK_API_BASE="https://xxxxxx"
```
## 3. Obtaining the DeepSeek API Key
Please refer to the official DeepSeek documentation: https://platform.deepseek.com/api_keys

## 4. Tips
agentUniverse already ships with an LLM named default_deepseek_llm; once DEEPSEEK_API_KEY is configured, you can use it directly.

diff --git a/docs/guidebook/en/3_1_2_OpenAIStyleLLM_Use.md b/docs/guidebook/en/3_1_2_OpenAIStyleLLM_Use.md
new file mode 100644
index 00000000..ab712fc3
--- /dev/null
+++ b/docs/guidebook/en/3_1_2_OpenAIStyleLLM_Use.md
@@ -0,0 +1,37 @@
# OpenAIStyleLLM Usage
With this configuration you can connect to any model service that follows the OpenAI API standard.
## 1. Create the relevant file
Create a YAML file, for example user_openai_style_llm.yaml, and paste the following content into it:
```yaml
name: 'deep_seek_llm'
description: 'demo deep_seek llm with spi'
model_name: 'deepseek-chat'
max_tokens: 4000
max_context_length: 32000
api_key_env: 'DEEPSEEK_API_KEY'
api_base_env: 'DEEPSEEK_API_BASE'
metadata:
  type: 'LLM'
  module: 'agentuniverse.llm.openai_style_llm'
  class: 'OpenAIStyleLLM'
```
Parameter description:
- max_context_length: the maximum context length the model can handle.
- api_key_env: the name of the environment variable that holds the API key.
- api_base_env: the name of the environment variable that holds the API base URL.
These parameters must be configured; when the service starts, the corresponding values are loaded from the environment variables.
## 2. Environment Setup
Required: the variables named by $api_key_env and $api_base_env.
In the configuration above, api_key_env and api_base_env are set to DEEPSEEK_API_KEY and DEEPSEEK_API_BASE respectively, so the configuration is as follows:
### 2.1 Configure through Python code
```python
import os
os.environ['DEEPSEEK_API_KEY'] = 'sk-***'
os.environ['DEEPSEEK_API_BASE'] = 'https://xxxxxx'
```
### 2.2 Configure through the configuration file
In the custom_key.toml file under the config directory of the project, add the configuration:
```toml
DEEPSEEK_API_KEY="sk-******"
DEEPSEEK_API_BASE="https://xxxxxx"
```
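To make the env-variable indirection concrete, here is a minimal standalone sketch (not part of agentUniverse itself) that resolves the two names declared in the YAML above. It assumes the example file user_openai_style_llm.yaml sits in the working directory and that PyYAML is installed; agentUniverse performs an equivalent lookup internally when the service starts.
```python
import os
import yaml  # PyYAML

# Illustrative only: read the YAML and resolve the env-variable names it declares.
with open('user_openai_style_llm.yaml', 'r', encoding='utf-8') as f:
    conf = yaml.safe_load(f)

api_key = os.environ.get(conf['api_key_env'])    # looks up DEEPSEEK_API_KEY
api_base = os.environ.get(conf['api_base_env'])  # looks up DEEPSEEK_API_BASE

if not api_key or not api_base:
    raise RuntimeError('DEEPSEEK_API_KEY / DEEPSEEK_API_BASE are not set')
print(f'Using endpoint {api_base} with key {api_key[:6]}***')
```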
diff --git "a/docs/guidebook/zh/3_1_2_DeepSeek\344\275\277\347\224\250.md" "b/docs/guidebook/zh/3_1_2_DeepSeek\344\275\277\347\224\250.md"
new file mode 100644
index 00000000..7a243ed2
--- /dev/null
+++ "b/docs/guidebook/zh/3_1_2_DeepSeek\344\275\277\347\224\250.md"
@@ -0,0 +1,35 @@
# DeepSeek Usage
## 1. Create the relevant file
Create a YAML file, for example user_deepseek_llm.yaml, and paste the following content into it:
```yaml
name: 'user_deepseek_llm'
description: 'default default_deepseek_llm llm with spi'
model_name: 'deepseek-chat'
max_tokens: 1000
metadata:
  type: 'LLM'
  module: 'agentuniverse.llm.default.deep_seek_openai_style_llm'
  class: 'DefaultDeepSeekLLM'
```
## 2. Environment Setup
Required: DEEPSEEK_API_KEY
Optional: DEEPSEEK_API_BASE
### 2.1 Configure through Python code
```python
import os
os.environ['DEEPSEEK_API_KEY'] = 'sk-***'
os.environ['DEEPSEEK_API_BASE'] = 'https://xxxxxx'
```
### 2.2 Configure through the configuration file
In the custom_key.toml file under the config directory of the project, add the configuration:
```toml
DEEPSEEK_API_KEY="sk-******"
DEEPSEEK_API_BASE="https://xxxxxx"
```
## 3. Obtaining the DeepSeek API Key
Please refer to the official DeepSeek documentation: https://platform.deepseek.com/api_keys

## 4. Tips
agentUniverse already ships with an LLM named default_deepseek_llm; once DEEPSEEK_API_KEY is configured, you can use it directly.

diff --git "a/docs/guidebook/zh/3_1_2_OpenAIStyleLLM\344\275\277\347\224\250.md" "b/docs/guidebook/zh/3_1_2_OpenAIStyleLLM\344\275\277\347\224\250.md"
new file mode 100644
index 00000000..e2078d9c
--- /dev/null
+++ "b/docs/guidebook/zh/3_1_2_OpenAIStyleLLM\344\275\277\347\224\250.md"
@@ -0,0 +1,39 @@
# OpenAIStyleLLM Usage
With this configuration you can connect to any model service that follows the OpenAI API standard.
## 1. Create the relevant file
Create a YAML file, for example user_openai_style_llm.yaml, and paste the following content into it:
```yaml
name: 'deep_seek_llm'
description: 'demo deep_seek llm with spi'
model_name: 'deepseek-chat'
max_tokens: 4000
max_context_length: 32000
api_key_env: 'DEEPSEEK_API_KEY'
api_base_env: 'DEEPSEEK_API_BASE'
metadata:
  type: 'LLM'
  module: 'agentuniverse.llm.openai_style_llm'
  class: 'OpenAIStyleLLM'
```
Parameter description:
- max_context_length: the maximum context length the model can handle.
- api_key_env: the name of the environment variable that holds the API key.
- api_base_env: the name of the environment variable that holds the API base URL.
These parameters must be configured; when the service starts, the corresponding values are loaded from the environment variables.
## 2. Environment Setup
Required: the variables named by $api_key_env and $api_base_env.
In the configuration above, api_key_env and api_base_env are set to DEEPSEEK_API_KEY and DEEPSEEK_API_BASE respectively, so the configuration is as follows:
### 2.1 Configure through Python code
```python
import os
os.environ['DEEPSEEK_API_KEY'] = 'sk-***'
os.environ['DEEPSEEK_API_BASE'] = 'https://xxxxxx'
```
### 2.2 Configure through the configuration file
In the custom_key.toml file under the config directory of the project, add the configuration:
```toml
DEEPSEEK_API_KEY="sk-******"
DEEPSEEK_API_BASE="https://xxxxxx"
```
Note: the keys here must match the api_key_env and api_base_env names configured in the YAML file.
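Because OpenAIStyleLLM targets OpenAI-compatible endpoints, you can optionally smoke-test the credentials outside the framework with the official openai Python client before starting the service. This is only a sketch, not part of agentUniverse; it assumes the openai (>=1.0) package is installed and that both environment variables above are already set.
```python
import os
from openai import OpenAI

# Optional smoke test, independent of agentUniverse: call the OpenAI-compatible
# endpoint directly with the same environment variables the YAML refers to.
client = OpenAI(
    api_key=os.environ['DEEPSEEK_API_KEY'],
    base_url=os.environ['DEEPSEEK_API_BASE'],
)
response = client.chat.completions.create(
    model='deepseek-chat',
    messages=[{'role': 'user', 'content': 'Hello'}],
    max_tokens=32,
)
print(response.choices[0].message.content)
```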