From 3c78089c39e957041495aceb5ab61198c838b093 Mon Sep 17 00:00:00 2001 From: hakotesova <148503796+hakotesova@users.noreply.github.com> Date: Fri, 3 Jan 2025 09:11:11 -0800 Subject: [PATCH] Update distillation documentation with phi3 offerings (#3461) * remove unnecessary files * update ai studio to ai foundry * remove phi3 medium from offering list * update cli readme with phi3 models --- .../distillation/conversation/README.md | 1 + .../system/distillation/math/README.md | 1 + .../system/distillation/nli/README.md | 1 + .../system/distillation/nlu_qa/README.md | 1 + .../distillation/summarization/README.md | 1 + .../system/distillation/README.md | 13 ++++++++++++ .../distillation_conversational_task.ipynb | 20 ++++++++++++------- .../distillation/math/distillation_math.ipynb | 20 ++++++++++++------- .../nli/distillation_chat_completion.ipynb | 20 ++++++++++++------- .../nlu_qa/distillation_nlu_qa_task.ipynb | 20 ++++++++++++------- .../nlu_qa/distillation_qa_math.ipynb | 20 ++++++++++++------- .../distillation_summarization.ipynb | 20 ++++++++++++------- 12 files changed, 96 insertions(+), 42 deletions(-) diff --git a/cli/foundation-models/system/distillation/conversation/README.md b/cli/foundation-models/system/distillation/conversation/README.md index 0cc00dfca35..6ab8553af33 100644 --- a/cli/foundation-models/system/distillation/conversation/README.md +++ b/cli/foundation-models/system/distillation/conversation/README.md @@ -4,6 +4,7 @@ Ensure you have the proper setup. 1. Run `az version` and ensure the `ml` extension is installed. `ml` version should be greater or equal to 2.32.0. 2. If the `ml` extension is not installed, run `az extension add -n ml` +3. Currently the example yaml file uses Meta Llama 3.1 8B Instruct as the student model, however Phi 3 Mini 4k, Phi 3 Mini 128k, Phi 3.5 Mini, and Phi 3.5 MoE Instruct models are also supported student models. Update the .YAML file in this folder as needed. Run the Distillation CLI command pointing to the .YAML file in this folder and fill out the Azure ML IDs needed: diff --git a/cli/foundation-models/system/distillation/math/README.md b/cli/foundation-models/system/distillation/math/README.md index 2ee2c8781cc..22fdcce38e4 100644 --- a/cli/foundation-models/system/distillation/math/README.md +++ b/cli/foundation-models/system/distillation/math/README.md @@ -4,6 +4,7 @@ Ensure you have the proper setup. 1. Run `az version` and ensure the `ml` extension is installed. `ml` version should be greater or equal to 2.32.0. 2. If the `ml` extension is not installed, run `az extension add -n ml` +3. Currently the example yaml file uses Meta Llama 3.1 8B Instruct as the student model, however Phi 3 Mini 4k, Phi 3 Mini 128k, Phi 3.5 Mini, and Phi 3.5 MoE Instruct models are also supported student models. Update the .YAML file in this folder as needed. Run the Distillation CLI command pointing to the .YAML file in this folder and fill out the Azure ML IDs needed: diff --git a/cli/foundation-models/system/distillation/nli/README.md b/cli/foundation-models/system/distillation/nli/README.md index c7718737f7a..61dfcbc901a 100644 --- a/cli/foundation-models/system/distillation/nli/README.md +++ b/cli/foundation-models/system/distillation/nli/README.md @@ -4,6 +4,7 @@ Ensure you have the proper setup. 1. Run `az version` and ensure the `ml` extension is installed. `ml` version should be greater or equal to 2.32.0. 2. If the `ml` extension is not installed, run `az extension add -n ml` +3. 
Currently the example yaml file uses Meta Llama 3.1 8B Instruct as the student model, however Phi 3 Mini 4k, Phi 3 Mini 128k, Phi 3.5 Mini, and Phi 3.5 MoE Instruct models are also supported student models. Update the .YAML file in this folder as needed. Run the Distillation CLI command pointing to the .YAML file in this folder and fill out the Azure ML IDs needed: diff --git a/cli/foundation-models/system/distillation/nlu_qa/README.md b/cli/foundation-models/system/distillation/nlu_qa/README.md index ad301d26c19..59ed37d7de3 100644 --- a/cli/foundation-models/system/distillation/nlu_qa/README.md +++ b/cli/foundation-models/system/distillation/nlu_qa/README.md @@ -4,6 +4,7 @@ Ensure you have the proper setup. 1. Run `az version` and ensure the `ml` extension is installed. `ml` version should be greater or equal to 2.32.0. 2. If the `ml` extension is not installed, run `az extension add -n ml` +3. Currently the example yaml file uses Meta Llama 3.1 8B Instruct as the student model, however Phi 3 Mini 4k, Phi 3 Mini 128k, Phi 3.5 Mini, and Phi 3.5 MoE Instruct models are also supported student models. Update the .YAML file in this folder as needed. Run the Distillation CLI command pointing to the .YAML file in this folder and fill out the Azure ML IDs needed: diff --git a/cli/foundation-models/system/distillation/summarization/README.md b/cli/foundation-models/system/distillation/summarization/README.md index 6b292fb01c0..5aa821339a6 100644 --- a/cli/foundation-models/system/distillation/summarization/README.md +++ b/cli/foundation-models/system/distillation/summarization/README.md @@ -4,6 +4,7 @@ Ensure you have the proper setup. 1. Run `az version` and ensure the `ml` extension is installed. `ml` version should be greater or equal to 2.32.0. 2. If the `ml` extension is not installed, run `az extension add -n ml` +3. Currently the example yaml file uses Meta Llama 3.1 8B Instruct as the student model, however Phi 3 Mini 4k, Phi 3 Mini 128k, Phi 3.5 Mini, and Phi 3.5 MoE Instruct models are also supported student models. Update the .YAML file in this folder as needed. Run the Distillation CLI command pointing to the .YAML file in this folder and fill out the Azure ML IDs needed: diff --git a/sdk/python/foundation-models/system/distillation/README.md b/sdk/python/foundation-models/system/distillation/README.md index 85120ff7b94..1f4ed404085 100644 --- a/sdk/python/foundation-models/system/distillation/README.md +++ b/sdk/python/foundation-models/system/distillation/README.md @@ -14,6 +14,7 @@ description: An explanation on model distillaton and step-by-step guide on creat - [Welcome](#welcome) - [Getting Started](#getting-started) - [Model Distillation](#model-distillation) +- [Model Offerings](#model-offerings) - [Examples](#examples) @@ -38,6 +39,18 @@ The process of model distillation is a two stage process as seen below. 2. The second stage is finetuning. Once the synthetic data is collected, the student model is then finetuned off of the training and validation data created from the teacher model. This transfers the knowledge from the teacher model to the student model. +## Model Offerings +### Teacher Models +We currently support Meta Llama 3.1 405B Instruct as the teacher model for all distillation scenarios. 
+ +### Student Models +- Meta Llama 3.1 8B Instruct +- Phi 3 Mini 4k Instruct +- Phi 3 Mini 128k Instruct +- Phi 3.5 Mini Instruct +- Phi 3.5 MoE Instruct + +We currently support Meta Llama 3.1 8B Instruct and Microsoft Phi3-Mini and Phi3.5 Instruct series models as student models. Fine-tuning of Meta Llama 3.1 Instruct series of models is only available in West US 3 region whereas Fine-tuning of Phi 3 and Phi 3.5 Instruct series of models is only available in East US 2 region. To use the distillation offering your workspace must be setup in the appropriate region for your selected student model. ## Examples We currently support numerous task types for model distillation. To view examples on how to distill and consume a model with the SDK, click on the following task type of interest diff --git a/sdk/python/foundation-models/system/distillation/conversation/distillation_conversational_task.ipynb b/sdk/python/foundation-models/system/distillation/conversation/distillation_conversational_task.ipynb index 4b9deb0a33e..2dee4eb2707 100644 --- a/sdk/python/foundation-models/system/distillation/conversation/distillation_conversational_task.ipynb +++ b/sdk/python/foundation-models/system/distillation/conversation/distillation_conversational_task.ipynb @@ -14,10 +14,10 @@ " \n", "**Note :**\n", " \n", - "- Distillation offering is only available in **West US 3** regions.\n", "- Distillation should only be used for single turn chat completion format.\n", "- The Meta Llama 3.1 405B Instruct model can only be used as a teacher model.\n", - "- The Meta Llama 3.1 8B Instruct can only be used as a student (target) model.\n", + "- Distillation of a Meta Llama 3.1 8B Instruct student (target) model is only available in **West US 3** regions.\n", + "- Distillation of Phi3 or Phi3.5 student (target) models is only available in **East US 2** regions.\n", "\n", "**Prerequisites :**\n", "- Subscribe to the Meta Llama 3.1 405B Instruct and Meta Llama 3.1 8B Instruct, see [how to subscribe your project to the model offering in MS Learn](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/deploy-models-serverless?tabs=azure-ai-studio#subscribe-your-project-to-the-model-offering)" @@ -94,11 +94,13 @@ "\n", "### 1.3.1 Prerequisites\n", "\n", - "An AI Studio project in **West US 3** is required. Please follow [this](https://learn.microsoft.com/azure/ai-studio/how-to/fine-tune-model-llama?tabs=llama-two%2Cchatcompletion#prerequisites) document to setup your AI Studio project\n", + "For distillation of a Meta Llama 3.1 8B student model, an Azure AI Foundry project in **West US 3** is required. Please follow [this](https://learn.microsoft.com/azure/ai-studio/how-to/fine-tune-model-llama?tabs=llama-two%2Cchatcompletion#prerequisites) document to setup your Azure AI Foundry project\n", "\n", - "### 1.3.2 AI Studio project settings\n", + "If you are using a Phi 3 or Phi 3.5 student model, an Azure AI Foundry project in **East US 2** is required. Follow [this](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/fine-tune-phi-3?tabs=phi-3-mini#prerequisites) document to setup your Azure AI Foundry project\n", "\n", - "Update following cell with the information of the AI Studio project just created." + "### 1.3.2 Azure AI Foundry project settings\n", + "\n", + "Update following cell with the information of the Azure AI Foundry project just created." 
] }, { @@ -131,7 +133,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### 1.3.3 Get handle to AI Studio project" + "### 1.3.3 Get handle to Azure AI Foundry project" ] }, { @@ -404,7 +406,11 @@ "source": [ "#### Student Model\n", "Select the student model to use. Supported student models:\n", - "1. Meta-Llama-3.1-8B-Instruct" + "1. Meta-Llama-3.1-8B-Instruct\n", + "2. Phi-3-Mini-4k-Instruct\n", + "3. Phi-3-Mini-128k-Instruct\n", + "4. Phi-3.5-Mini-Instruct\n", + "5. Phi-3.5-MoE-Instruct" ] }, { diff --git a/sdk/python/foundation-models/system/distillation/math/distillation_math.ipynb b/sdk/python/foundation-models/system/distillation/math/distillation_math.ipynb index abb817bb73d..a5925f480ef 100644 --- a/sdk/python/foundation-models/system/distillation/math/distillation_math.ipynb +++ b/sdk/python/foundation-models/system/distillation/math/distillation_math.ipynb @@ -14,10 +14,10 @@ " \n", "**Note :**\n", " \n", - "- Distillation offering is only available in **West US 3** regions.\n", "- Distillation should only be used for single turn chat completion format.\n", "- The Meta Llama 3.1 405B Instruct model can only be used as a teacher model.\n", - "- The Meta Llama 3.1 8B Instruct can only be used as a student (target) model.\n", + "- Distillation of a Meta Llama 3.1 8B Instruct student (target) model is only available in **West US 3** regions.\n", + "- Distillation of Phi3 or Phi3.5 student (target) models is only available in **East US 2** regions.\n", "\n", "**Prerequisites :**\n", "- Subscribe to the Meta Llama 3.1 405B Instruct and Meta Llama 3.1 8B Instruct, see [how to subscribe your project to the model offering in MS Learn](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/deploy-models-serverless?tabs=azure-ai-studio#subscribe-your-project-to-the-model-offering)" @@ -98,11 +98,13 @@ "\n", "### 1.3.1 Prerequisites\n", "\n", - "An AI Studio project in **West US 3** is required. Please follow [this](https://learn.microsoft.com/azure/ai-studio/how-to/fine-tune-model-llama?tabs=llama-two%2Cchatcompletion#prerequisites) document to setup your AI Studio project\n", + "For distillation of a Meta Llama 3.1 8B student model, an Azure AI Foundry project in **West US 3** is required. Please follow [this](https://learn.microsoft.com/azure/ai-studio/how-to/fine-tune-model-llama?tabs=llama-two%2Cchatcompletion#prerequisites) document to setup your Azure AI Foundry project\n", "\n", - "### 1.3.2 AI Studio project settings\n", + "If you are using a Phi 3 or Phi 3.5 student model, an Azure AI Foundry project in **East US 2** is required. Follow [this](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/fine-tune-phi-3?tabs=phi-3-mini#prerequisites) document to setup your Azure AI Foundry project\n", "\n", - "Update following cell with the information of the AI Studio project just created." + "### 1.3.2 Azure AI Foundry project settings\n", + "\n", + "Update following cell with the information of the Azure AI Foundry project just created." ] }, { @@ -135,7 +137,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### 1.3.3 Get handle to AI Studio project" + "### 1.3.3 Get handle to Azure AI Foundry project" ] }, { @@ -400,7 +402,11 @@ "source": [ "#### Student Model\n", "Select the student model to use. Supported student models:\n", - "1. Meta-Llama-3.1-8B-Instruct" + "1. Meta-Llama-3.1-8B-Instruct\n", + "2. Phi-3-Mini-4k-Instruct\n", + "3. Phi-3-Mini-128k-Instruct\n", + "4. Phi-3.5-Mini-Instruct\n", + "5. 
Phi-3.5-MoE-Instruct" ] }, { diff --git a/sdk/python/foundation-models/system/distillation/nli/distillation_chat_completion.ipynb b/sdk/python/foundation-models/system/distillation/nli/distillation_chat_completion.ipynb index 41c8972b1f2..22297c28e8c 100644 --- a/sdk/python/foundation-models/system/distillation/nli/distillation_chat_completion.ipynb +++ b/sdk/python/foundation-models/system/distillation/nli/distillation_chat_completion.ipynb @@ -14,10 +14,10 @@ " \n", "**Note :**\n", " \n", - "- Distillation offering is only available in **West US 3** regions.\n", "- Distillation should only be used for single turn chat completion format.\n", "- The Meta Llama 3.1 405B Instruct model can only be used as a teacher model.\n", - "- The Meta Llama 3.1 8B Instruct can only be used as a student (target) model.\n", + "- Distillation of a Meta Llama 3.1 8B Instruct student (target) model is only available in **West US 3** regions.\n", + "- Distillation of Phi3 or Phi3.5 student (target) models is only available in **East US 2** regions.\n", "\n", "**Prerequisites :**\n", "- Subscribe to the Meta Llama 3.1 405B Instruct and Meta Llama 3.1 8B Instruct, see [how to subscribe your project to the model offering in MS Learn](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/deploy-models-serverless?tabs=azure-ai-studio#subscribe-your-project-to-the-model-offering)" @@ -97,11 +97,13 @@ "\n", "### 1.3.1 Prerequisites\n", "\n", - "An AI Studio project in **West US 3** is required. Please follow [this](https://learn.microsoft.com/azure/ai-studio/how-to/fine-tune-model-llama?tabs=llama-two%2Cchatcompletion#prerequisites) document to setup your AI Studio project\n", + "For distillation of a Meta Llama 3.1 8B student model, an Azure AI Foundry project in **West US 3** is required. Please follow [this](https://learn.microsoft.com/azure/ai-studio/how-to/fine-tune-model-llama?tabs=llama-two%2Cchatcompletion#prerequisites) document to setup your Azure AI Foundry project\n", "\n", - "### 1.3.2 AI Studio project settings\n", + "If you are using a Phi 3 or Phi 3.5 student model, an Azure AI Foundry project in **East US 2** is required. Follow [this](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/fine-tune-phi-3?tabs=phi-3-mini#prerequisites) document to setup your Azure AI Foundry project\n", "\n", - "Update following cell with the information of the AI Studio project just created." + "### 1.3.2 Azure AI Foundry project settings\n", + "\n", + "Update following cell with the information of the Azure AI Foundry project just created." ] }, { @@ -134,7 +136,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### 1.3.3 Get handle to AI Studio project" + "### 1.3.3 Get handle to Azure AI Foundry project" ] }, { @@ -399,7 +401,11 @@ "source": [ "#### Student Model\n", "Select the student model to use by using the model id of the model in the model catalog. Supported student models:\n", - "1. Meta-Llama-3.1-8B-Instruct" + "1. Meta-Llama-3.1-8B-Instruct\n", + "2. Phi-3-Mini-4k-Instruct\n", + "3. Phi-3-Mini-128k-Instruct\n", + "4. Phi-3.5-Mini-Instruct\n", + "5. 
Phi-3.5-MoE-Instruct" ] }, { diff --git a/sdk/python/foundation-models/system/distillation/nlu_qa/distillation_nlu_qa_task.ipynb b/sdk/python/foundation-models/system/distillation/nlu_qa/distillation_nlu_qa_task.ipynb index fef6b250f6a..b7db7ced60a 100644 --- a/sdk/python/foundation-models/system/distillation/nlu_qa/distillation_nlu_qa_task.ipynb +++ b/sdk/python/foundation-models/system/distillation/nlu_qa/distillation_nlu_qa_task.ipynb @@ -14,10 +14,10 @@ " \n", "**Note :**\n", " \n", - "- Distillation offering is only available in **West US 3** regions.\n", "- Distillation should only be used for single turn chat completion format.\n", "- The Meta Llama 3.1 405B Instruct model can only be used as a teacher model.\n", - "- The Meta Llama 3.1 8B Instruct can only be used as a student (target) model.\n", + "- Distillation of a Meta Llama 3.1 8B Instruct student (target) model is only available in **West US 3** regions.\n", + "- Distillation of Phi3 or Phi3.5 student (target) models is only available in **East US 2** regions.\n", "\n", "**Prerequisites :**\n", "- Subscribe to the Meta Llama 3.1 405B Instruct and Meta Llama 3.1 8B Instruct, see [how to subscribe your project to the model offering in MS Learn](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/deploy-models-serverless?tabs=azure-ai-studio#subscribe-your-project-to-the-model-offering)" @@ -94,11 +94,13 @@ "\n", "### 1.3.1 Prerequisites\n", "\n", - "An AI Studio project in **West US 3** is required. Please follow [this](https://learn.microsoft.com/azure/ai-studio/how-to/fine-tune-model-llama?tabs=llama-two%2Cchatcompletion#prerequisites) document to setup your AI Studio project\n", + "For distillation of a Meta Llama 3.1 8B student model, an Azure AI Foundry project in **West US 3** is required. Please follow [this](https://learn.microsoft.com/azure/ai-studio/how-to/fine-tune-model-llama?tabs=llama-two%2Cchatcompletion#prerequisites) document to setup your Azure AI Foundry project\n", "\n", - "### 1.3.2 AI Studio project settings\n", + "If you are using a Phi 3 or Phi 3.5 student model, an Azure AI Foundry project in **East US 2** is required. Follow [this](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/fine-tune-phi-3?tabs=phi-3-mini#prerequisites) document to setup your Azure AI Foundry project\n", "\n", - "Update following cell with the information of the AI Studio project just created." + "### 1.3.2 Azure AI Foundry project settings\n", + "\n", + "Update following cell with the information of the Azure AI Foundry project just created." ] }, { @@ -131,7 +133,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### 1.3.3 Get handle to AI Studio project" + "### 1.3.3 Get handle to Azure AI Foundry project" ] }, { @@ -404,7 +406,11 @@ "source": [ "#### Student Model\n", "Select the student model to use. Supported student models:\n", - "1. Meta-Llama-3.1-8B-Instruct" + "1. Meta-Llama-3.1-8B-Instruct\n", + "2. Phi-3-Mini-4k-Instruct\n", + "3. Phi-3-Mini-128k-Instruct\n", + "4. Phi-3.5-Mini-Instruct\n", + "5. 
Phi-3.5-MoE-Instruct" ] }, { diff --git a/sdk/python/foundation-models/system/distillation/nlu_qa/distillation_qa_math.ipynb b/sdk/python/foundation-models/system/distillation/nlu_qa/distillation_qa_math.ipynb index b12f8673bd2..75e69aea1a4 100644 --- a/sdk/python/foundation-models/system/distillation/nlu_qa/distillation_qa_math.ipynb +++ b/sdk/python/foundation-models/system/distillation/nlu_qa/distillation_qa_math.ipynb @@ -14,10 +14,10 @@ " \n", "**Note :**\n", " \n", - "- Distillation offering is only available in **West US 3** regions.\n", "- Distillation should only be used for single turn chat completion format.\n", "- The Meta Llama 3.1 405B Instruct model can only be used as a teacher model.\n", - "- The Meta Llama 3.1 8B Instruct can only be used as a student (target) model.\n", + "- Distillation of a Meta Llama 3.1 8B Instruct student (target) model is only available in **West US 3** regions.\n", + "- Distillation of Phi3 or Phi3.5 student (target) models is only available in **East US 2** regions.\n", "\n", "**Prerequisites :**\n", "- Subscribe to the Meta Llama 3.1 405B Instruct and Meta Llama 3.1 8B Instruct, see [how to subscribe your project to the model offering in MS Learn](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/deploy-models-serverless?tabs=azure-ai-studio#subscribe-your-project-to-the-model-offering)" @@ -95,11 +95,13 @@ "\n", "### 1.3.1 Prerequisites\n", "\n", - "An AI Studio project in **West US 3** is required. Please follow [this](https://learn.microsoft.com/azure/ai-studio/how-to/fine-tune-model-llama?tabs=llama-two%2Cchatcompletion#prerequisites) document to setup your AI Studio project\n", + "For distillation of a Meta Llama 3.1 8B student model, an Azure AI Foundry project in **West US 3** is required. Please follow [this](https://learn.microsoft.com/azure/ai-studio/how-to/fine-tune-model-llama?tabs=llama-two%2Cchatcompletion#prerequisites) document to setup your Azure AI Foundry project\n", "\n", - "### 1.3.2 AI Studio project settings\n", + "If you are using a Phi 3 or Phi 3.5 student model, an Azure AI Foundry project in **East US 2** is required. Follow [this](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/fine-tune-phi-3?tabs=phi-3-mini#prerequisites) document to setup your Azure AI Foundry project\n", "\n", - "Update following cell with the information of the AI Studio project just created." + "### 1.3.2 Azure AI Foundry project settings\n", + "\n", + "Update following cell with the information of the Azure AI Foundry project just created." ] }, { @@ -132,7 +134,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### 1.3.3 Get handle to AI Studio project" + "### 1.3.3 Get handle to Azure AI Foundry project" ] }, { @@ -388,7 +390,11 @@ "source": [ "#### Student Model\n", "Select the student model to use. Supported student models:\n", - "1. Meta-Llama-3.1-8B-Instruct" + "1. Meta-Llama-3.1-8B-Instruct\n", + "2. Phi-3-Mini-4k-Instruct\n", + "3. Phi-3-Mini-128k-Instruct\n", + "4. Phi-3.5-Mini-Instruct\n", + "5. 
Phi-3.5-MoE-Instruct" ] }, { diff --git a/sdk/python/foundation-models/system/distillation/summarization/distillation_summarization.ipynb b/sdk/python/foundation-models/system/distillation/summarization/distillation_summarization.ipynb index e06c100f65d..398ff2067cc 100644 --- a/sdk/python/foundation-models/system/distillation/summarization/distillation_summarization.ipynb +++ b/sdk/python/foundation-models/system/distillation/summarization/distillation_summarization.ipynb @@ -14,7 +14,6 @@ " \n", "**Note :**\n", " \n", - "- Distillation offering is only available in **West US 3** regions.\n", "- Distillation should only be used for single turn chat completion format as shown below\n", " ```json\n", " {\"messages\": [\n", @@ -23,7 +22,8 @@ " ]}\n", " ```\n", "- The Meta Llama 3.1 405B Instruct model can only be used as a teacher model.\n", - "- The Meta Llama 3.1 8B Instruct can only be used as a student (target) model.\n", + "- Distillation of a Meta Llama 3.1 8B Instruct student (target) model is only available in **West US 3** regions.\n", + "- Distillation of Phi3 or Phi3.5 student (target) models is only available in **East US 2** regions.\n", "\n", "**Prerequisites :**\n", "- Subscribe to the Meta Llama 3.1 405B Instruct and Meta Llama 3.1 8B Instruct, see [how to subscribe your project to the model offering in MS Learn](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/deploy-models-serverless?tabs=azure-ai-studio#subscribe-your-project-to-the-model-offering)" @@ -104,11 +104,13 @@ "\n", "### 1.3.1 Prerequisites\n", "\n", - "An AI Studio project in **West US 3** is required. Please follow [this](https://learn.microsoft.com/azure/ai-studio/how-to/fine-tune-model-llama?tabs=llama-two%2Cchatcompletion#prerequisites) document to setup your AI Studio project\n", + "For distillation of a Meta Llama 3.1 8B student model, an Azure AI Foundry project in **West US 3** is required. Please follow [this](https://learn.microsoft.com/azure/ai-studio/how-to/fine-tune-model-llama?tabs=llama-two%2Cchatcompletion#prerequisites) document to setup your Azure AI Foundry project\n", "\n", - "### 1.3.2 AI Studio project settings\n", + "If you are using a Phi 3 or Phi 3.5 student model, an Azure AI Foundry project in **East US 2** is required. Follow [this](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/fine-tune-phi-3?tabs=phi-3-mini#prerequisites) document to setup your Azure AI Foundry project\n", "\n", - "Update following cell with the information of the AI Studio project just created." + "### 1.3.2 Azure AI Foundry project settings\n", + "\n", + "Update following cell with the information of the Azure AI Foundry project just created." ] }, { @@ -141,7 +143,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "### 1.3.3 Get handle to AI Studio project" + "### 1.3.3 Get handle to Azure AI Foundry project" ] }, { @@ -414,7 +416,11 @@ "source": [ "#### Student Model\n", "Select the student model to use. Supported student models:\n", - "1. Meta-Llama-3.1-8B-Instruct" + "1. Meta-Llama-3.1-8B-Instruct\n", + "2. Phi-3-Mini-4k-Instruct\n", + "3. Phi-3-Mini-128k-Instruct\n", + "4. Phi-3.5-Mini-Instruct\n", + "5. Phi-3.5-MoE-Instruct" ] }, {