Overview
The exercise walks you through building a generative AI chat app using the Azure AI Foundry SDK. You deploy the gpt-4o model in the Azure AI Foundry portal and then create a client application that interacts with that model. Both Python and C# implementations are provided.
Repository and Environment Setup
Clone the repository and navigate to the correct folder:
Python:
rm -r mslearn-ai-foundry -f
git clone https://github.com/microsoftlearning/mslearn-ai-studio mslearn-ai-foundry
cd mslearn-ai-foundry/labfiles/chat-app/python
C#:
rm -r mslearn-ai-foundry -f
git clone https://github.com/microsoftlearning/mslearn-ai-studio mslearn-ai-foundry
cd mslearn-ai-foundry/labfiles/chat-app/c-sharp
Install dependencies:
Python:
python -m venv labenv
./labenv/bin/Activate.ps1
pip install python-dotenv azure-identity azure-ai-projects azure-ai-inference
C#:
dotnet add package Azure.Identity
dotnet add package Azure.AI.Projects --version 1.0.0-beta.9
dotnet add package Azure.AI.Inference --version 1.0.0-beta.5
Configuration
Update the configuration file with your Azure AI Foundry project details:
- Python: Edit the .env file.
- C#: Edit the appsettings.json file.
Replace the placeholders:
- your_project_endpoint → Your actual project endpoint.
- your_model_deployment → The name of your gpt-4o deployment.
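For reference, a completed .env file might look like the example below. This is only a sketch: the variable names (PROJECT_ENDPOINT, MODEL_DEPLOYMENT) and the endpoint format are assumptions, so keep whatever names the provided file already uses and replace only the placeholder values. The appsettings.json file holds the same two values in JSON form.

# Illustrative .env contents (names and endpoint format are assumptions)
PROJECT_ENDPOINT=https://<your-resource>.services.ai.azure.com/api/projects/<your-project>
MODEL_DEPLOYMENT=gpt-4o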
Code References
Adding References
Python:
# Add references
from dotenv import load_dotenv
from azure.identity import DefaultAzureCredential
from azure.ai.projects import AIProjectClient
from azure.ai.inference.models import SystemMessage, UserMessage, AssistantMessage
C#:
// Add references
using Azure.Identity;
using Azure.AI.Projects;
using Azure.AI.Inference;
Initializing the Project Client
Python:
# Initialize the project client
projectClient = AIProjectClient(
    credential=DefaultAzureCredential(
        exclude_environment_credential=True,
        exclude_managed_identity_credential=True
    ),
    endpoint=project_connection,
)
C#:
// Initialize the project client
DefaultAzureCredentialOptions options = new()
{
    ExcludeEnvironmentCredential = true,
    ExcludeManagedIdentityCredential = true
};
var projectClient = new AIProjectClient(
    new Uri(project_connection),
    new DefaultAzureCredential(options)
);
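The snippets above reference project_connection and model_deployment without showing where they come from. In the Python version, a minimal sketch of loading them from the .env file could look like this (the environment variable names are assumptions; match them to the names in the provided .env file):

# Load configuration settings from the .env file (variable names assumed)
import os
from dotenv import load_dotenv

load_dotenv()
project_connection = os.getenv("PROJECT_ENDPOINT")
model_deployment = os.getenv("MODEL_DEPLOYMENT")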
Getting a Chat Client
Python:
# Get a chat client
chat = projectClient.inference.get_chat_completions_client()
C#:
// Get a chat client
ChatCompletionsClient chat = projectClient.GetChatCompletionsClient();
Initializing the Prompt with a System Message
Python:
# Initialize prompt with system message
prompt = [SystemMessage("You are a helpful AI assistant that answers questions.")]
C#:
// Initialize prompt with system message
var prompt = new List<ChatRequestMessage>()
{
    new ChatRequestSystemMessage("You are a helpful AI assistant that answers questions.")
};
Chat Interaction Loop
Python:
# Get a chat completion
prompt.append(UserMessage(input_text))
response = chat.complete(model=model_deployment, messages=prompt)
completion = response.choices[0].message.content
print(completion)
prompt.append(AssistantMessage(completion))
C#:
// Get a chat completion
prompt.Add(new ChatRequestUserMessage(input_text));
var requestOptions = new ChatCompletionsOptions()
{
    Model = model_deployment,
    Messages = prompt
};
Response<ChatCompletions> response = chat.Complete(requestOptions);
var completion = response.Value.Content;
Console.WriteLine(completion);
prompt.Add(new ChatRequestAssistantMessage(completion));
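In the finished app, this completion code runs inside an input loop that keeps appending messages to the prompt so the model retains the conversation history. A minimal Python sketch of such a loop, reusing the chat, prompt, and model_deployment objects created above (the "quit" convention is an assumption and may differ slightly from the lab's actual loop):

# Read user input until "quit", keeping the conversation history in prompt
while True:
    input_text = input("Enter the prompt (or type 'quit' to exit): ")
    if input_text.lower() == "quit":
        break
    if len(input_text) == 0:
        print("Please enter a prompt.")
        continue

    # Get a chat completion (same code as the Python snippet above)
    prompt.append(UserMessage(input_text))
    response = chat.complete(model=model_deployment, messages=prompt)
    completion = response.choices[0].message.content
    print(completion)
    prompt.append(AssistantMessage(completion))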
Running the Application
- Python: Run using: python chat-app.py
- C#: Run using: dotnet run
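Because the client authenticates with DefaultAzureCredential, an Azure sign-in must be available when the app runs. If you are not already signed in (for example, outside the lab's cloud shell), authenticating with the Azure CLI is usually sufficient:

az login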