Azure OpenAI Service now includes the new o3‑mini reasoning model—a lighter, cost‑efficient successor to earlier reasoning models (such as o1‑mini) that brings several new capabilities to the table. These enhancements include:
- Reasoning Effort Control: Adjust the model’s cognitive load (low, medium, high) to balance response speed and depth.
- Structured Outputs: Generate well‑defined, JSON‑structured responses to support automated workflows.
- Functions and Tools Support: Seamlessly integrate with external functions to extend AI capabilities.
- Developer Messages: A new “developer” role replaces the legacy system message, allowing for more flexible instruction handling.
- Enhanced STEM Performance: Improved capabilities in coding, mathematics, and scientific reasoning.
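For orientation, these options surface as fields on the chat completions request. A sketch of the REST payload might look like the following (field names follow the Azure OpenAI chat completions API; the schema and message contents are illustrative):

```json
{
  "messages": [
    { "role": "developer", "content": "You are a concise assistant. Answer in JSON." },
    { "role": "user", "content": "Summarize the laws of thermodynamics." }
  ],
  "reasoning_effort": "medium",
  "response_format": {
    "type": "json_schema",
    "json_schema": {
      "name": "summary",
      "schema": {
        "type": "object",
        "properties": { "summary": { "type": "string" } },
        "required": ["summary"]
      }
    }
  }
}
```

Note the "developer" role in place of the legacy "system" role, and the "reasoning_effort" field accepting "low", "medium", or "high".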
In addition to these advances, Microsoft’s new o3‑mini is now complemented by Semantic Kernel—a powerful, open‑source SDK that enables developers to combine AI services (like Azure OpenAI) with custom code easily. Semantic Kernel provides an orchestration layer to integrate plugins, planners, and services, allowing you to build robust and modular AI applications in C#.
Prerequisites
Before getting started, ensure you have:
- An Azure account with an Azure OpenAI Service resource provisioned.
- Your API endpoint (e.g., https://<your-resource-name>.openai.azure.com/) and an API key.
- A deployment for your o3‑mini model (e.g., “o3‑mini” or “o3‑mini‑high”).
- .NET 8 (or later) and an IDE (e.g., Rider, Visual Studio or VS Code).
- (Optional) Familiarity with Semantic Kernel and the corresponding NuGet packages.
Setting Up Your Project
- Create a New Console Application Open your terminal or IDE and run:
dotnet new console -n AzureO3MiniDemo
cd AzureO3MiniDemo
- Install Required NuGet Packages Install both the Azure OpenAI client library and Semantic Kernel:
dotnet add package Azure.AI.OpenAI
dotnet add package Microsoft.SemanticKernel
Semantic Kernel provides a unified interface to orchestrate AI models and plugins.
Code Sample: Using o3‑mini with Semantic Kernel in C#
Below is a complete C# code sample demonstrating how to use the o3‑mini model from Azure OpenAI Service directly—and how to integrate Semantic Kernel to add an orchestration layer. This lets you later add custom functions (plugins) that can be automatically invoked by your agent.
Note: The code includes placeholders for new options (such as ReasoningEffort) and is structured to work with Semantic Kernel’s abstractions. Please consult the latest Semantic Kernel documentation for the precise API details.
using System;
using System.Threading.Tasks;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;

namespace AzureO3MiniDemo
{
    class Program
    {
        static async Task Main(string[] args)
        {
            // Replace with your Azure OpenAI endpoint and API key.
            string endpoint = "https://<your-resource-name>.openai.azure.com/";
            string apiKey = "<your-api-key>";

            // The deployment name for your o3-mini model.
            string deploymentName = "o3-mini";

            // Set up Semantic Kernel and register the Azure OpenAI chat completion service.
            var kernelBuilder = Kernel.CreateBuilder();
            kernelBuilder.AddAzureOpenAIChatCompletion(deploymentName, endpoint, apiKey);

            // Optionally, add custom plugins here.
            // For example: kernelBuilder.Plugins.AddFromType<YourCustomPlugin>();
            Kernel kernel = kernelBuilder.Build();

            // Configure execution settings. Note: o3-mini is a reasoning model and
            // ignores sampling parameters such as Temperature.
            var settings = new OpenAIPromptExecutionSettings
            {
                MaxTokens = 100
                // NEW: Set the reasoning effort level ("low", "medium", "high")
                // if your SDK version exposes it, e.g.:
                // ReasoningEffort = "medium",
                // (Optional) Request structured output by supplying a type or JSON schema:
                // ResponseFormat = typeof(PoemResult)
            };

            // Build the conversation as a chat history.
            var history = new ChatHistory();
            history.AddUserMessage("Write a short poem about the beauty of nature.");

            try
            {
                // Query the o3-mini model through the Semantic Kernel abstraction.
                var chatService = kernel.GetRequiredService<IChatCompletionService>();
                var reply = await chatService.GetChatMessageContentAsync(history, settings, kernel);

                Console.WriteLine("Response from o3-mini:");
                Console.WriteLine(reply.Content?.Trim());
                Console.WriteLine(new string('-', 40));
            }
            catch (Exception ex)
            {
                Console.WriteLine($"An error occurred: {ex.Message}");
            }
        }
    }
}
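Structured outputs can be requested through the execution settings rather than by parsing free text. A minimal sketch, assuming your connector version supports assigning a .NET type to ResponseFormat (the PoemResult record is hypothetical and defined here for illustration):

```csharp
using System.Text.Json;
using Microsoft.SemanticKernel.Connectors.OpenAI;

// Hypothetical shape for the structured response.
public record PoemResult(string Poem);

public static class StructuredOutputSketch
{
    public static OpenAIPromptExecutionSettings CreateSettings()
    {
        return new OpenAIPromptExecutionSettings
        {
            // Ask the service to emit JSON matching PoemResult's schema (if supported).
            ResponseFormat = typeof(PoemResult)
        };
    }

    // The model's reply content can then be deserialized directly:
    public static PoemResult Parse(string replyContent)
        => JsonSerializer.Deserialize<PoemResult>(replyContent)!;
}
```

This pairs well with automated workflows, since downstream code consumes a typed object instead of scraping prose.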
Integrating Semantic Kernel Plugins
Semantic Kernel allows you to extend your application with custom plugins. For example, you can create functions that use Azure Search or other services and have them automatically invoked based on user input. This makes it easier to build AI agents that are both flexible and tailored to your business logic.
Example: Adding a Custom Plugin
Below is a simplified example of a custom plugin function that could be added to your Semantic Kernel setup. This plugin might, for instance, fetch additional context or data needed by your application:
using System.ComponentModel;
using System.Threading.Tasks;
using Microsoft.SemanticKernel;

public class CustomDataPlugin
{
    [KernelFunction, Description("Fetches additional context data for the prompt")]
    [return: Description("A string containing supplemental data.")]
    public async Task<string> GetSupplementalDataAsync(
        [Description("Parameter for the data query")] string query)
    {
        // Your logic here, e.g., make an HTTP call to fetch data.
        await Task.Delay(100); // Simulate an async operation.
        return $"Supplemental data for query: {query}";
    }
}
Once defined, you can register your plugin with the kernel builder:
kernelBuilder.Plugins.AddFromType<CustomDataPlugin>();
Semantic Kernel can now invoke this plugin function automatically when the context of the user's input calls for it, provided automatic function calling is enabled in your execution settings.
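Enabling that automatic invocation is a matter of execution settings. A hedged sketch, assuming a recent connector version that exposes FunctionChoiceBehavior (older versions use ToolCallBehavior.AutoInvokeKernelFunctions instead):

```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.OpenAI;

public static class AutoInvokeSketch
{
    public static OpenAIPromptExecutionSettings CreateSettings()
    {
        return new OpenAIPromptExecutionSettings
        {
            // Let the model decide when to call registered kernel functions,
            // and have Semantic Kernel invoke them automatically.
            FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
        };
    }
}
```

Pass these settings, along with the kernel (so the registered functions are visible to the model), on each chat completion request.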
Running the Application
- Replace the placeholders <your-resource-name> and <your-api-key> with your actual values.
- Save your changes and run the application using:
dotnet run
- You should see an output similar to:
Response from o3-mini:
Nature whispers softly in the breeze,
Dancing leaves tell secrets with ease.
----------------------------------------
Conclusion
This article demonstrates how to use the new o3‑mini model on Azure OpenAI Service with C# and how to further enhance your application by integrating Semantic Kernel. With Semantic Kernel, you can easily orchestrate AI functions, add custom plugins, and switch between providers (OpenAI vs. Azure OpenAI) with minimal changes to your codebase. This makes it an excellent tool for building sophisticated AI agents and applications.
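As a sketch of that provider switch, only the service registration line changes (the model ID and key values shown are placeholders):

```csharp
using Microsoft.SemanticKernel;

public static class ProviderSwitchSketch
{
    public static IKernelBuilder ConfigureAzure(string endpoint, string apiKey)
    {
        var builder = Kernel.CreateBuilder();
        // Azure OpenAI: deployment name + resource endpoint + key.
        builder.AddAzureOpenAIChatCompletion("o3-mini", endpoint, apiKey);
        return builder;
    }

    public static IKernelBuilder ConfigureOpenAI(string apiKey)
    {
        var builder = Kernel.CreateBuilder();
        // OpenAI: model ID + key; the rest of the application is unchanged.
        builder.AddOpenAIChatCompletion(modelId: "o3-mini", apiKey: apiKey);
        return builder;
    }
}
```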
For more details, check out the official Semantic Kernel documentation.
Happy coding!