Some Resolutions Are Meant to Be Broken

Every new year begins with a list of resolutions—promises we make to ourselves, vowing to improve, cut back, or shift priorities. Some of these resolutions are necessary and life-changing, but others? They are meant to be broken.

Take my own experience as an example. At the end of 2024, I made a firm commitment: In 2025, I would do fewer events. The logic was sound—I wanted to reclaim time for deep work, personal projects, and perhaps a little breathing room. I told myself that after years of a packed calendar, it was time to scale back.

Fast forward to February 2025, and I can already admit: I have failed spectacularly. Not only did I not reduce the number of events I’m involved in, but I’m actually doing more than ever. I find myself saying yes to opportunities that align with my passion, expanding my reach, and engaging in discussions that truly matter.

A few already on the calendar: the Microsoft MVP Summit, an Azure Day on Valentine's Day, the Linux Foundation Empowering Night, and the Microsoft AI Tour (aitour.microsoft.com).

Why Do We Break Certain Resolutions?

1. Some Goals Sound Good in Theory, but Reality Has Other Plans

At the time, I believed that fewer events would equate to more focus. What I didn’t account for was that my nature—my passion for connecting, sharing knowledge, and building communities—would make this nearly impossible. When invitations and opportunities came knocking, I had to ask myself: Am I avoiding these for the sake of a resolution, or am I saying no to something that aligns with who I am?

2. Resolutions Should Evolve with Your Growth

The resolution to do fewer events was made at a time when I felt the need for change. But growth isn’t always about doing less; sometimes, it’s about doing more of the right things. In 2025, I’m not just doing more events—I’m choosing more meaningful ones, aligning with initiatives that have impact.

3. Passion Wins Over Restriction

Some resolutions require discipline—like exercising more or cutting down on distractions. But others, like limiting opportunities for engagement, can become artificial restrictions that go against your strengths. The key is recognizing when a resolution is serving you and when it’s holding you back.

The Lesson? Adjust, Don’t Abandon Growth

This experience has taught me that instead of setting arbitrary limits, I should focus on better curation. It’s not about fewer events—it’s about the right events. It’s about ensuring that each engagement adds value, aligns with my mission, and keeps me energized rather than drained.

So, if you find yourself breaking a resolution, ask yourself: Am I failing, or am I just evolving? Because some resolutions are meant to be broken, and sometimes, that’s exactly what needs to happen.

How to Jump-Start with o3-mini on Azure!

Azure OpenAI Service now includes the new o3‑mini reasoning model—a lighter, cost‑efficient successor to earlier reasoning models (such as o1‑mini) that brings several new capabilities to the table. These enhancements include (with a request-level sketch just after the list):

  • Reasoning Effort Control: Adjust the model’s cognitive load (low, medium, high) to balance response speed and depth.
  • Structured Outputs: Generate well‑defined, JSON‑structured responses to support automated workflows.
  • Functions and Tools Support: Seamlessly integrate with external functions to extend AI capabilities.
  • Developer Messages: A new “developer” role replaces the legacy system message, allowing for more flexible instruction handling.
  • Enhanced STEM Performance: Improved capabilities in coding, mathematics, and scientific reasoning.

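To make those bullets concrete before diving into Semantic Kernel, here is a minimal sketch of how the new options surface at the REST level. Treat the api-version and exact field names as assumptions to verify against the current Azure OpenAI reference; the point is simply that reasoning effort and the developer role are ordinary request fields:

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

// Sketch only: endpoint, deployment, and api-version are placeholders/assumptions.
// Check the Azure OpenAI reference for the api-version that enables reasoning models for you.
public static class RawO3MiniRequest
{
    public static async Task SendAsync(string endpoint, string apiKey, string deployment)
    {
        // "developer" replaces the legacy "system" role for reasoning models,
        // and "reasoning_effort" selects low / medium / high reasoning depth.
        const string body = """
        {
          "messages": [
            { "role": "developer", "content": "You are a concise assistant." },
            { "role": "user", "content": "Summarize the benefits of reasoning models." }
          ],
          "reasoning_effort": "medium",
          "max_completion_tokens": 200
        }
        """;

        using var http = new HttpClient();
        http.DefaultRequestHeaders.Add("api-key", apiKey);

        string url = $"{endpoint.TrimEnd('/')}/openai/deployments/{deployment}/chat/completions?api-version=2024-12-01-preview";
        HttpResponseMessage response = await http.PostAsync(url, new StringContent(body, Encoding.UTF8, "application/json"));

        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}
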
In addition to these advances, Microsoft’s new o3‑mini is now complemented by Semantic Kernel—a powerful, open‑source SDK that enables developers to combine AI services (like Azure OpenAI) with custom code easily. Semantic Kernel provides an orchestration layer to integrate plugins, planners, and services, allowing you to build robust and modular AI applications in C#.
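
As a first taste, a minimal Semantic Kernel program against an Azure OpenAI deployment can be as small as the sketch below (credentials are placeholders; the full, commented walkthrough follows in the next sections):

// Minimal sketch: builder + Azure OpenAI connector + a single prompt.
// Requires the Microsoft.SemanticKernel NuGet package and real endpoint/key values.
using System;
using Microsoft.SemanticKernel;

var kernel = Kernel.CreateBuilder()
    .AddAzureOpenAIChatCompletion(
        deploymentName: "o3-mini",
        endpoint: "https://<your-resource-name>.openai.azure.com/",
        apiKey: "<your-api-key>")
    .Build();

// InvokePromptAsync routes the prompt through the registered chat completion service.
Console.WriteLine(await kernel.InvokePromptAsync("Say hello from o3-mini."));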


Prerequisites

Before getting started, ensure you have:

  • An Azure account with an Azure OpenAI Service resource provisioned.
  • Your API endpoint (e.g., https://<your-resource-name>.openai.azure.com/) and an API key.
  • A deployment of the o3‑mini model (the deployment name is whatever you chose when deploying, e.g., “o3‑mini”).
  • .NET 8 (or later) and an IDE (e.g., Rider, Visual Studio or VS Code).
  • (Optional) Familiarity with Semantic Kernel and the corresponding NuGet packages.

Setting Up Your Project

  1. Create a New Console Application. Open your terminal or IDE and run:
     dotnet new console -n AzureO3MiniDemo
     cd AzureO3MiniDemo
  2. Install Required NuGet Packages. Install both the Azure OpenAI client library and Semantic Kernel:
     dotnet add package Azure.AI.OpenAI
     dotnet add package Microsoft.SemanticKernel
     Semantic Kernel provides a unified interface to orchestrate AI models and plugins.

Code Sample: Using o3‑mini with Semantic Kernel in C#

Below is a complete C# code sample demonstrating how to call the o3‑mini model on Azure OpenAI Service through Semantic Kernel’s orchestration layer. This lets you later add custom functions (plugins) that can be automatically invoked by your agent.

Note: The code includes placeholders for new properties (like ReasoningEffort) and is structured to work with Semantic Kernel’s abstractions. Please consult the latest Semantic Kernel documentation for the precise API details.

using System;
using System.Threading.Tasks;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;

namespace AzureO3MiniDemo
{
    // (Optional) Define an enum for reasoning effort if supported by your SDK version.
    public enum ReasoningEffort
    {
        Low,
        Medium,
        High
    }

    class Program
    {
        static async Task Main(string[] args)
        {
            // Replace with your Azure OpenAI endpoint and API key.
            string endpoint = "https://<your-resource-name>.openai.azure.com/";
            string apiKey = "<your-api-key>";
            // The deployment name for your o3-mini model.
            string deploymentName = "o3-mini";

            // Set up Semantic Kernel and add the Azure OpenAI chat completion service.
            // The connector creates the underlying Azure OpenAI client for you; you only need
            // the Azure.AI.OpenAI package directly if you want to call the REST API yourself.
            var kernelBuilder = Kernel.CreateBuilder();
            kernelBuilder.AddAzureOpenAIChatCompletion(deploymentName, endpoint, apiKey);

            // Optionally, add custom plugins here.
            // For example: kernelBuilder.Plugins.AddFromType<YourCustomPlugin>();

            Kernel kernel = kernelBuilder.Build();

            // Configure completion settings. Reasoning models such as o3-mini ignore sampling
            // parameters like Temperature, and reasoning tokens count toward the output limit,
            // so leave some headroom in MaxTokens.
            var executionSettings = new OpenAIPromptExecutionSettings
            {
                MaxTokens = 1000
                // NEW: Set the reasoning effort level (low/medium/high) if your SDK version exposes it.
                // (Optional) Request structured JSON output via ResponseFormat if supported.
            };

            // Build the conversation. o3-mini replaces the legacy system message with a
            // "developer" role; plain user messages work unchanged.
            var chatHistory = new ChatHistory();
            chatHistory.AddUserMessage("Write a short poem about the beauty of nature.");

            try
            {
                // Query the o3-mini model through the Semantic Kernel abstraction.
                var chatService = kernel.GetRequiredService<IChatCompletionService>();
                var result = await chatService.GetChatMessageContentAsync(
                    chatHistory, executionSettings, kernel);

                Console.WriteLine("Response from o3-mini:");
                Console.WriteLine(result.Content?.Trim());
                Console.WriteLine(new string('-', 40));
            }
            catch (Exception ex)
            {
                Console.WriteLine($"An error occurred: {ex.Message}");
            }
        }
    }
}

Integrating Semantic Kernel Plugins

Semantic Kernel allows you to extend your application with custom plugins. For example, you can create functions that use Azure Search or other services and have them automatically invoked based on user input. This makes it easier to build AI agents that are both flexible and tailored to your business logic.

Example: Adding a Custom Plugin

Below is a simplified example of a custom plugin function that could be added to your Semantic Kernel setup. This plugin might, for instance, fetch additional context or data needed by your application:

using System.ComponentModel;
using System.Threading.Tasks;
using Microsoft.SemanticKernel;

public class CustomDataPlugin
{
    [KernelFunction, Description("Fetches additional context data for the prompt")]
    [return: Description("A string containing supplemental data.")]
    public async Task<string> GetSupplementalDataAsync([Description("Parameter for the data query")] string query)
    {
        // Your logic here, e.g., make an HTTP call to fetch data.
        await Task.Delay(100); // Simulate async operation.
        return $"Supplemental data for query: {query}";
    }
}

Once defined, you can register your plugin with the kernel builder:

kernelBuilder.Plugins.AddFromType<CustomDataPlugin>();

With function calling enabled in your execution settings, Semantic Kernel can now invoke this plugin function automatically when the context of your user input suggests it is needed, as shown in the sketch below.
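
Here is a minimal sketch of what that looks like, assuming a recent Semantic Kernel version where automatic invocation is opted into via FunctionChoiceBehavior on the execution settings (older versions expose ToolCallBehavior instead); the prompt is purely illustrative:

using System;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.OpenAI;

// Let the model decide when to call registered plugin functions and invoke them automatically.
var settings = new OpenAIPromptExecutionSettings
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};

// The model may call CustomDataPlugin.GetSupplementalDataAsync behind the scenes
// before producing its final answer.
var answer = await kernel.InvokePromptAsync(
    "Use any supplemental data you can find for 'Azure OpenAI' and summarize it.",
    new KernelArguments(settings));

Console.WriteLine(answer);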


Running the Application

  1. Replace the placeholders for <your-resource-name> and <your-api-key> with your actual values.
  2. Save your changes and run the application using:
     dotnet run
  3. You should see an output similar to:
     Response from o3-mini:
     Nature whispers softly in the breeze,
     Dancing leaves tell secrets with ease.
     ----------------------------------------

Conclusion

This article demonstrates how to use the new o3‑mini model on Azure OpenAI Service with C# and how to further enhance your application by integrating Semantic Kernel. With Semantic Kernel, you can easily orchestrate AI functions, add custom plugins, and switch between providers (OpenAI vs. Azure OpenAI) with minimal changes to your codebase. This makes it an excellent tool for building sophisticated AI agents and applications.
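
For example, switching the sample above from Azure OpenAI to the OpenAI platform is roughly a one-line change in the builder (assuming current connector method names; the rest of the application stays the same):

// Azure OpenAI: deployment name + endpoint + key.
kernelBuilder.AddAzureOpenAIChatCompletion("o3-mini", endpoint, apiKey);

// OpenAI platform: model id + key.
kernelBuilder.AddOpenAIChatCompletion("o3-mini", apiKey);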

For more details on Semantic Kernel, check out the official documentation at https://learn.microsoft.com/semantic-kernel/ and the open-source repository at https://github.com/microsoft/semantic-kernel.

Happy coding!