A Guide to OpenAI and Semantic Kernel in .NET
In modern software development, integrating Large Language Models (LLMs) into daily workflows is becoming standard practice. However, a common pitfall is hardcoding sensitive API keys directly into the source code. This article demonstrates how to store your OpenAI API key securely and use it from a .NET 9 console application to break complex human tasks down into actionable steps.
1. The Security Layer: Windows Environment Variables
Before writing any code, you should store your API key somewhere that is not tracked by version control systems like Git. On Windows, the simplest approach is an environment variable.
How to set the variable:
- Open PowerShell or Command Prompt.
- Use the `setx` command to store the key persistently (replace the placeholder with your actual key).
- Crucial Step: Restart your IDE (JetBrains Rider, Visual Studio, or VS Code) after running this command. Applications only "see" new environment variables when they are first launched.
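The steps above can be sketched as follows (assuming the variable name `OPEN_AI_KEY`, which matches the C# lookup used later in this article):

```powershell
# Persists the variable for the current user; it takes effect only in
# processes started AFTER this command runs.
setx OPEN_AI_KEY "sk-your-actual-key-here"

# Open a NEW PowerShell window, then verify:
echo $Env:OPEN_AI_KEY
```

Note that `setx` does not update the current shell session, which is exactly why the IDE restart in the step above is required.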
2. The Implementation: Task Decomposition with Semantic Kernel
In this example, we use Microsoft Semantic Kernel to act as a "Task Organizer." The goal is to take a complex request—like preparing breakfast—and have the AI generate a sequence of simple instructions.
The .NET Code
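The original listing is not reproduced here, so the sketch below shows one way the described setup could look with the `Microsoft.SemanticKernel` NuGet package. The model id is a placeholder; swap in whichever chat model you have access to.

```csharp
using System;
using Microsoft.SemanticKernel;

// Read the key from the environment variable configured earlier.
var apiKey = Environment.GetEnvironmentVariable("OPEN_AI_KEY");
if (string.IsNullOrWhiteSpace(apiKey))
{
    Console.WriteLine("Please set the environment variable OPEN_AI_KEY.");
    return;
}

// Build a kernel backed by an OpenAI chat model (model id is a placeholder).
var kernel = Kernel.CreateBuilder()
    .AddOpenAIChatCompletion("gpt-4o-mini", apiKey)
    .Build();

// Ask the model to decompose a complex task into simple steps.
var prompt = """
    You are a task organizer. Break the following task into a
    numbered list of short, actionable steps:

    {{$task}}
    """;

var result = await kernel.InvokePromptAsync(
    prompt, new KernelArguments { ["task"] = "Prepare breakfast" });

Console.WriteLine(result);
```

The `{{$task}}` placeholder is Semantic Kernel's prompt-template syntax; the value is supplied at invocation time through `KernelArguments`, so the same prompt can organize any task you pass in.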
3. Troubleshooting: Why is the Key Returning Null?
If you run the code and see the "Please set the environment variable" message, check these three common issues:
- The Typo Trap: Ensure the string in `Environment.GetEnvironmentVariable("OPEN_AI_KEY")` matches your Windows variable name exactly. If you saved it as `OPENAI_API_KEY` in Windows but called it `OPEN_AI_KEY` in C#, it will return `null`.
- The Restart Requirement: If you added the variable while your IDE was open, the IDE is still running on the "old" environment state. Restart the IDE completely.
- Scope Issues: Ensure you added it to User Variables (specific to you) rather than System Variables if you don't have Administrator rights.
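A quick diagnostic can narrow down which of the three issues applies. The sketch below compares the process-level view with a direct read of the User scope, using the `EnvironmentVariableTarget` overload of `Environment.GetEnvironmentVariable` (variable name as in the code above):

```csharp
using System;

// 1. What the current process actually sees (catches the restart problem).
string? fromProcess = Environment.GetEnvironmentVariable("OPEN_AI_KEY");
Console.WriteLine($"Process scope: {(fromProcess is null ? "<null>" : "found")}");

// 2. The per-user value read directly from Windows (catches typo and scope
//    problems; on Windows this sees User variables even before a restart).
string? fromUser = Environment.GetEnvironmentVariable(
    "OPEN_AI_KEY", EnvironmentVariableTarget.User);
Console.WriteLine($"User scope: {(fromUser is null ? "<null>" : "found")}");
```

If the User scope finds the key but the Process scope does not, the variable exists and you simply need to restart the IDE or terminal. If neither finds it, check the spelling in Windows.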
4. Why Use Semantic Kernel?
While you could use a standard HttpClient to call OpenAI, Semantic Kernel provides several professional advantages:
- Abstraction: Easily switch between OpenAI, Azure OpenAI, or Hugging Face without changing your core logic.
- Middleware: Built-in support for logging, telemetry, and retry policies.
- Future-Proofing: It is designed to handle "Plugins" and "Functions," allowing the AI to eventually execute the tasks it generates.
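The abstraction point can be illustrated with a sketch: swapping the connector registration is the only change needed, while prompts and kernel calls stay the same. The model and deployment names below are placeholders, and the Azure variant assumes the separate Azure OpenAI connector package is installed.

```csharp
using System;
using Microsoft.SemanticKernel;

var builder = Kernel.CreateBuilder();

// OpenAI:
builder.AddOpenAIChatCompletion(
    modelId: "gpt-4o-mini",   // placeholder model id
    apiKey: Environment.GetEnvironmentVariable("OPEN_AI_KEY")!);

// ...or Azure OpenAI -- the rest of the application is unchanged:
// builder.AddAzureOpenAIChatCompletion(
//     deploymentName: "my-deployment",                    // placeholder
//     endpoint: "https://my-resource.openai.azure.com/",  // placeholder
//     apiKey: Environment.GetEnvironmentVariable("AZURE_OPENAI_KEY")!);

var kernel = builder.Build();
// Prompts, plugins, and InvokePromptAsync calls work the same either way.
```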