IChatClient is the standard .NET abstraction for chat-capable AI services (defined in Microsoft.Extensions.AI). Any library that wraps an LLM — OpenAI, Azure OpenAI, Ollama, etc. — can implement this interface, so your application code stays the same regardless of the underlying model.
The primary method is GetResponseAsync, which sends one or more messages and returns a ChatResponse.
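Because application code depends only on the interface, a helper method typed against IChatClient works unchanged no matter which provider the caller wires in. A minimal sketch (the SupportBot class and AnswerAsync name are illustrative, not part of the library):

```csharp
using Microsoft.Extensions.AI;

public static class SupportBot
{
    // Accepts any IChatClient implementation: OpenAI, Azure OpenAI, Ollama, or a test stub.
    public static async Task<string> AnswerAsync(IChatClient client, string question)
    {
        ChatResponse response = await client.GetResponseAsync(question);
        return response.Text;
    }
}
```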
Key Concepts
1. Creating an IChatClient for OpenAI
Use the AsIChatClient() extension method to wrap an OpenAI ChatClient with the standard abstraction:
IChatClient client = new OpenAIClient(apiKey)
    .GetChatClient("gpt-4o-mini")
    .AsIChatClient();
2. Single Question
Pass a plain string for the simplest possible request:
ChatResponse response = await client.GetResponseAsync("What does RAM stand for?");
Console.WriteLine(response.Text);
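Beyond Text, ChatResponse also exposes metadata such as the model id and token usage. Whether these properties are populated depends on the underlying provider, so treat this as a sketch:

```csharp
ChatResponse response = await client.GetResponseAsync("What does RAM stand for?");
Console.WriteLine(response.Text);
Console.WriteLine(response.ModelId);                // model that produced the reply, if reported
Console.WriteLine(response.Usage?.TotalTokenCount); // token usage, when the provider supplies it
```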
3. System Prompt + User Message
Pass a list of ChatMessage objects to include a system prompt that shapes the model's behaviour:
ChatResponse response = await client.GetResponseAsync(
[
    new ChatMessage(ChatRole.System, "You are a concise IT support agent. Answer in one sentence."),
    new ChatMessage(ChatRole.User, "Why is my computer running slowly?")
]);
4. Multi-Turn Conversation History
Maintain a List<ChatMessage> and append both the user message and the assistant's reply after every turn. The history.AddMessages(response) helper extracts the assistant messages from the ChatResponse and appends them for you:
var history = new List<ChatMessage>();
history.Add(new ChatMessage(ChatRole.User, "What OS does most of the internet run on?"));
ChatResponse r1 = await client.GetResponseAsync(history);
Console.WriteLine(r1.Text);
history.AddMessages(r1); // keep context for the next turn
history.Add(new ChatMessage(ChatRole.User, "What is its market share?"));
ChatResponse r2 = await client.GetResponseAsync(history);
Console.WriteLine(r2.Text);
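If you prefer to manage history by hand, AddMessages is roughly equivalent to appending the response's messages yourself; the sketch below assumes ChatResponse.Messages holds the assistant turn(s), as in current Microsoft.Extensions.AI versions:

```csharp
// Same effect as history.AddMessages(r2):
history.AddRange(r2.Messages);
```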
Setup
<PackageReference Include="Microsoft.Extensions.AI" Version="10.3.0" />
<PackageReference Include="Microsoft.Extensions.AI.OpenAI" Version="10.3.0" />
set OPEN_AI_KEY=sk-...your-key...
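The set line above is Windows (cmd) syntax; in bash or zsh the equivalent is:

```shell
export OPEN_AI_KEY=sk-...your-key...
```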
Full Example
using Microsoft.Extensions.AI;
using OpenAI;
namespace MicrosoftAgentFrameworkLesson.ConsoleApp.ChatClient;
/// <summary>
/// Demonstrates IChatClient.GetResponseAsync:
/// single question, system prompt, and multi-turn conversation history.
/// Scenario: IT help desk assistant.
/// </summary>
public static class BasicChatDemo
{
    public static async Task RunAsync()
    {
        var apiKey = Environment.GetEnvironmentVariable("OPEN_AI_KEY")
            ?? throw new InvalidOperationException("Set OPEN_AI_KEY environment variable.");

        IChatClient client = new OpenAIClient(apiKey)
            .GetChatClient("gpt-4o-mini")
            .AsIChatClient();

        Console.WriteLine("====== IChatClient — GetResponseAsync (Basic Chat) ======\n");

        // Demo 1: Single question
        Console.WriteLine("--- Demo 1: Single Question ---");
        ChatResponse r1 = await client.GetResponseAsync("What does RAM stand for in computers?");
        Console.WriteLine(r1.Text);
        Console.WriteLine();

        // Demo 2: System prompt + user message
        Console.WriteLine("--- Demo 2: System Prompt ---");
        ChatResponse r2 = await client.GetResponseAsync(
        [
            new ChatMessage(ChatRole.System, "You are a concise IT support agent. Answer in one sentence."),
            new ChatMessage(ChatRole.User, "Why is my computer running slowly?")
        ]);
        Console.WriteLine(r2.Text);
        Console.WriteLine();

        // Demo 3: Multi-turn conversation with history
        Console.WriteLine("--- Demo 3: Multi-Turn Conversation ---");
        var history = new List<ChatMessage>();
        string[] questions =
        [
            "What operating system does most of the internet run on?",
            "What is its market share among servers?",
            "Name one popular web server software for it."
        ];

        foreach (var question in questions)
        {
            Console.WriteLine($"Q: {question}");
            history.Add(new ChatMessage(ChatRole.User, question));

            ChatResponse response = await client.GetResponseAsync(history);
            Console.WriteLine($"A: {response.Text}");

            history.AddMessages(response); // keep assistant replies in context for the next turn
            Console.WriteLine();
        }
    }
}