Microsoft Agent Framework / Microsoft.Extensions.AI
Created: 27 Feb 2026 · Updated: 27 Feb 2026

IChatClient: Functionality Pipelines

IChatClient instances can be layered into a pipeline using ChatClientBuilder. Each layer adds a cross-cutting concern (caching, function invocation, logging, rate limiting, etc.) and passes the request down to the next layer. The order of the Use* calls determines the order in which layers wrap each other.

Key Concepts

1. Building a Pipeline

Chain Use* extension methods on a ChatClientBuilder and finish with Build():

IChatClient client = new ChatClientBuilder(
        new OpenAIClient(apiKey).GetChatClient("gpt-4o-mini").AsIChatClient()) // innermost: OpenAI
    .UseDistributedCache(cache)  // outermost layer: cache hits answered here
    .UseFunctionInvocation()     // middle layer: tool calls handled here
    .Build();                    // finalizes the pipeline

2. Layer Order Matters

The first Use* call is the outermost layer — it intercepts requests first. In the example above, a cached response is returned without ever invoking any tools or calling OpenAI. If not cached, the request passes through to the function-invocation layer, and then to OpenAI.
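When the built-in Use* layers aren't enough, you can add your own layer by subclassing DelegatingChatClient, the base class Microsoft.Extensions.AI provides for pipeline middleware, and registering it with the builder's Use overload that takes a client factory. The sketch below is a hypothetical TimingChatClient (not part of the library), written against the current Microsoft.Extensions.AI signatures; verify the override signature against the package version you reference:

```csharp
using System.Diagnostics;
using Microsoft.Extensions.AI;

// Hypothetical middleware layer: times everything beneath it in the pipeline.
public sealed class TimingChatClient(IChatClient innerClient) : DelegatingChatClient(innerClient)
{
    public override async Task<ChatResponse> GetResponseAsync(
        IEnumerable<ChatMessage> messages,
        ChatOptions? options = null,
        CancellationToken cancellationToken = default)
    {
        var sw = Stopwatch.StartNew();
        ChatResponse response = await base.GetResponseAsync(messages, options, cancellationToken);
        Console.WriteLine($"[pipeline] inner layers took {sw.ElapsedMilliseconds} ms");
        return response;
    }
}
```

Registered with `.Use(inner => new TimingChatClient(inner))`, its position in the Use* chain determines exactly which layers it times: placed before UseDistributedCache it measures everything including cache hits; placed after, it measures only cache misses.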

3. Combined Pipeline with Tools and Cache

Pass ChatOptions.Tools as normal — the pipeline handles routing tool calls automatically:

var options = new ChatOptions
{
    Tools = [ AIFunctionFactory.Create(GetExchangeRate, "GetExchangeRate", "...") ]
};

ChatResponse response = await client.GetResponseAsync("What is USD to EUR?", options);

4. Observing Cache vs. Live

Measure elapsed time to see whether a response came from cache (near 0 ms) or from the API (hundreds of ms):

var sw = Stopwatch.StartNew();
ChatResponse r = await client.GetResponseAsync(q, options);
sw.Stop();
Console.WriteLine($"[{sw.ElapsedMilliseconds} ms] {r.Text}");
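Timing is a quick proxy; for structured visibility into each request and response, a logging layer can be added to the same builder. The sketch below assumes the UseLogging extension shipped with Microsoft.Extensions.AI and reuses the apiKey and cache variables from the surrounding example:

```csharp
using Microsoft.Extensions.AI;
using Microsoft.Extensions.Logging;
using OpenAI;

ILoggerFactory loggerFactory = LoggerFactory.Create(b => b.AddConsole());

IChatClient client = new ChatClientBuilder(
        new OpenAIClient(apiKey).GetChatClient("gpt-4o-mini").AsIChatClient())
    .UseDistributedCache(cache)
    .UseLogging(loggerFactory)   // placed after the cache, so it logs only cache misses
    .UseFunctionInvocation()
    .Build();
```

Because cache hits short-circuit at the outermost layer, the console stays quiet on repeated questions, which is another way to confirm a response came from cache rather than the API.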

Full Example

using Microsoft.Extensions.AI;
using Microsoft.Extensions.Caching.Distributed;
using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.Options;
using OpenAI;
using System.Diagnostics;

namespace MicrosoftAgentFrameworkLesson.ConsoleApp.ChatClient;

/// <summary>
/// Demonstrates a multi-layer IChatClient pipeline:
/// DistributedCache -> FunctionInvocation -> OpenAI
/// Scenario: Currency converter assistant.
/// Repeated questions are answered from cache; live rates come from tool calls.
/// </summary>
public static class PipelineDemo
{
    private static string GetExchangeRate(string from, string to)
    {
        var key = $"{from.ToUpperInvariant()}/{to.ToUpperInvariant()}";
        var rates = new Dictionary<string, double>
        {
            ["USD/EUR"] = 0.92,
            ["USD/GBP"] = 0.79,
            ["EUR/USD"] = 1.09,
            ["GBP/USD"] = 1.27
        };
        return rates.TryGetValue(key, out var rate)
            ? $"1 {from.ToUpperInvariant()} = {rate} {to.ToUpperInvariant()}"
            : $"Rate not available for {key}";
    }

    public static async Task RunAsync()
    {
        var apiKey = Environment.GetEnvironmentVariable("OPEN_AI_KEY")
            ?? throw new InvalidOperationException("Set OPEN_AI_KEY environment variable.");

        IDistributedCache cache = new MemoryDistributedCache(
            Options.Create(new MemoryDistributedCacheOptions()));

        // Pipeline: Cache -> FunctionInvocation -> OpenAI
        IChatClient client = new ChatClientBuilder(
                new OpenAIClient(apiKey).GetChatClient("gpt-4o-mini").AsIChatClient())
            .UseDistributedCache(cache)
            .UseFunctionInvocation()
            .Build();

        var options = new ChatOptions
        {
            Tools =
            [
                AIFunctionFactory.Create(
                    GetExchangeRate,
                    "GetExchangeRate",
                    "Returns the exchange rate between two currencies (e.g. from=USD, to=EUR)")
            ]
        };

        Console.WriteLine("====== IChatClient — Functionality Pipeline ======\n");
        Console.WriteLine("Pipeline layers: DistributedCache -> FunctionInvocation -> OpenAI\n");

        string[] queries =
        [
            "What is the USD to EUR rate?",
            "How many euros will I get for 200 US dollars?",
            "What is the USD to EUR rate?" // served from cache
        ];

        foreach (var q in queries)
        {
            var sw = Stopwatch.StartNew();
            ChatResponse response = await client.GetResponseAsync(q, options);
            sw.Stop();

            Console.WriteLine($"Q: {q}");
            Console.WriteLine($"A: {response.Text}");
            Console.WriteLine($"   [{sw.ElapsedMilliseconds} ms]");
            Console.WriteLine();
        }
    }
}