Microsoft Agent Framework Middleware
Created: 23 Feb 2026 · Updated: 23 Feb 2026

Adding Middleware to Agents

Middleware lets you intercept and modify agent interactions at multiple levels without touching the agent or function logic itself. A common use is logging, but the same pattern applies to security validation, error handling, caching, and result transformation.

There are three middleware hooks available, each sitting at a different layer:

Middleware type | What it wraps
Agent Run Middleware | Every RunAsync() call on the agent.
Function Calling Middleware | Every tool/function the agent invokes during a run.
IChatClient Middleware | Every raw inference call to the underlying LLM service.

Step 1: Create a Simple Agent

Start with a basic agent that has one function tool. The tool returns the current date and time so we can trigger function calling middleware.

using System.ComponentModel;
using Microsoft.Agents.AI;
using Microsoft.Extensions.AI;
using OpenAI;

[Description("Returns the current local date and time.")]
static string GetDateTime() => DateTimeOffset.Now.ToString("f");

AIAgent baseAgent = new OpenAIClient("<your_api_key>")
    .GetChatClient("gpt-4o-mini")
    .AsIChatClient()
    .AsAIAgent(
        instructions: "You are a helpful assistant.",
        name: "Assistant",
        tools: [AIFunctionFactory.Create(GetDateTime, name: nameof(GetDateTime))]);

Step 2: Create Agent Run Middleware

An agent run middleware function receives the incoming messages and the inner agent. It must call innerAgent.RunAsync() to continue execution (unless it intentionally short-circuits). It returns the AgentResponse.

async Task<AgentResponse> AgentRunMiddleware(
    IEnumerable<ChatMessage> messages,
    AgentSession? session,
    AgentRunOptions? options,
    AIAgent innerAgent,
    CancellationToken cancellationToken)
{
    Console.WriteLine($"[AgentRunMiddleware] Input messages: {messages.Count()}");

    AgentResponse response = await innerAgent
        .RunAsync(messages, session, options, cancellationToken)
        .ConfigureAwait(false);

    Console.WriteLine($"[AgentRunMiddleware] Output messages: {response.Messages.Count}");

    return response;
}

Provide a streaming variant as well, and ideally register both together. If only the non-streaming variant is registered, the agent falls back to non-streaming mode even for streaming calls.

using System.Runtime.CompilerServices; // for [EnumeratorCancellation]

async IAsyncEnumerable<AgentResponseUpdate> AgentRunStreamingMiddleware(
    IEnumerable<ChatMessage> messages,
    AgentSession? session,
    AgentRunOptions? options,
    AIAgent innerAgent,
    [EnumeratorCancellation] CancellationToken cancellationToken)
{
    Console.WriteLine($"[StreamingMiddleware] Input messages: {messages.Count()}");

    List<AgentResponseUpdate> updates = [];
    await foreach (AgentResponseUpdate update in
        innerAgent.RunStreamingAsync(messages, session, options, cancellationToken))
    {
        updates.Add(update);
        yield return update;
    }

    Console.WriteLine($"[StreamingMiddleware] Total updates: {updates.Count}");
}

Step 3: Add Agent Run Middleware to the Agent

Use AsBuilder() on the base agent, add middleware with Use(), then call Build() to get a new agent with the middleware applied. The original agent is not modified.

AIAgent middlewareEnabledAgent = baseAgent
    .AsBuilder()
    .Use(runFunc: AgentRunMiddleware,
         runStreamingFunc: AgentRunStreamingMiddleware)
    .Build();

Console.WriteLine(await middlewareEnabledAgent.RunAsync("What is the current time?"));

Step 4: Create Function Calling Middleware

Function calling middleware intercepts every tool invocation. It receives a FunctionInvocationContext with the function name and arguments, and a next delegate to actually execute the function.

async ValueTask<object?> FunctionCallingMiddleware(
    AIAgent agent,
    FunctionInvocationContext context,
    Func<FunctionInvocationContext, CancellationToken, ValueTask<object?>> next,
    CancellationToken cancellationToken)
{
    Console.WriteLine($"[FuncMiddleware] Calling: {context.Function.Name}");

    object? result = await next(context, cancellationToken);

    Console.WriteLine($"[FuncMiddleware] Result: {result}");

    return result;
}

Note: Function calling middleware is only supported for agents using FunctionInvokingChatClient (e.g. ChatClientAgent).

Warning: You can stop the function call loop by setting context.Terminate = true, but this may leave the chat history in an inconsistent state and the session may become unusable for further runs.
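A minimal sketch of such a short-circuit, reusing the signature from Step 4 (IsAllowed is a hypothetical policy check, not part of the framework):

```csharp
async ValueTask<object?> BlockingFunctionMiddleware(
    AIAgent agent,
    FunctionInvocationContext context,
    Func<FunctionInvocationContext, CancellationToken, ValueTask<object?>> next,
    CancellationToken cancellationToken)
{
    // Hypothetical policy check -- replace with your own logic.
    if (!IsAllowed(context.Function.Name))
    {
        // Stop the function-calling loop; per the warning above, this can
        // leave the chat history in an inconsistent state.
        context.Terminate = true;
        return "Function call blocked by policy.";
    }

    return await next(context, cancellationToken);
}
```

Returning a value instead of calling next also lets you stub out a function result without terminating the run.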

Step 5: Add Function Calling Middleware to the Agent

AIAgent middlewareEnabledAgent = baseAgent
    .AsBuilder()
    .Use(FunctionCallingMiddleware)
    .Build();

Console.WriteLine(await middlewareEnabledAgent.RunAsync("What is the current time?"));

Both middleware types can be chained together in a single builder:

AIAgent middlewareEnabledAgent = baseAgent
    .AsBuilder()
    .Use(runFunc: AgentRunMiddleware, runStreamingFunc: AgentRunStreamingMiddleware)
    .Use(FunctionCallingMiddleware)
    .Build();

Step 6: Create IChatClient Middleware

IChatClient middleware sits below the agent and intercepts every raw call to the inference service. It receives the raw message list and returns a ChatResponse.

async Task<ChatResponse> ChatClientMiddleware(
    IEnumerable<ChatMessage> messages,
    ChatOptions? options,
    IChatClient innerChatClient,
    CancellationToken cancellationToken)
{
    Console.WriteLine($"[ChatClientMiddleware] Sending {messages.Count()} messages to LLM");

    ChatResponse response = await innerChatClient
        .GetResponseAsync(messages, options, cancellationToken);

    Console.WriteLine($"[ChatClientMiddleware] Received {response.Messages.Count} message(s)");

    return response;
}

Step 7: Add IChatClient Middleware

IChatClient middleware is added on the IChatClient before the agent is created, using the chat client builder pattern:

var chatClient = new OpenAIClient("<your_api_key>")
    .GetChatClient("gpt-4o-mini")
    .AsIChatClient();

var middlewareEnabledChatClient = chatClient
    .AsBuilder()
    .Use(getResponseFunc: ChatClientMiddleware, getStreamingResponseFunc: null)
    .Build();

var agent = new ChatClientAgent(
    middlewareEnabledChatClient,
    instructions: "You are a helpful assistant.");

Alternatively, use the clientFactory parameter directly in the helper methods on SDK clients:

var agent = new OpenAIClient("<your_api_key>")
    .GetChatClient("gpt-4o-mini")
    .AsAIAgent(
        "You are a helpful assistant.",
        clientFactory: chatClient => chatClient
            .AsBuilder()
            .Use(getResponseFunc: ChatClientMiddleware, getStreamingResponseFunc: null)
            .Build());

Execution Order

When all three types are active, the call flow for a single RunAsync("What is the current time?") looks like this:

[AgentRunMiddleware]  -> Input messages: 1
[ChatClientMiddleware] -> Sending to LLM (first turn: decide to call function)
[ChatClientMiddleware] <- Received from LLM
[FuncMiddleware]       -> Calling: GetDateTime
[FuncMiddleware]       <- Result: Monday, 23 February 2026 14:32
[ChatClientMiddleware] -> Sending to LLM (second turn: produce final answer)
[ChatClientMiddleware] <- Received from LLM
[AgentRunMiddleware]  <- Output messages: 3
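Wiring all three layers onto one agent could look like the following sketch, which only combines the builder calls and middleware functions shown in the steps above:

```csharp
AIAgent fullyWrappedAgent = new OpenAIClient("<your_api_key>")
    .GetChatClient("gpt-4o-mini")
    .AsIChatClient()
    // Innermost layer: wrap the raw inference calls.
    .AsBuilder()
    .Use(getResponseFunc: ChatClientMiddleware, getStreamingResponseFunc: null)
    .Build()
    .AsAIAgent(
        instructions: "You are a helpful assistant.",
        name: "Assistant",
        tools: [AIFunctionFactory.Create(GetDateTime, name: nameof(GetDateTime))])
    // Outer layers: wrap the agent run and the function-call loop.
    .AsBuilder()
    .Use(runFunc: AgentRunMiddleware, runStreamingFunc: AgentRunStreamingMiddleware)
    .Use(FunctionCallingMiddleware)
    .Build();
```

Note that the IChatClient middleware must be applied before the agent is created, while the other two are applied to the finished agent.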

Summary

Middleware type | Function signature | Registration
Agent Run | Task<AgentResponse>(messages, session, options, innerAgent, ct) | agent.AsBuilder().Use(runFunc: ...)
Agent Run Streaming | IAsyncEnumerable<AgentResponseUpdate>(messages, session, options, innerAgent, ct) | agent.AsBuilder().Use(runStreamingFunc: ...)
Function Calling | ValueTask<object?>(agent, context, next, ct) | agent.AsBuilder().Use(funcCallFunc)
IChatClient | Task<ChatResponse>(messages, options, innerChatClient, ct) | chatClient.AsBuilder().Use(getResponseFunc: ...)
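As one example of the non-logging uses mentioned in the introduction, caching fits naturally at the IChatClient layer. The sketch below is illustrative only (an in-memory cache keyed on the last message's text; assumes using System.Collections.Concurrent and System.Linq; not production-ready):

```csharp
ConcurrentDictionary<string, ChatResponse> cache = new();

async Task<ChatResponse> CachingChatClientMiddleware(
    IEnumerable<ChatMessage> messages,
    ChatOptions? options,
    IChatClient innerChatClient,
    CancellationToken cancellationToken)
{
    // Illustrative cache key: the text of the last message only.
    string key = messages.LastOrDefault()?.Text ?? string.Empty;

    if (cache.TryGetValue(key, out ChatResponse? cached))
    {
        return cached; // short-circuit: skip the LLM call entirely
    }

    ChatResponse response = await innerChatClient
        .GetResponseAsync(messages, options, cancellationToken);

    cache[key] = response;
    return response;
}
```

It registers exactly like ChatClientMiddleware in Step 7, via chatClient.AsBuilder().Use(getResponseFunc: CachingChatClientMiddleware, getStreamingResponseFunc: null).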

Required Packages

Package | Purpose
Microsoft.Agents.AI | AIAgent, AgentResponse, FunctionInvocationContext, builder APIs
Microsoft.Extensions.AI | IChatClient, ChatResponse, AIFunctionFactory
Microsoft.Extensions.AI.OpenAI | AsIChatClient() and OpenAI integration