Microsoft Agent Framework Workflows
Created: 25 Feb 2026 | Updated: 25 Feb 2026

Agents in Workflows

This lesson shows how to embed AI agents directly inside workflows. Instead of writing custom Executor classes with deterministic logic, you create ChatClientAgent instances backed by a real LLM (OpenAI in this example) and wire them together with WorkflowBuilder. The framework treats every agent as a first-class executor: it receives messages, calls the model, and forwards its response to the next step.

Key Concepts

1. ChatClientAgent

ChatClientAgent wraps an IChatClient (from Microsoft.Extensions.AI) and turns it into a workflow executor. You create one with the extension method AsAIAgent():

IChatClient chatClient = new OpenAIClient(apiKey)
    .GetChatClient("gpt-4o-mini")
    .AsIChatClient();

var agent = chatClient.AsAIAgent(
    name: "Chef",
    instructions: "You are a professional chef...");

Each agent keeps its own system prompt (instructions) while sharing the same underlying IChatClient for API calls.

2. TurnToken

Agents do not start processing the moment they receive a message. They cache incoming messages and wait for a TurnToken. You send it after opening the streaming run:

await run.TrySendMessageAsync(new TurnToken(emitEvents: true));

Setting emitEvents: true tells the agent to emit AgentResponseUpdateEvent tokens while generating its answer.

3. AgentResponseUpdateEvent

As an agent streams its LLM response, the workflow emits one AgentResponseUpdateEvent per token. Each event carries:

  1. ExecutorId — the agent that produced the token
  2. Data — the token text

You consume them through WatchStreamAsync():

await foreach (var evt in run.WatchStreamAsync())
{
    if (evt is AgentResponseUpdateEvent update)
        Console.Write(update.Data);
}

4. Agents as Executors

Because ChatClientAgent implements the executor interface, you use the same WorkflowBuilder fluent API you already know. No special adapter is needed — pass the agent straight to AddEdge():

var workflow = new WorkflowBuilder(chefAgent)
    .AddEdge(chefAgent, nutritionistAgent)
    .AddEdge(nutritionistAgent, presenterAgent)
    .Build();

Setting Up OpenAI

The demo uses the Microsoft.Extensions.AI.OpenAI package. Make sure these NuGet packages are referenced:

<PackageReference Include="Microsoft.Agents.AI" Version="1.0.0-rc1" />
<PackageReference Include="Microsoft.Agents.AI.Workflows" Version="1.0.0-rc1" />
<PackageReference Include="Microsoft.Extensions.AI.OpenAI" Version="10.3.0" />

Set your API key as an environment variable before running:

set OPENAI_API_KEY=sk-...your-key...
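The `set` command above is for the Windows command prompt. If you are on macOS or Linux (or using PowerShell), the equivalent is:

```shell
# bash / zsh (macOS, Linux)
export OPENAI_API_KEY=sk-...your-key...

# PowerShell:
#   $env:OPENAI_API_KEY = "sk-...your-key..."
```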

Building a Linear Agent Pipeline

In the first demo, three agents form a sequential chain: Chef → Nutritionist → Presenter. The Chef creates a recipe from ingredients, the Nutritionist adds a calorie analysis, and the Presenter writes an appetizing summary.

var chef = chatClient.AsAIAgent(
    name: "Chef",
    instructions: "You are a professional chef. "
        + "Given a list of ingredients, create a very short recipe (2-3 sentences).");

var nutritionist = chatClient.AsAIAgent(
    name: "Nutritionist",
    instructions: "Given a recipe, provide a brief nutritional summary (2 sentences).");

var presenter = chatClient.AsAIAgent(
    name: "Presenter",
    instructions: "Combine the recipe and nutritional info into one appealing paragraph.");

var workflow = new WorkflowBuilder(chef)
    .AddEdge(chef, nutritionist)
    .AddEdge(nutritionist, presenter)
    .Build();

Run it with streaming:

await using var run = await InProcessExecution.RunStreamingAsync(
    workflow,
    new ChatMessage(ChatRole.User, "chicken, garlic, lemon, rosemary"));

await run.TrySendMessageAsync(new TurnToken(emitEvents: true));

await foreach (var evt in run.WatchStreamAsync())
{
    if (evt is AgentResponseUpdateEvent update)
        Console.Write($"[{update.ExecutorId}] {update.Data}");
}

Fan-Out Agent Pipeline

The second demo sends the Chef's recipe to two agents at the same time: Nutritionist and Critic. Both run in the same superstep (the framework follows the bulk synchronous parallel, or BSP, execution model), so they process in parallel:

var workflow = new WorkflowBuilder(chef)
    .AddEdge(chef, nutritionist)  // fan-out edge 1
    .AddEdge(chef, critic)        // fan-out edge 2
    .Build();

The streaming output interleaves tokens from both agents. You can distinguish them via update.ExecutorId.
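If you would rather print each branch's complete answer than watch interleaved tokens, one option is to buffer updates per executor. A minimal sketch reusing the same event loop shown above (the `outputs` dictionary is illustrative, not part of the framework; `run` is the streaming run from the previous snippet):

```csharp
// Buffer streamed tokens per agent so each branch's full answer
// can be printed once the run completes.
var outputs = new Dictionary<string, StringBuilder>();

await foreach (var evt in run.WatchStreamAsync())
{
    if (evt is AgentResponseUpdateEvent update)
    {
        if (!outputs.TryGetValue(update.ExecutorId, out var sb))
            outputs[update.ExecutorId] = sb = new StringBuilder();
        sb.Append(update.Data);
    }
}

// Print each agent's response as one block instead of interleaved tokens.
foreach (var (agentId, text) in outputs)
    Console.WriteLine($"[{agentId}]\n{text}\n");
```

This trades live streaming for readability; for long responses you may still prefer the interleaved output with `ExecutorId` prefixes.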

Important Notes

  1. Workflows own their executors. Create fresh agents for each new workflow — do not reuse agent instances across workflow runs.
  2. TurnToken is required. Without sending a TurnToken, agents will cache messages indefinitely and never call the LLM.
  3. Streaming is the natural fit. Use RunStreamingAsync + WatchStreamAsync to see agent responses token-by-token.
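Note 1 is easy to enforce with a small factory method: construct fresh agents and a fresh workflow each time instead of caching either. A sketch under that assumption (the `BuildRecipeWorkflow` helper name and its `Workflow` return type are ours, not framework API):

```csharp
// Hypothetical helper: builds new agents and a new workflow per call,
// so no executor instance is ever shared between workflow runs.
static Workflow BuildRecipeWorkflow(IChatClient chatClient)
{
    var chef = chatClient.AsAIAgent(
        name: "Chef",
        instructions: "Create a very short recipe from the given ingredients.");
    var presenter = chatClient.AsAIAgent(
        name: "Presenter",
        instructions: "Rewrite the recipe as one appealing paragraph.");

    return new WorkflowBuilder(chef)
        .AddEdge(chef, presenter)
        .Build();
}

// Each run gets its own workflow and its own agents:
var workflowA = BuildRecipeWorkflow(chatClient);
var workflowB = BuildRecipeWorkflow(chatClient);
```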

Full Example

using Microsoft.Agents.AI;
using Microsoft.Agents.AI.Workflows;
using Microsoft.Extensions.AI;
using OpenAI;

namespace MicrosoftAgentFrameworkLesson.ConsoleApp.Agents;

/// <summary>
/// Demonstrates Agents in Workflows — Recipe Creation Pipeline.
/// Uses OpenAI ChatClientAgent instances directly as workflow executors.
/// </summary>
public static class RecipePipelineDemo
{
    public static async Task RunAsync()
    {
        // ── OpenAI Setup ─────────────────────────────────────
        var apiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY")
            ?? throw new InvalidOperationException(
                "Set the OPENAI_API_KEY environment variable before running.");

        IChatClient chatClient = new OpenAIClient(apiKey)
            .GetChatClient("gpt-4o-mini")
            .AsIChatClient();

        Console.WriteLine("====== Agents in Workflows — Recipe Pipeline ======\n");

        await Demo1_LinearPipelineAsync(chatClient);
        await Demo2_FanOutPipelineAsync(chatClient);
    }

    // ──────────────────────────────────────────────────────────
    // Demo 1 — Linear Agent Pipeline (Streaming)
    // Chef → Nutritionist → Presenter
    // ──────────────────────────────────────────────────────────
    private static async Task Demo1_LinearPipelineAsync(IChatClient chatClient)
    {
        Console.WriteLine("--- Demo 1: Linear Agent Pipeline (Streaming) ---");
        Console.WriteLine("Input: chicken, garlic, lemon, rosemary\n");

        var chef = chatClient.AsAIAgent(
            name: "Chef",
            instructions: "You are a professional chef. "
                + "Given a list of ingredients, create a very short recipe (2-3 sentences). "
                + "Only output the recipe text.");

        var nutritionist = chatClient.AsAIAgent(
            name: "Nutritionist",
            instructions: "You are a nutritionist. "
                + "Given a recipe, provide a brief nutritional summary: approximate calories "
                + "per serving and two key nutrients. Keep it to 2 sentences.");

        var presenter = chatClient.AsAIAgent(
            name: "Presenter",
            instructions: "You create appetizing food summaries. "
                + "Combine the recipe and nutritional info into one short, appealing paragraph "
                + "(2-3 sentences). Only output the final summary.");

        // Build a sequential workflow: Chef → Nutritionist → Presenter
        var workflow = new WorkflowBuilder(chef)
            .AddEdge(chef, nutritionist)
            .AddEdge(nutritionist, presenter)
            .Build();

        // Stream execution — agents emit AgentResponseUpdateEvent for each token
        await using var run = await InProcessExecution.RunStreamingAsync(
            workflow,
            new ChatMessage(ChatRole.User, "chicken, garlic, lemon, rosemary"));

        // TurnToken triggers actual LLM processing (agents cache messages until this)
        await run.TrySendMessageAsync(new TurnToken(emitEvents: true));

        string currentAgent = "";
        await foreach (var evt in run.WatchStreamAsync())
        {
            if (evt is AgentResponseUpdateEvent update)
            {
                if (update.ExecutorId != currentAgent)
                {
                    if (currentAgent != "") Console.WriteLine("\n");
                    currentAgent = update.ExecutorId;
                    Console.Write($"[{currentAgent}] ");
                }
                Console.Write(update.Data);
            }
        }
        Console.WriteLine("\n");
    }

    // ──────────────────────────────────────────────────────────
    // Demo 2 — Fan-Out Agent Pipeline (Streaming)
    // Chef → Nutritionist (parallel — same superstep)
    //      → Critic
    // ──────────────────────────────────────────────────────────
    private static async Task Demo2_FanOutPipelineAsync(IChatClient chatClient)
    {
        Console.WriteLine("--- Demo 2: Fan-Out Agent Pipeline (Streaming) ---");
        Console.WriteLine("Input: salmon, avocado, rice, soy sauce\n");

        var chef = chatClient.AsAIAgent(
            name: "Chef",
            instructions: "You are a chef. Given ingredients, create a short recipe "
                + "(2-3 sentences). Only output the recipe.");

        var nutritionist = chatClient.AsAIAgent(
            name: "Nutritionist",
            instructions: "Briefly analyze this recipe's nutritional value (1-2 sentences).");

        var critic = chatClient.AsAIAgent(
            name: "Critic",
            instructions: "As a food critic, give a brief rating and opinion (1-2 sentences).");

        // Fan-out: Chef output goes to both Nutritionist and Critic in parallel
        var workflow = new WorkflowBuilder(chef)
            .AddEdge(chef, nutritionist)
            .AddEdge(chef, critic)
            .Build();

        await using var run = await InProcessExecution.RunStreamingAsync(
            workflow,
            new ChatMessage(ChatRole.User, "salmon, avocado, rice, soy sauce"));

        await run.TrySendMessageAsync(new TurnToken(emitEvents: true));

        string currentAgent = "";
        await foreach (var evt in run.WatchStreamAsync())
        {
            if (evt is AgentResponseUpdateEvent update)
            {
                if (update.ExecutorId != currentAgent)
                {
                    if (currentAgent != "") Console.WriteLine("\n");
                    currentAgent = update.ExecutorId;
                    Console.Write($"[{currentAgent}] ");
                }
                Console.Write(update.Data);
            }
        }
        Console.WriteLine("\n");
    }
}

