Response Types in the Microsoft Agent Framework
When you call an agent in the Microsoft Agent Framework, the response carries all content the agent produced — not just the final answer. This includes text results, function tool calls, function results, usage data, reasoning text, and more. Understanding how to navigate these response types is essential to building robust agent-powered applications.
Why Does This Matter?
Consider a scenario where the agent calls a weather API tool before answering your question. The response will contain:
- A FunctionCallContent — the agent’s request to call the tool
- A FunctionResultContent — the result from that tool call
- A TextContent — the actual answer the user sees
- A UsageContent — token consumption data
If you treat the entire response as the “answer,” you’ll get noise. The framework gives you a convenient .Text property that aggregates only the TextContent items, but you can also inspect every content item individually.
Non-Streaming: AgentResponse
RunAsync waits for the model to finish generating and returns a single AgentResponse object. This is the simplest approach.
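A minimal sketch of a non-streaming call (assuming an already-configured agent instance; the variable name and prompt are illustrative, and type names follow this article):

```csharp
// "agent" is assumed to be an already-created agent from the
// Microsoft Agent Framework (e.g. built from a chat client).
var response = await agent.RunAsync("What's the weather in Amsterdam?");

// The aggregated text answer; tool calls and metadata are filtered out.
Console.WriteLine(response.Text);
```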
Getting the Text Result
Under the hood, .Text iterates over every ChatMessage in response.Messages, collects all TextContent items, and concatenates their text. You can do the same manually:
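A sketch of that manual aggregation, assuming the response shape described above (Messages, each with a Contents collection of typed items):

```csharp
using System.Text;

// Roughly what the .Text property does for you: walk every message,
// keep only TextContent items, and concatenate their text.
var sb = new StringBuilder();
foreach (var message in response.Messages)
{
    foreach (var content in message.Contents)
    {
        if (content is TextContent text)
        {
            sb.Append(text.Text);
        }
    }
}
Console.WriteLine(sb.ToString());
```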
Inspecting Messages and Content Items
response.Messages gives you access to every ChatMessage produced during the agent’s run. Each message has a Role (e.g., assistant) and a Contents collection of typed items:
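A sketch of inspecting each content item with pattern matching (the Name and CallId members shown on the function-call types follow the Microsoft.Extensions.AI content types the framework builds on; verify against your installed version):

```csharp
foreach (var message in response.Messages)
{
    Console.WriteLine($"Role: {message.Role}");

    foreach (var content in message.Contents)
    {
        switch (content)
        {
            case TextContent text:
                Console.WriteLine($"  Text: {text.Text}");
                break;
            case FunctionCallContent call:
                Console.WriteLine($"  Tool call: {call.Name} (id: {call.CallId})");
                break;
            case FunctionResultContent result:
                Console.WriteLine($"  Tool result for {result.CallId}: {result.Result}");
                break;
            case UsageContent usage:
                Console.WriteLine($"  Usage: {usage.Details.TotalTokenCount} tokens");
                break;
        }
    }
}
```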
Metadata and Token Usage
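Token usage arrives as a UsageContent item inside the response. A sketch of extracting it (the Details property with InputTokenCount/OutputTokenCount follows the Microsoft.Extensions.AI UsageContent type; check your version for the exact members):

```csharp
using System.Linq;

// Find the first UsageContent item anywhere in the response.
var usage = response.Messages
    .SelectMany(m => m.Contents)
    .OfType<UsageContent>()
    .FirstOrDefault();

if (usage is not null)
{
    Console.WriteLine($"Input tokens:  {usage.Details.InputTokenCount}");
    Console.WriteLine($"Output tokens: {usage.Details.OutputTokenCount}");
}
```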
Streaming: AgentResponseUpdate
RunStreamingAsync returns an IAsyncEnumerable<AgentResponseUpdate>. Each update is a small chunk that may contain a text fragment, a function call, usage data, or other content. Text arrives token-by-token, giving the user a real-time experience.
Simple Streaming
Most chunks carry a small piece of text. Some chunks may have an empty Text — they contain only metadata or non-text content items.
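The basic streaming loop can be sketched as follows (assuming the same pre-configured agent as before):

```csharp
await foreach (var update in agent.RunStreamingAsync("Tell me a long story."))
{
    // Most updates carry a text fragment. Metadata-only updates have an
    // empty Text, so Console.Write simply prints nothing for them.
    Console.Write(update.Text);
}
Console.WriteLine();
```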
Detailed Chunk Inspection
Just like the non-streaming case, each AgentResponseUpdate has a Contents collection where you can pattern-match individual items:
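A sketch of per-chunk inspection, mirroring the non-streaming example (member names on the content types follow the Microsoft.Extensions.AI types and may differ slightly in your version):

```csharp
await foreach (var update in agent.RunStreamingAsync("What's the weather in Amsterdam?"))
{
    foreach (var content in update.Contents)
    {
        switch (content)
        {
            case TextContent text:
                Console.Write(text.Text);            // stream the answer as it arrives
                break;
            case FunctionCallContent call:
                Console.WriteLine($"\n[tool call: {call.Name}]");
                break;
            case UsageContent usage:
                Console.WriteLine($"\n[usage: {usage.Details.TotalTokenCount} tokens]");
                break;
        }
    }
}
```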
Measuring Performance: Time-to-First-Token
One key advantage of streaming is lower perceived latency. While the total generation time is similar, the user sees the first token almost immediately. Here’s how to measure both:
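A sketch of the measurement using a Stopwatch, recording the elapsed time when the first non-empty text fragment arrives versus the total time for the full stream:

```csharp
using System.Diagnostics;

var stopwatch = Stopwatch.StartNew();
TimeSpan? timeToFirstToken = null;

await foreach (var update in agent.RunStreamingAsync("Tell me a long story."))
{
    // Capture the moment the first visible text arrives.
    if (timeToFirstToken is null && !string.IsNullOrEmpty(update.Text))
    {
        timeToFirstToken = stopwatch.Elapsed;
    }
    Console.Write(update.Text);
}

stopwatch.Stop();
Console.WriteLine($"\nTime to first token: {timeToFirstToken?.TotalMilliseconds:F0} ms");
Console.WriteLine($"Total time:          {stopwatch.Elapsed.TotalMilliseconds:F0} ms");
```

With RunAsync, the equivalent "first output" time is simply the total time, since nothing is visible until generation completes.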
Content Types at a Glance
| Type | What It Represents | Included in .Text? |
| --- | --- | --- |
| TextContent | The actual text answer from the model | Yes |
| FunctionCallContent | The agent requesting a tool/function call | No |
| FunctionResultContent | The result returned from a tool/function call | No |
| UsageContent | Token usage information (input/output counts) | No |
Non-Streaming vs Streaming: When to Use Which?
| Aspect | Non-Streaming (RunAsync) | Streaming (RunStreamingAsync) |
| --- | --- | --- |
| Return type | AgentResponse | IAsyncEnumerable<AgentResponseUpdate> |
| First output | After full generation completes | As soon as the first token arrives |
| Best for | Processing, storing, or chaining results | Real-time UI, chat interfaces, long responses |
| Text extraction | response.Text | Accumulate update.Text in a loop |
| Content inspection | message.Contents per message | update.Contents per chunk |
| Perceived latency | Higher (waits for complete response) | Lower (first token arrives quickly) |
Key Takeaways
- .Text is your shortcut — both AgentResponse and AgentResponseUpdate have a .Text property that extracts only the text result, filtering out tool calls, usage data, and other non-result content.
- Not all content is the answer — responses may include FunctionCallContent, FunctionResultContent, UsageContent, and more. Use pattern matching on .Contents to inspect them.
- Both approaches produce the same result — the only difference is how the response is delivered. Non-streaming returns everything at once; streaming delivers it chunk by chunk.
- Streaming reduces perceived latency — the user sees text appearing immediately instead of waiting for the full generation. This is especially valuable for long responses and real-time chat interfaces.