Microsoft Agent Framework Agents Created: 18 Feb 2026 Updated: 18 Feb 2026

Observability with OpenTelemetry and .NET Aspire

Observability allows you to understand what your agent is doing at runtime — how long model calls take, how many tokens are consumed, which tools are invoked, and what log messages are produced. The Microsoft Agent Framework integrates with OpenTelemetry out of the box and emits traces, metrics, and logs that can be sent directly to the .NET Aspire Dashboard for real-time visualization.

What Does the Agent Framework Emit?

Traces (Spans)

  1. invoke_agent {agent_name} — top-level span, one per agent turn
  2. chat {model_name} — child span for every model call; includes token counts and (optionally) the full prompt and response text
  3. execute_tool {function_name} — child span for each tool invocation

Metrics

  1. gen_ai.client.operation.duration — histogram of model call durations in seconds
  2. gen_ai.client.token.usage — histogram of input and output token counts
  3. agent_framework.function.invocation.duration — histogram of tool execution durations

Logs

All ILogger output from the Agent Framework and your own application code is routed through the OpenTelemetry log provider and forwarded to Aspire via OTLP.

How the Aspire Integration Works

When you start the app through the AppHost project, .NET Aspire automatically injects the OTEL_EXPORTER_OTLP_ENDPOINT environment variable into every child process it manages. The OTLP exporter in your app reads this variable and sends all telemetry data to the Aspire Dashboard without any hard-coded URLs.

dotnet run --project MicrosoftAgentFrameworkLesson.AppHost

After starting, open http://localhost:18888 to view traces, metrics, and logs in the Aspire Dashboard.
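When you run the ConsoleApp directly instead of through the AppHost, nothing injects the endpoint variable and telemetry is silently dropped. The sketch below (plain .NET, no Agent Framework dependency) makes explicit the lookup that the no-argument OTLP exporter performs internally; the `http://localhost:4317` fallback is only an illustration of the conventional OTLP gRPC default, not something Aspire injects.

```csharp
using System;

class EndpointCheck
{
    static void Main()
    {
        // The OTLP exporter resolves its target from this variable; the
        // Aspire AppHost injects it into every child process it manages.
        var endpoint = Environment.GetEnvironmentVariable("OTEL_EXPORTER_OTLP_ENDPOINT")
                       ?? "http://localhost:4317"; // illustrative fallback only

        Console.WriteLine($"OTLP endpoint: {endpoint}");
    }
}
```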

Required NuGet Packages

<!-- Already in the project -->
<PackageReference Include="Microsoft.Agents.AI" Version="1.0.0-preview.260212.1" />
<PackageReference Include="Microsoft.Extensions.AI.OpenAI" Version="10.3.0" />

<!-- Added for Lesson 7 -->
<PackageReference Include="OpenTelemetry" Version="1.11.2" />
<PackageReference Include="OpenTelemetry.Exporter.OpenTelemetryProtocol" Version="1.11.2" />

The OpenTelemetry package provides the core SDK builders (Sdk.CreateTracerProviderBuilder, Sdk.CreateMeterProviderBuilder, and the AddOpenTelemetry logging extension). The OpenTelemetry.Exporter.OpenTelemetryProtocol package provides AddOtlpExporter() for all three signal types.

ObservabilitySetup.cs — OpenTelemetry Configuration

This class creates and owns the three OpenTelemetry providers (traces, metrics, logs) and builds the instrumented agent. It implements IDisposable so that all buffered telemetry is flushed before the process exits.

using Microsoft.Agents.AI;
using Microsoft.Extensions.AI;
using Microsoft.Extensions.Logging;
using OpenAI;
using OpenTelemetry;
using OpenTelemetry.Logs;
using OpenTelemetry.Metrics;
using OpenTelemetry.Resources;
using OpenTelemetry.Trace;

namespace MicrosoftAgentFrameworkLesson.ConsoleApp.Lesson7;

/// <summary>
/// Sets up OpenTelemetry providers (traces, metrics, logs) and creates an
/// instrumented agent. Uses the OTLP exporter so that all telemetry is
/// forwarded to the .NET Aspire Dashboard automatically when the app is
/// started via the AppHost (Aspire injects OTEL_EXPORTER_OTLP_ENDPOINT).
///
/// Implement IDisposable so that TracerProvider and MeterProvider are
/// flushed and shut down cleanly before the process exits.
/// </summary>
public sealed class ObservabilitySetup : IDisposable
{
    /// <summary>Activity source name used for OTel source registration and agent instrumentation.</summary>
    public const string SourceName = "MicrosoftAgentFrameworkLesson";

    private readonly TracerProvider _tracerProvider;
    private readonly MeterProvider _meterProvider;

    public ILoggerFactory LoggerFactory { get; }

    public ObservabilitySetup()
    {
        var resource = ResourceBuilder
            .CreateDefault()
            .AddService(serviceName: SourceName);

        // ── Traces ──────────────────────────────────────────────────────────
        // AddSource registers the activity sources we want to capture:
        //   • SourceName → spans we emit ourselves
        //   • *Microsoft.Extensions.AI → chat-client level spans (model calls)
        //   • *Microsoft.Extensions.Agents* → agent-level spans (invoke_agent, execute_tool)
        // AddOtlpExporter reads OTEL_EXPORTER_OTLP_ENDPOINT from the environment.
        _tracerProvider = Sdk.CreateTracerProviderBuilder()
            .SetResourceBuilder(resource)
            .AddSource(SourceName)
            .AddSource("*Microsoft.Extensions.AI")
            .AddSource("*Microsoft.Extensions.Agents*")
            .AddOtlpExporter()
            .Build();

        // ── Metrics ─────────────────────────────────────────────────────────
        // AddMeter("*Microsoft.Agents.AI") captures the Agent Framework meters:
        //   • gen_ai.client.operation.duration (model call duration)
        //   • gen_ai.client.token.usage (input/output token counts)
        _meterProvider = Sdk.CreateMeterProviderBuilder()
            .SetResourceBuilder(resource)
            .AddMeter(SourceName)
            .AddMeter("*Microsoft.Agents.AI")
            .AddOtlpExporter()
            .Build();

        // ── Logs ─────────────────────────────────────────────────────────────
        // AddOpenTelemetry routes .NET ILogger output through the OTel SDK
        // so that structured log records appear in the Aspire Dashboard.
        // AddOtlpExporter() is in the OpenTelemetry.Logs namespace.
        LoggerFactory = Microsoft.Extensions.Logging.LoggerFactory.Create(logging =>
        {
            logging
                .SetMinimumLevel(LogLevel.Debug)
                .AddOpenTelemetry(otel =>
                {
                    otel.SetResourceBuilder(resource);
                    otel.IncludeFormattedMessage = true;
                    otel.IncludeScopes = true;
                    otel.AddOtlpExporter(); // OpenTelemetry.Logs namespace
                });
        });
    }

    /// <summary>
    /// Creates an agent wrapped with OpenTelemetry instrumentation via
    /// AIAgentBuilder.UseOpenTelemetry.
    ///
    /// The OpenTelemetryAgent wrapper emits these spans:
    ///   • invoke_agent {name} — top-level span for the full agent turn
    ///   • chat {model} — child span for each model call
    ///
    /// With EnableSensitiveData = true the prompt and response text are
    /// recorded in span attributes. Only enable this in development.
    /// </summary>
    public AIAgent CreateAgent(string modelName = "gpt-4o")
    {
        var apiKey = Environment.GetEnvironmentVariable("OPEN_AI_KEY")
            ?? throw new InvalidOperationException(
                "Set the OPEN_AI_KEY environment variable before running this demo.");

        // Build the inner agent (ChatClientAgent backed by OpenAI)
        var innerAgent = new OpenAIClient(apiKey)
            .GetChatClient(modelName)
            .AsIChatClient()
            .AsAIAgent(new ChatClientAgentOptions
            {
                Name = "ObservabilityDemoAgent",
                Description = "An agent demonstrating OpenTelemetry observability."
            });

        // Wrap with OpenTelemetry instrumentation using the AIAgentBuilder pipeline
        return new AIAgentBuilder(innerAgent)
            .UseOpenTelemetry(
                sourceName: SourceName,
                configure: otelAgent => otelAgent.EnableSensitiveData = true)
            .Build();
    }

    public void Dispose()
    {
        // Flush remaining telemetry before the process exits
        _tracerProvider.Dispose();
        _meterProvider.Dispose();
        LoggerFactory.Dispose();
    }
}

Key points in ObservabilitySetup

  1. ResourceBuilder.CreateDefault().AddService(SourceName) — attaches a service name to every piece of telemetry so the Aspire Dashboard knows which service produced it.
  2. AddSource("*Microsoft.Extensions.AI") — the wildcard prefix * matches the experimental source name Experimental.Microsoft.Extensions.AI used by the chat client pipeline. Without this prefix, model-call spans would not be captured.
  3. AddOtlpExporter() (no arguments) — reads OTEL_EXPORTER_OTLP_ENDPOINT from the environment. Aspire injects this variable automatically; you never need to hard-code the URL.
  4. AIAgentBuilder.UseOpenTelemetry() — wraps the inner agent in an OpenTelemetryAgent that emits invoke_agent spans and augments child chat spans with agent-specific attributes.
  5. EnableSensitiveData = true — records the full prompt and response text in span attributes. Only use this in development; never in production.
  6. IDisposable — the using statement in Program.cs guarantees that buffered telemetry is flushed before the process exits, so no data is lost.

Demo1_BasicObservability.cs — Sending a Prompt and Capturing Telemetry

using Microsoft.Agents.AI;
using Microsoft.Extensions.Logging;

namespace MicrosoftAgentFrameworkLesson.ConsoleApp.Lesson7;

/// <summary>
/// DEMO 1 — Agent Observability with .NET Aspire
///
/// Sends a single prompt to an instrumented agent and demonstrates
/// that the following telemetry flows to the .NET Aspire Dashboard:
///
/// Traces
/// invoke_agent ObservabilityDemoAgent ← top-level agent span
/// chat gpt-4o ← model call span (child)
///
/// Metrics
/// gen_ai.client.operation.duration ← how long the model call took
/// gen_ai.client.token.usage ← input and output token counts
///
/// Logs
/// Debug/Information log records from Microsoft.Agents.AI and
/// Microsoft.Extensions.AI routed to the Aspire Dashboard via OTLP.
///
/// Prerequisites:
/// • Start the project via the AppHost (dotnet run --project AppHost).
/// Aspire injects OTEL_EXPORTER_OTLP_ENDPOINT automatically.
/// • Set the OPEN_AI_KEY environment variable (or user secrets).
/// </summary>
public static class Demo1_BasicObservability
{
    public static async Task RunAsync(ObservabilitySetup setup)
    {
        Console.WriteLine("╔═══════════════════════════════════════════════╗");
        Console.WriteLine("║ DEMO 1 — Agent Observability with Aspire ║");
        Console.WriteLine("╚═══════════════════════════════════════════════╝\n");

        // Check whether Aspire injected the OTLP endpoint
        var otlpEndpoint = Environment.GetEnvironmentVariable("OTEL_EXPORTER_OTLP_ENDPOINT");
        if (string.IsNullOrWhiteSpace(otlpEndpoint))
        {
            Console.WriteLine("[WARNING] OTEL_EXPORTER_OTLP_ENDPOINT is not set.");
            Console.WriteLine(" Start the app via the Aspire AppHost to see telemetry");
            Console.WriteLine(" in the Aspire Dashboard (http://localhost:18888).\n");
        }
        else
        {
            Console.WriteLine($"[OK] OTLP endpoint: {otlpEndpoint}");
            Console.WriteLine(" Open http://localhost:18888 in a browser to see telemetry.\n");
        }

        // Create an application logger so that our own log lines
        // are also sent to Aspire via the OpenTelemetry log provider
        var logger = setup.LoggerFactory.CreateLogger<Demo1_BasicObservability_Marker>();

        // Create the instrumented agent (see ObservabilitySetup.CreateAgent)
        AIAgent agent = setup.CreateAgent();

        // A session keeps conversation history across turns
        AgentSession session = await agent.CreateSessionAsync();

        const string prompt = "Explain what OpenTelemetry is in two sentences.";

        Console.WriteLine($"Prompt : {prompt}\n");
        logger.LogInformation("Sending prompt to agent: {Prompt}", prompt);

        // RunAsync emits:
        //   • "invoke_agent ObservabilityDemoAgent" trace span
        //   • "chat gpt-4o" child span with token usage attributes
        //   • gen_ai.client.operation.duration and gen_ai.client.token.usage metrics
        AgentResponse response = await agent.RunAsync(prompt, session);

        Console.WriteLine($"Response:\n{response.Text}\n");
        logger.LogInformation("Agent response received ({Tokens} output tokens)",
            response.Usage?.OutputTokenCount ?? 0);

        Console.WriteLine("\n✓ Demo 1 completed.");
        Console.WriteLine(" Check the Aspire Dashboard for:");
        Console.WriteLine(" • Traces → invoke_agent / chat spans");
        Console.WriteLine(" • Metrics → gen_ai.client.operation.duration, gen_ai.client.token.usage");
        Console.WriteLine(" • Logs → structured log entries from this demo");
    }
}

// Marker type used only as the category name for ILogger<T>
file sealed class Demo1_BasicObservability_Marker;

Key points in Demo1

  1. setup.LoggerFactory.CreateLogger<T>() — creates a logger whose output is routed through the OTel log provider set up in ObservabilitySetup, so every LogInformation call appears as a structured log record in the Aspire Dashboard.
  2. agent.RunAsync(prompt, session) — no extra options are needed; the OpenTelemetryAgent wrapper intercepts the call and emits spans and metrics automatically.
  3. response.Usage?.OutputTokenCount — the token usage is available on the response and also captured automatically in the gen_ai.client.token.usage metric.

Program.cs — Entry Point

using MicrosoftAgentFrameworkLesson.ConsoleApp.Lesson7;

// ObservabilitySetup owns the TracerProvider, MeterProvider and LoggerFactory.
// "using" ensures they are flushed and disposed before the process exits
// so that no telemetry data is lost.
using var setup = new ObservabilitySetup();

await Demo1_BasicObservability.RunAsync(setup);

The using var declaration is critical. When the program finishes, the providers are disposed in order, which triggers a final flush of any buffered spans and metrics to the OTLP endpoint.
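The flush-on-dispose behavior can be illustrated with plain .NET types, independent of the Agent Framework. The `Provider` class below is a stand-in for the real OpenTelemetry providers; it "flushes" by printing on `Dispose`.

```csharp
using System;

// Stand-in for TracerProvider / MeterProvider / LoggerFactory: a disposable
// that "flushes" (prints) when disposed.
sealed class Provider : IDisposable
{
    private readonly string _name;
    public Provider(string name) => _name = name;
    public void Dispose() => Console.WriteLine($"Flushed {_name}");
}

class Program
{
    static void Main()
    {
        // "using var" guarantees Dispose runs when the scope exits,
        // in reverse declaration order.
        using var tracer = new Provider("TracerProvider");
        using var meter = new Provider("MeterProvider");
        using var logs = new Provider("LoggerFactory");

        Console.WriteLine("Demo finished");
        // On scope exit the providers flush in reverse order:
        //   Flushed LoggerFactory
        //   Flushed MeterProvider
        //   Flushed TracerProvider
    }
}
```

If you forget the `using`, the process can exit with spans still sitting in the exporter's batch buffer, which is why the last trace of a short-lived console app often "disappears".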

What to See in the Aspire Dashboard

Traces tab

After running the demo, the Traces tab shows one trace with two spans:

  1. invoke_agent ObservabilityDemoAgent — duration of the entire agent turn
  2. chat gpt-4o (child) — duration of the model API call, plus attributes:
     • gen_ai.usage.input_tokens
     • gen_ai.usage.output_tokens
     • gen_ai.request.model
     • the full prompt and response text (because EnableSensitiveData = true)

Metrics tab

  1. gen_ai.client.operation.duration — histogram showing how long the model call took
  2. gen_ai.client.token.usage — histogram showing token counts (broken down by input/output)

Logs tab

Structured log records from the demo appear here, including the Sending prompt to agent and Agent response received messages logged via ILogger.

OpenTelemetry Source Names to Register

  1. AddSource("*Microsoft.Extensions.AI") — chat-client pipeline spans (chat {model})
  2. AddSource("*Microsoft.Extensions.Agents*") — agent pipeline spans (invoke_agent, execute_tool)
  3. AddMeter("*Microsoft.Agents.AI") — Agent Framework metrics (duration, token usage)

The * prefix is required because the SDK emits these spans under an experimental source name prefixed with Experimental.. The wildcard ensures both the stable and experimental source names are matched.
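The matching behavior can be sketched in isolation. OpenTelemetry's wildcard matcher is internal to the SDK; the regex translation below is only an illustration that mirrors the observable behavior of `AddSource("*Microsoft.Extensions.AI")`, where `*` matches any prefix.

```csharp
using System;
using System.Text.RegularExpressions;

// Illustrative sketch: translate a wildcard source pattern to an anchored
// regex and test it against stable and experimental ActivitySource names.
class WildcardDemo
{
    static bool Matches(string pattern, string sourceName) =>
        Regex.IsMatch(sourceName, "^" + Regex.Escape(pattern).Replace(@"\*", ".*") + "$");

    static void Main()
    {
        Console.WriteLine(Matches("*Microsoft.Extensions.AI", "Experimental.Microsoft.Extensions.AI")); // True
        Console.WriteLine(Matches("*Microsoft.Extensions.AI", "Microsoft.Extensions.AI"));              // True
        Console.WriteLine(Matches("*Microsoft.Extensions.AI", "Microsoft.Extensions.AI.Evaluation"));   // False
    }
}
```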

Best Practices

  1. Never enable sensitive data in production. Set otelAgent.EnableSensitiveData = true only in development or test environments. Sensitive data includes the full prompt and response text.
  2. Always dispose providers before exit. Use using var setup = new ObservabilitySetup(); in Program.cs to guarantee the final OTLP flush.
  3. Use AddOtlpExporter() without arguments. This reads OTEL_EXPORTER_OTLP_ENDPOINT from the environment, which Aspire injects automatically. Never hard-code the URL.
  4. Start from the AppHost, not from the ConsoleApp directly. Only the AppHost injects the OTLP endpoint environment variable. Running the ConsoleApp directly will produce output to the console but no telemetry will reach the Aspire Dashboard.
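For best practice #1, a safer pattern than a hard-coded `true` is to gate the flag on the environment. The sketch below is a hypothetical guard, not part of the lesson's code: `DOTNET_ENVIRONMENT` is the conventional .NET variable name, and the resulting flag would be passed to `UseOpenTelemetry`'s configure callback in place of the literal.

```csharp
using System;

// Hypothetical guard: derive EnableSensitiveData from the environment
// instead of hard-coding true. Adapt to your own configuration source.
class SensitiveDataGuard
{
    static void Main()
    {
        var env = Environment.GetEnvironmentVariable("DOTNET_ENVIRONMENT") ?? "Production";
        bool enableSensitiveData =
            env.Equals("Development", StringComparison.OrdinalIgnoreCase);

        // Pass enableSensitiveData to the UseOpenTelemetry configure callback.
        Console.WriteLine($"EnableSensitiveData = {enableSensitiveData}");
    }
}
```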