Semantic Kernel Prompt | Created: 23 Jan 2026 | Updated: 23 Jan 2026

Leveraging Liquid Templates in Semantic Kernel

In basic AI implementations, a prompt is just a static block of text. However, real-world applications—like e-commerce support—demand high levels of personalization. You wouldn't treat a first-time guest the same way you treat a "Gold" VIP member.

To bridge this gap, Semantic Kernel offers the Liquid Prompt Template Factory. It allows you to embed complex logic—loops, conditionals, and object property access—directly into your prompts.

The Power of the Liquid Blueprint

Traditional templates use simple double-brace replacement (e.g., {{$name}}). Liquid templates, however, act more like code. They can evaluate the state of your data (like a user's membership tier) and change the instructions sent to the AI accordingly.
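As a minimal illustration (not tied to the TechMart example later in this lesson), compare a plain substitution with a Liquid conditional that branches on the same data:

```liquid
Hello {{ user.FirstName }}!
{% if user.Membership == 'Gold' %}
Thank you for being one of our VIP members.
{% else %}
Welcome to our store.
{% endif %}
```

The first line is simple value substitution; the `{% if %}` block changes the actual wording of the prompt depending on the value passed in.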

Key Advantages:

  1. Dynamic Personas: Change the AI's tone and instructions based on user data.
  2. List Handling: Seamlessly render order histories or product lists without manual string manipulation in C#.
  3. Real-time Context: Use built-in filters for formatting dates and numbers.
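To illustrate the third point, Liquid ships with filters that can be chained onto any value with the `|` syntax. A quick sketch using standard Liquid filters (`date`, `round`, `upcase`) on placeholder variables:

```liquid
Current date: {{ "now" | date: "%Y-%m-%d" }}
Order total: ${{ order.Price | round: 2 }}
Item name: {{ order.Item | upcase }}
```

No C# formatting code is needed; the template formats the values itself at render time.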

Practical Case Study: TechMart E-Commerce Support

Let’s examine a robust implementation of a support bot. In this scenario, the AI assistant needs to know the customer's spend history, membership status, and recent orders to provide a "tier-appropriate" response.

// See https://aka.ms/new-console-template for more information

using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;
using Microsoft.SemanticKernel.PromptTemplates.Liquid;

var apiKey = Environment.GetEnvironmentVariable("OPEN_AI_KEY");

if (string.IsNullOrEmpty(apiKey))
{
    Console.WriteLine("Please set the OPEN_AI_KEY environment variable.");
    return;
}

var kernel = Kernel.CreateBuilder()
    .AddOpenAIChatCompletion(
        "gpt-4o",
        apiKey)
    .Build();

// Fix console encoding for emojis
Console.OutputEncoding = System.Text.Encoding.UTF8;

Console.WriteLine("=== 🛍️ TechMart E-Commerce Support Chat ===\n");
Console.WriteLine("Welcome to TechMart Customer Support!");
Console.WriteLine("I'm your AI assistant, ready to help with your orders and questions.\n");

// 1. Customer Profile Setup
Console.Write("Please enter your first name: ");
var firstName = Console.ReadLine() ?? "Guest";

Console.Write("Select your membership tier (Gold/Silver/Standard): ");
var membershipInput = Console.ReadLine() ?? "Standard";

// Normalize membership to proper case
var membership = membershipInput.ToLower() switch
{
    "gold" => "Gold",
    "silver" => "Silver",
    _ => "Standard"
};

Console.WriteLine($"\n✨ Welcome {firstName}! Your membership tier: {membership}");
Console.WriteLine("Type 'exit' to end the conversation.\n");
Console.WriteLine(new string('=', 80) + "\n");

// 2. Define customer data as Dictionary (Liquid works better with dictionaries)
var totalSpent = membership == "Gold" ? 1250.50 : membership == "Silver" ? 450.00 : 125.00;

var customer = new Dictionary<string, object>
{
    ["FirstName"] = firstName,
    ["LastName"] = "Customer",
    ["Email"] = $"{firstName.ToLower()}@email.com",
    ["Membership"] = membership,
    ["IsAccountActive"] = true,
    ["TotalSpent"] = totalSpent,
    ["JoinedDate"] = "2023-05-15"
};

var orders = new List<Dictionary<string, object>>
{
    new() { ["Id"] = "ORD-101", ["Item"] = "Wireless Headphones", ["Status"] = "Delivered", ["Price"] = 120.00, ["DeliveryDate"] = "2026-01-15" },
    new() { ["Id"] = "ORD-205", ["Item"] = "Mechanical Keyboard", ["Status"] = "In Transit", ["Price"] = 85.50, ["DeliveryDate"] = "2026-01-25" },
    new() { ["Id"] = "ORD-309", ["Item"] = "USB-C Cable", ["Status"] = "Cancelled", ["Price"] = 15.00, ["DeliveryDate"] = "N/A" }
};

// 3. Define the Liquid System Prompt Template
var systemPromptTemplate = """
You are a professional customer support assistant for TechMart E-Commerce.
Current Date: {{ "now" | date: "%Y-%m-%d %H:%M" }}
Support Hours: {{ support_hours }}

# Customer Profile
- Name: {{ user.FirstName }} {{ user.LastName }}
- Email: {{ user.Email }}
- Membership Tier: {{ user.Membership }}
- Account Status: {% if user.IsAccountActive %}Active{% else %}Inactive{% endif %}
- Total Spent: ${{ user.TotalSpent }}
- Member Since: {{ user.JoinedDate }}

# Tier Instructions
{% if user.Membership == 'Gold' %}
🌟 VIP GOLD MEMBER - Priority: HIGHEST
- Offer 10% discount code: GOLD10
- Free express shipping on all orders
- 24/7 priority support available
- Tone: Extremely polite and grateful for loyalty
{% elsif user.Membership == 'Silver' %}
✨ VALUED SILVER MEMBER - Priority: HIGH
- Offer 5% shipping discount: SILVER5
- Free shipping on orders over $50
- Priority email support (2 hour reply)
- Tone: Warm and appreciative
{% else %}
📦 STANDARD MEMBER - Priority: NORMAL
- Welcome discount available
- Standard shipping rates apply
- Tone: Friendly and helpful
{% endif %}

# Order History ({{ order_history.size }} orders)
{% for order in order_history %}
- Order {{ order.Id }}: {{ order.Item }} - {{ order.Status }} (${{ order.Price }})
{% endfor %}

# Special Alerts
{% if user.TotalSpent > 1000 %}
🎁 Customer spent over $1,000 - Offer loyalty rewards!
{% endif %}
{% if order_history.size > 2 %}
💎 Frequent buyer - Show extra appreciation!
{% endif %}

Instructions: Be concise, professional, and tier-appropriate. Help with orders, shipping, returns, and product questions.
""";

// 4. Configure Liquid Prompt
var promptConfig = new PromptTemplateConfig
{
    Name = "SupportSystemPrompt",
    Template = systemPromptTemplate,
    TemplateFormat = "liquid",
    Description = "Dynamic system prompt with customer context",
    InputVariables =
    [
        new() { Name = "user", AllowDangerouslySetContent = true },
        new() { Name = "order_history", AllowDangerouslySetContent = true },
        new() { Name = "support_hours" }
    ]
};

// 5. Initialize Liquid Factory
var liquidFactory = new LiquidPromptTemplateFactory();
var template = liquidFactory.Create(promptConfig);

// 6. Prepare kernel arguments
var kernelArguments = new KernelArguments
{
    { "user", customer },
    { "order_history", orders },
    { "support_hours", "9 AM - 5 PM EST" }
};

// 7. Render system prompt
var systemPrompt = await template.RenderAsync(kernel, kernelArguments);

// Display the rendered system prompt
Console.WriteLine("=== 📋 RENDERED SYSTEM PROMPT (Sent to AI) ===\n");
Console.WriteLine(systemPrompt);
Console.WriteLine("\n" + new string('=', 80) + "\n");

// 8. Initialize Chat History with rendered system prompt
var chat = kernel.GetRequiredService<IChatCompletionService>();
var history = new ChatHistory();
history.AddSystemMessage(systemPrompt);

// Add initial greeting
history.AddAssistantMessage($"Hello {firstName}! 👋 I'm your TechMart support assistant. How can I help you today?");
Console.WriteLine($"Assistant: Hello {firstName}! 👋 I'm your TechMart support assistant. How can I help you today?\n");

// 9. Configure execution settings
var executionSettings = new OpenAIPromptExecutionSettings
{
    MaxTokens = 500,
    Temperature = 0.7
};

// 10. Main conversation loop
while (true)
{
    Console.Write($"{firstName} >>> ");
    var userInput = Console.ReadLine();

    if (string.IsNullOrWhiteSpace(userInput))
    {
        continue;
    }

    if (userInput.ToLower() == "exit")
    {
        Console.WriteLine("\nAssistant: Thank you for contacting TechMart! Have a great day! 👋");
        break;
    }

    // Add user message to history
    history.AddUserMessage(userInput);

    // Get AI response with streaming
    Console.Write("Assistant: ");
    string fullMessage = string.Empty;
    await foreach (var token in chat.GetStreamingChatMessageContentsAsync(history, executionSettings, kernel))
    {
        Console.Write(token.Content);
        fullMessage += token.Content;
    }
    Console.WriteLine("\n");

    // Add assistant response to history
    history.AddAssistantMessage(fullMessage);
}

// 11. Display conversation summary
Console.WriteLine("\n" + new string('=', 80));
Console.WriteLine("=== 📊 Conversation Summary ===\n");
Console.WriteLine($"Total messages: {history.Count}");
Console.WriteLine($"Customer: {customer["FirstName"]} ({customer["Membership"]} Member)");
Console.WriteLine($"Total spent: ${customer["TotalSpent"]}");
Console.WriteLine("\nThank you for using TechMart Support! 🛍️");

Breaking Down the Workflow

The transition from data to a smart AI response involves three critical stages:

1. Contextual Logic (if/elsif)

In the code above, the instructions sent to the AI change based on the user.Membership property. Instead of passing "Be polite" as a separate variable, the template decides the tone for itself from the customer data.

2. Collection Processing (for-loop)

The {% for order in order_history %} block renders a structured list of every order directly into the prompt. This is far cleaner and easier to maintain than concatenating strings in a loop inside your C# service.
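With the sample orders defined earlier, the loop expands into one plain-text line per order inside the system prompt, roughly as follows (exact number formatting depends on the template engine's defaults):

```text
- Order ORD-101: Wireless Headphones - Delivered ($120)
- Order ORD-205: Mechanical Keyboard - In Transit ($85.5)
- Order ORD-309: USB-C Cable - Cancelled ($15)
```

The model never sees your C# objects, only this rendered text, so the template controls exactly how the order history is presented.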

3. The Factory Role

The LiquidPromptTemplateFactory is the glue. It implements IPromptTemplateFactory and transforms your template plus the dictionary data into the final prompt text, ensuring that properties like user.TotalSpent are resolved and rendered correctly before the prompt is sent to models like GPT-4o.
