ASP.NET Core Security: Rate Limiting
Created: 24 Jan 2026 | Updated: 24 Jan 2026

Concurrency Rate Limiting Strategy

Table of Contents

  1. Introduction
  2. How It Works
  3. Implementation
  4. Resource Protection
  5. Queue Management
  6. Real-World Scenarios
  7. When to Use
  8. Complete Example

Introduction

Concurrency Limiter is fundamentally different from other rate limiting strategies. Instead of limiting requests per time window, it limits simultaneous operations. Think of it as a semaphore for your API endpoints.

Key Difference:

Rate Limiter (Fixed Window: 60/min):
00:00 → Accept 60 requests instantly
00:01 → All 60 finish (if fast)
00:01-01:00 → Reject ALL requests
01:00 → Accept next 60

Concurrency Limiter (5 concurrent):
00:00 → Accept first 5 requests
00:00-∞ → Always processing 5 requests
As each completes → Next one starts
Continuous throughput ✓

Why Concurrency Limiter?

  1. Protects bounded resources - Database connections, memory
  2. Prevents exhaustion - No resource starvation
  3. Continuous flow - Always processing at capacity
  4. Simple concept - Easy to understand and configure
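The semaphore analogy can be sketched with plain SemaphoreSlim before looking at the framework API. This is an illustrative console sketch (the class name, delays, and request count are made up for the example), not the ASP.NET Core middleware itself:

```csharp
using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

public static class SemaphoreSketch
{
    // 5 "slots": at most 5 operations run at once, regardless of arrival rate.
    private static readonly SemaphoreSlim Slots = new(5, 5);

    public static async Task<string> HandleRequestAsync(int id)
    {
        await Slots.WaitAsync();       // wait until a slot frees up
        try
        {
            await Task.Delay(100);     // simulate work
            return $"request {id} done";
        }
        finally
        {
            Slots.Release();           // completing frees the slot for the next waiter
        }
    }

    public static async Task Main()
    {
        // 20 requests arrive at once; they are processed 5 at a time.
        var results = await Task.WhenAll(Enumerable.Range(1, 20).Select(HandleRequestAsync));
        Console.WriteLine(results.Length);
    }
}
```

Unlike a time-window limiter, nothing here resets on a clock: a waiter proceeds the moment a slot frees up.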

How It Works

The Semaphore Model

Concurrency Limit: 5
Queue Limit: 10

┌─────────────────────────────────┐
│ Active Processing (5 slots)     │
│ ┌───┐ ┌───┐ ┌───┐ ┌───┐ ┌───┐   │
│ │ 1 │ │ 2 │ │ 3 │ │ 4 │ │ 5 │   │
│ └───┘ └───┘ └───┘ └───┘ └───┘   │
└─────────────────────────────────┘
        ↓ Request completes
┌─────────────────────────────────┐
│ Queue (10 slots)                │
│ ┌───┐ ┌───┐ ┌───┐ ┌───┐         │
│ │ 6 │ │ 7 │ │ 8 │ │ 9 │ ...     │
│ └───┘ └───┘ └───┘ └───┘         │
└─────────────────────────────────┘
        ↓ Queue full
    Request 16+ → REJECT ❌

State Machine

Request arrives
  Is active_count < limit?
  ├─ YES → Process immediately ✓
  └─ NO  → Is queue.Count < queue_limit?
           ├─ YES → Add to queue 🟡
           └─ NO  → Reject (429) ❌

When a request completes:
  Dequeue the next waiting request
  Process the dequeued request ✓

Core Algorithm

Concurrency State:
  limit:        maximum concurrent operations
  active_count: currently processing
  queue:        waiting requests
  queue_limit:  maximum queue size

For each request:
  lock (state):
    if active_count < limit:
      active_count++
      return PROCESS_NOW ✓
    else if queue.Count < queue_limit:
      queue.Enqueue(request)
      return QUEUED 🟡
    else:
      return REJECT ❌

On request completion:
  lock (state):
    active_count--
    if queue.Count > 0:
      next_request = queue.Dequeue()
      active_count++
      process(next_request) ✓
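To make the pseudocode concrete, here is a near line-for-line C# translation. It is an illustrative sketch — SimpleConcurrencyLimiter and its members are invented names, not the framework's ConcurrencyLimiter:

```csharp
using System;
using System.Collections.Generic;

public enum AcquireResult { ProcessNow, Queued, Rejected }

// Illustrative translation of the pseudocode above; not the framework class.
public class SimpleConcurrencyLimiter
{
    private readonly object _lock = new();
    private readonly int _limit;
    private readonly int _queueLimit;
    private int _activeCount;
    private readonly Queue<Action> _queue = new();

    public SimpleConcurrencyLimiter(int limit, int queueLimit)
    {
        _limit = limit;
        _queueLimit = queueLimit;
    }

    // onDequeued is invoked later if the request is queued and then promoted.
    public AcquireResult TryAcquire(Action onDequeued)
    {
        lock (_lock)
        {
            if (_activeCount < _limit)
            {
                _activeCount++;
                return AcquireResult.ProcessNow;   // free slot: run immediately
            }
            if (_queue.Count < _queueLimit)
            {
                _queue.Enqueue(onDequeued);        // wait for a slot
                return AcquireResult.Queued;
            }
            return AcquireResult.Rejected;         // both full: reply 429
        }
    }

    public void Release()
    {
        Action? next = null;
        lock (_lock)
        {
            _activeCount--;
            if (_queue.Count > 0)
            {
                next = _queue.Dequeue();           // promote the oldest waiter
                _activeCount++;
            }
        }
        next?.Invoke();                            // run outside the lock
    }
}
```

The promoted request runs outside the lock so that slow handlers cannot block other acquisitions or releases.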

Implementation

Complete C# Implementation

using Microsoft.AspNetCore.RateLimiting;
using System.Threading.RateLimiting;

namespace SecurityApp.API.RateLimiting;

/// <summary>
/// Concurrency Rate Limiter
/// Limits simultaneous operations, not the rate of requests.
/// Perfect for resource-intensive endpoints.
/// </summary>
public static class ConcurrencyRateLimiter
{
    public const string PolicyName = "Concurrency";

    /// <summary>
    /// Configuration:
    /// - PermitLimit: 5 concurrent requests maximum
    /// - QueueLimit: 10 requests can wait
    ///
    /// Behavior:
    /// Requests 1-5: Process immediately
    /// Requests 6-15: Wait in queue
    /// Request 16+: Rejected (429)
    ///
    /// When a request completes:
    /// The next queued request starts processing
    ///
    /// Use Cases:
    /// - Database-heavy operations
    /// - File uploads/downloads
    /// - Long-running reports
    /// - External API calls
    /// </summary>
    public static void Configure(RateLimiterOptions rateLimiterOptions)
    {
        rateLimiterOptions.AddConcurrencyLimiter(PolicyName, options =>
        {
            options.PermitLimit = 5;
            options.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
            options.QueueLimit = 10;
        });
    }
}

Registration

// Program.cs
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddRateLimiter(options =>
{
    // Register the concurrency limiter policy
    ConcurrencyRateLimiter.Configure(options);

    // Rejection handler
    options.OnRejected = async (context, cancellationToken) =>
    {
        context.HttpContext.Response.StatusCode = 429;
        await context.HttpContext.Response.WriteAsJsonAsync(new
        {
            error = "concurrency_limit_exceeded",
            message = "Maximum concurrent requests exceeded. Queue is full.",
            active_requests = 5,
            queue_full = true,
            strategy = "concurrency",
            hint = "Retry after a short delay"
        }, cancellationToken);
    };
});

var app = builder.Build();
app.UseRateLimiter();

// Apply to resource-intensive endpoints
app.MapGet("/api/reports/heavy", async () =>
{
    // Simulate a heavy database query
    await Task.Delay(2000); // 2 seconds
    return Results.Ok(new { report = "data" });
})
.RequireRateLimiting(ConcurrencyRateLimiter.PolicyName);

app.Run();

Multiple Concurrency Pools

builder.Services.AddRateLimiter(options =>
{
    // Database queries: protect the connection pool
    options.AddConcurrencyLimiter("database", opt =>
    {
        opt.PermitLimit = 10; // 10 concurrent DB operations
        opt.QueueLimit = 50;  // Queue up to 50
    });

    // File uploads: protect disk I/O
    options.AddConcurrencyLimiter("upload", opt =>
    {
        opt.PermitLimit = 3;  // 3 concurrent uploads
        opt.QueueLimit = 10;  // Small queue
    });

    // External API: protect the circuit breaker
    options.AddConcurrencyLimiter("external-api", opt =>
    {
        opt.PermitLimit = 5;  // 5 concurrent external calls
        opt.QueueLimit = 20;
    });

    // Report generation: protect CPU/memory
    options.AddConcurrencyLimiter("reports", opt =>
    {
        opt.PermitLimit = 2;  // Only 2 concurrent reports
        opt.QueueLimit = 5;   // Small queue (reports are slow)
    });
});

Resource Protection

Database Connection Pool

Problem: Connection Pool Exhaustion

// Scenario: the database allows 100 connections
// A heavy query takes 5 seconds

// WITHOUT a concurrency limiter:
app.MapGet("/api/reports", async (AppDbContext db) =>
{
    // 100 simultaneous requests = 100 connections
    var data = await db.Reports
        .Include(r => r.Details)
        .Include(r => r.Analytics)
        .ToListAsync(); // Heavy query
    return Results.Ok(data);
});

// Problem: 200 requests → pool exhausted → crash! 💥

// WITH a concurrency limiter:
app.MapGet("/api/reports", async (AppDbContext db) =>
{
    var data = await db.Reports
        .Include(r => r.Details)
        .Include(r => r.Analytics)
        .ToListAsync();
    return Results.Ok(data);
})
.RequireRateLimiting("database");

// Solution:
// - Max 10 concurrent queries
// - Others wait in queue
// - Connection pool stays healthy ✓

Memory Protection

// Scenario: report generation uses 500 MB per request

options.AddConcurrencyLimiter("memory-intensive", opt =>
{
    opt.PermitLimit = 4; // 4 × 500 MB = 2 GB max
    opt.QueueLimit = 10;
});

app.MapPost("/api/reports/generate", async (ReportRequest request) =>
{
    // Generate a large report (~500 MB of memory)
    var report = await GenerateLargeReport(request);
    return Results.Ok(report);
})
.RequireRateLimiting("memory-intensive");

// Benefit:
// - Prevents OutOfMemoryException
// - Server stays stable under load
// - Graceful degradation

CPU Protection

// Scenario: video transcoding can saturate every core

options.AddConcurrencyLimiter("cpu-intensive", opt =>
{
    // Leave one core free; guard against single-core machines
    opt.PermitLimit = Math.Max(1, Environment.ProcessorCount - 1);
    opt.QueueLimit = 20;
});

app.MapPost("/api/video/transcode", async (VideoRequest request) =>
{
    // CPU-intensive transcoding
    var result = await TranscodeVideo(request);
    return Results.Ok(result);
})
.RequireRateLimiting("cpu-intensive");

// Benefit:
// - Prevents CPU saturation
// - System remains responsive
// - Other endpoints still work

Queue Management

Queue Processing Order

// FIFO (First In First Out) - DEFAULT
options.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;

Timeline:
00:00 → Request A arrives → Active
00:01 → Request B arrives → Active
00:02 → Request C arrives → Active
00:03 → Request D arrives → Active
00:04 → Request E arrives → Active
00:05 → Request F arrives → Queued (position 1)
00:06 → Request G arrives → Queued (position 2)
00:07 → Request A completes → Request F starts (oldest in queue)
00:08 → Request B completes → Request G starts

// LIFO (Last In First Out)
options.QueueProcessingOrder = QueueProcessingOrder.NewestFirst;

00:07 → Request A completes → Request G starts (newest in queue)
00:08 → Request B completes → Request F starts
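The difference between the two orders can be observed directly against the underlying System.Threading.RateLimiting.ConcurrencyLimiter class. A small sketch (the demo class and its shape are invented for illustration):

```csharp
using System;
using System.Threading.RateLimiting;
using System.Threading.Tasks;

public static class QueueOrderDemo
{
    // Takes the single permit, queues two waiters, releases the permit,
    // and reports which waiter was promoted first.
    public static async Task<string> RunAsync(QueueProcessingOrder order)
    {
        var limiter = new ConcurrencyLimiter(new ConcurrencyLimiterOptions
        {
            PermitLimit = 1,
            QueueLimit = 2,
            QueueProcessingOrder = order
        });

        var leaseA = await limiter.AcquireAsync();   // takes the only permit
        var taskB = limiter.AcquireAsync().AsTask(); // queued first (oldest)
        var taskC = limiter.AcquireAsync().AsTask(); // queued second (newest)

        leaseA.Dispose();                            // frees the permit; one waiter is promoted
        var winner = await Task.WhenAny(taskB, taskC);
        return winner == taskB ? "B" : "C";
    }

    public static async Task Main()
    {
        Console.WriteLine(await RunAsync(QueueProcessingOrder.OldestFirst)); // FIFO promotes B
        Console.WriteLine(await RunAsync(QueueProcessingOrder.NewestFirst)); // LIFO promotes C
    }
}
```

NewestFirst favors fresh requests at the cost of possibly starving old ones, so OldestFirst is the safer default for user-facing endpoints.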

Queue Size Calculation

Configuration Decision:

Small Queue (queue_limit ≈ concurrency_limit):
- Use: Fast operations (< 1 second)
- Benefit: Quick rejection, fast feedback
- Example: opt.PermitLimit = 5, opt.QueueLimit = 5

Medium Queue (queue_limit = 2-5x concurrency_limit):
- Use: Medium operations (1-5 seconds)
- Benefit: Handle bursts, reasonable wait
- Example: opt.PermitLimit = 5, opt.QueueLimit = 15

Large Queue (queue_limit = 10x+ concurrency_limit):
- Use: Slow operations (> 5 seconds)
- Benefit: Accept all requests, long waits
- Example: opt.PermitLimit = 2, opt.QueueLimit = 50
- Warning: Can cause timeout issues!
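A quick way to sanity-check these choices: with roughly uniform operation durations, the last request in a full queue waits about ceil(queue_limit / permit_limit) "waves" of the average duration. A back-of-envelope helper (the class and method names are illustrative):

```csharp
using System;

public static class QueueSizing
{
    // Worst-case wait for the last request in a full queue, assuming
    // roughly uniform operation durations. A rough estimate only.
    public static TimeSpan WorstCaseWait(int permitLimit, int queueLimit, TimeSpan avgDuration)
    {
        int waves = (int)Math.Ceiling(queueLimit / (double)permitLimit);
        return avgDuration * waves;
    }
}

// Example: 2 permits, queue of 5, 30-second reports:
// ceil(5 / 2) = 3 waves → worst-case wait ≈ 90 seconds.
```

If the result exceeds your client timeout, shrink the queue so callers get a fast 429 instead of a hung request.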

Queue Timeout

// Problem: requests wait too long in the queue

// Mitigation: inspect rejection metadata and fail fast
builder.Services.AddRateLimiter(options =>
{
    options.AddConcurrencyLimiter("with-timeout", opt =>
    {
        opt.PermitLimit = 5;
        opt.QueueLimit = 10;
    });

    options.OnRejected = async (context, ct) =>
    {
        // Note: the built-in concurrency limiter does not supply RetryAfter
        // metadata (there is no time window), so this branch only fires for
        // limiters that provide it.
        if (context.Lease.TryGetMetadata(
            System.Threading.RateLimiting.MetadataName.RetryAfter,
            out var retryAfter))
        {
            var estimatedWait = retryAfter.TotalSeconds;
            if (estimatedWait > 30)
            {
                context.HttpContext.Response.StatusCode = 503; // Service Unavailable
                await context.HttpContext.Response.WriteAsJsonAsync(new
                {
                    error = "queue_timeout",
                    message = "Estimated wait time exceeds limit",
                    estimated_wait_seconds = estimatedWait
                }, ct);
                return;
            }
        }

        context.HttpContext.Response.StatusCode = 429;
        await context.HttpContext.Response.WriteAsJsonAsync(new
        {
            error = "queue_full",
            message = "Maximum concurrent requests exceeded"
        }, ct);
    };
});

Real-World Scenarios

Scenario 1: File Upload Service

// Problem: unlimited concurrent uploads = disk I/O bottleneck

options.AddConcurrencyLimiter("file-upload", opt =>
{
    opt.PermitLimit = 3;  // 3 concurrent uploads
    opt.QueueLimit = 10;  // Queue 10 more
});

app.MapPost("/api/files/upload", async (IFormFile file) =>
{
    // Upload to disk (I/O bound)
    // Note: sanitize file.FileName in production (path traversal risk)
    var path = Path.Combine("uploads", file.FileName);
    using var stream = File.Create(path);
    await file.CopyToAsync(stream);
    return Results.Ok(new { uploaded = file.FileName });
})
.RequireRateLimiting("file-upload")
.DisableAntiforgery();

// Benefit:
// - Prevents disk I/O saturation
// - Maintains upload performance
// - Predictable completion times

Scenario 2: Report Generation

// Problem: reports are slow and memory-intensive

options.AddConcurrencyLimiter("reports", opt =>
{
    opt.PermitLimit = 2; // Only 2 concurrent
    opt.QueueLimit = 5;  // Small queue
});

app.MapPost("/api/reports/generate", async (ReportRequest request) =>
{
    // Generate report (takes 30 seconds, uses 500 MB)
    await Task.Delay(30000); // Simulate work
    var report = await GenerateReport(request);
    return Results.Ok(report);
})
.RequireRateLimiting("reports");

// Behavior (30-second reports, 2 at a time):
// Requests 1-2: Start immediately
// Requests 3-7: Queue (the last waits ~90 seconds: 3 waves × 30 s)
// Request 8+:   Reject (queue full)

Scenario 3: External API Integration

// Problem: the third-party API has concurrency limits

options.AddConcurrencyLimiter("external-api", opt =>
{
    opt.PermitLimit = 10; // Their limit
    opt.QueueLimit = 50;  // Buffer for bursts
});

app.MapGet("/api/weather/{city}", async (string city, IHttpClientFactory factory) =>
{
    var client = factory.CreateClient("weather-api");
    var response = await client.GetAsync($"/api/current/{city}");
    var data = await response.Content.ReadFromJsonAsync<WeatherData>();
    return Results.Ok(data);
})
.RequireRateLimiting("external-api");

// Benefit:
// - Respects external API limits
// - Prevents 429s from the external service
// - Protects the circuit breaker

Scenario 4: WebSocket Connections

// Problem: limited WebSocket capacity

options.AddConcurrencyLimiter("websocket", opt =>
{
    opt.PermitLimit = 1000; // Max 1000 concurrent connections
    opt.QueueLimit = 100;   // Queue 100 during bursts
});

app.Map("/ws", async (HttpContext context) =>
{
    if (context.WebSockets.IsWebSocketRequest)
    {
        var webSocket = await context.WebSockets.AcceptWebSocketAsync();
        await HandleWebSocketAsync(webSocket);
    }
    else
    {
        context.Response.StatusCode = 400;
    }
})
.RequireRateLimiting("websocket");

// Benefit:
// - Prevents connection exhaustion
// - Maintains service quality
// - Graceful rejection when full

When to Use

✅ Perfect For

1. Database-Heavy Operations

// Protect connection pool
app.MapGet("/api/reports/complex", ComplexQuery)
.RequireRateLimiting("Concurrency");

2. File I/O Operations

// Upload, download, processing
app.MapPost("/api/files/upload", UploadFile)
.RequireRateLimiting("Concurrency");

3. Long-Running Tasks

// Report generation, data export
app.MapPost("/api/export", ExportData)
.RequireRateLimiting("Concurrency");

4. External API Calls

// Respect third-party limits
app.MapGet("/api/external/data", CallExternalApi)
.RequireRateLimiting("Concurrency");

5. Resource-Intensive Operations

// CPU, memory, network intensive
app.MapPost("/api/process", ProcessData)
.RequireRateLimiting("Concurrency");

❌ Avoid When

1. Fast Operations

// Simple CRUD (< 100ms)
app.MapGet("/api/users", GetUsers)
.RequireRateLimiting("FixedWindow"); // Better choice

// Why: Overhead not worth it for fast operations

2. Need Rate Limiting

// Prevent request floods
app.MapPost("/api/auth/login", Login)
.RequireRateLimiting("SlidingWindow"); // Better choice

// Why: Concurrency doesn't limit request rate

3. Stateless Operations

// Simple calculations, lookups
app.MapGet("/api/calculate", Calculate)
.RequireRateLimiting("TokenBucket"); // Better choice

// Why: No resource protection needed

Complete Example

Full Production Setup

// Program.cs
using System.Net.WebSockets;
using Microsoft.AspNetCore.RateLimiting;
using Microsoft.EntityFrameworkCore;

var builder = WebApplication.CreateBuilder(args);

// Add DbContext
builder.Services.AddDbContext<AppDbContext>(options =>
    options.UseSqlServer(builder.Configuration.GetConnectionString("Default")));

builder.Services.AddRateLimiter(options =>
{
    // Database operations: match the connection pool
    options.AddConcurrencyLimiter("database", opt =>
    {
        opt.PermitLimit = 50;  // DB pool size = 100, use 50
        opt.QueueLimit = 100;
        opt.QueueProcessingOrder = System.Threading.RateLimiting.QueueProcessingOrder.OldestFirst;
    });

    // File operations: protect disk I/O
    options.AddConcurrencyLimiter("files", opt =>
    {
        opt.PermitLimit = 5;
        opt.QueueLimit = 20;
    });

    // Reports: CPU + memory intensive
    options.AddConcurrencyLimiter("reports", opt =>
    {
        opt.PermitLimit = 2;
        opt.QueueLimit = 10;
    });

    // WebSocket: connection limit
    options.AddConcurrencyLimiter("websocket", opt =>
    {
        opt.PermitLimit = 1000;
        opt.QueueLimit = 100;
    });

    // Comprehensive rejection handler
    options.OnRejected = async (context, ct) =>
    {
        var logger = context.HttpContext.RequestServices
            .GetRequiredService<ILogger<Program>>();
        var endpoint = context.HttpContext.GetEndpoint()?.DisplayName ?? "Unknown";
        logger.LogWarning(
            "Concurrency limit exceeded: Endpoint={Endpoint}, Path={Path}",
            endpoint, context.HttpContext.Request.Path);

        context.HttpContext.Response.StatusCode = 429;
        await context.HttpContext.Response.WriteAsJsonAsync(new
        {
            error = "concurrency_limit_exceeded",
            message = "Maximum concurrent operations exceeded",
            details = new
            {
                endpoint = context.HttpContext.Request.Path.Value,
                strategy = "concurrency",
                hint = "This endpoint has limited concurrent capacity. Please retry."
            }
        }, ct);
    };
});

var app = builder.Build();
app.UseRateLimiter();
app.UseWebSockets(); // required for the /ws endpoint below

// Database-heavy endpoint
app.MapGet("/api/reports/analytics", async (AppDbContext db) =>
{
    await Task.Delay(2000); // Simulate a heavy query
    var data = await db.Reports
        .Include(r => r.Details)
        .ToListAsync();
    return Results.Ok(data);
})
.RequireRateLimiting("database");

// File upload endpoint
app.MapPost("/api/files/upload", async (IFormFile file) =>
{
    var path = Path.Combine("uploads", Guid.NewGuid() + Path.GetExtension(file.FileName));
    using var stream = File.Create(path);
    await file.CopyToAsync(stream);
    return Results.Ok(new { path });
})
.RequireRateLimiting("files")
.DisableAntiforgery();

// Report generation endpoint
app.MapPost("/api/reports/generate", async (ReportRequest request) =>
{
    // Simulate slow report generation
    await Task.Delay(30000);
    return Results.Ok(new { report = "generated" });
})
.RequireRateLimiting("reports");

// WebSocket endpoint
app.Map("/ws", async (HttpContext context) =>
{
    if (context.WebSockets.IsWebSocketRequest)
    {
        var ws = await context.WebSockets.AcceptWebSocketAsync();
        await HandleWebSocket(ws);
    }
})
.RequireRateLimiting("websocket");

app.Run();

static async Task HandleWebSocket(WebSocket ws)
{
    var buffer = new byte[1024];
    while (ws.State == WebSocketState.Open)
    {
        // Stop reading once the client sends a close frame
        var result = await ws.ReceiveAsync(buffer, CancellationToken.None);
        if (result.MessageType == WebSocketMessageType.Close)
            break;
    }
}

record ReportRequest(string Type);

Combining Strategies

// Combine Concurrency + Rate Limiting

// Step 1: Limit concurrent operations
app.MapGet("/api/reports", GenerateReport)
.RequireRateLimiting("concurrency");

// Step 2: Also limit rate per user
app.MapGet("/api/reports", GenerateReport)
.RequireRateLimiting("concurrency")
.RequireRateLimiting("per-user"); // Chained!

// Result: Both limits enforced
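If two limits genuinely must guard the same traffic, the runtime also offers PartitionedRateLimiter.CreateChained, which composes limiters so a request is admitted only when every limiter in the chain grants a permit. A sketch with the partition key simplified to a single global bucket (the variable names are illustrative):

```csharp
using System;
using System.Threading.RateLimiting;

// Chain a concurrency cap with a fixed-window rate cap: a request is
// admitted only if BOTH limiters grant a permit.
var chained = PartitionedRateLimiter.CreateChained(
    PartitionedRateLimiter.Create<string, string>(key =>
        RateLimitPartition.GetConcurrencyLimiter("global", _ =>
            new ConcurrencyLimiterOptions { PermitLimit = 5, QueueLimit = 10 })),
    PartitionedRateLimiter.Create<string, string>(key =>
        RateLimitPartition.GetFixedWindowLimiter("global", _ =>
            new FixedWindowRateLimiterOptions
            {
                PermitLimit = 100,
                Window = TimeSpan.FromMinutes(1)
            })));

using var lease = chained.AttemptAcquire("any-key");
Console.WriteLine(lease.IsAcquired);
```

In ASP.NET Core you would typically build the chained limiter over HttpContext instead of string and assign it to RateLimiterOptions.GlobalLimiter.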