
.NET 9 Performance Improvements Every Developer Should Know

January 17, 2026

.NET 9 delivers meaningful performance gains — teams have seen up to 22% reduction in P99 latency with zero code changes, just a target framework swap. Let's walk through the improvements that matter most.

Dynamic PGO: Enabled by Default

Dynamic Profile-Guided Optimization has been on by default since .NET 8, and .NET 9 refines it further. The JIT collects runtime profiling data and devirtualizes interface calls based on observed types, meaning your DI-heavy services get optimized automatically:

public class OrderProcessor
{
    private readonly IOrderValidator _validator;
    private readonly IInventoryService _inventory;

    public OrderProcessor(IOrderValidator validator, IInventoryService inventory)
    {
        _validator = validator;
        _inventory = inventory;
    }

    public async Task<OrderResult> ProcessAsync(Order order)
    {
        // PGO observes _validator is always ConcreteOrderValidator
        // at runtime and devirtualizes this call automatically
        var validation = await _validator.ValidateAsync(order);
        if (!validation.IsValid)
            return OrderResult.Failed(validation.Errors);

        await _inventory.ReserveAsync(order.Items);
        return OrderResult.Success(order.Id);
    }
}

In benchmarks, Dynamic PGO reduces method dispatch overhead by 15-30% on services with heavy interface usage. No code changes needed.
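If you want to quantify PGO's effect on your own service, you can disable it for a comparison run. A minimal sketch using the `TieredPGO` MSBuild property (the `DOTNET_TieredPGO` environment variable works too):

```xml
<!-- In the .csproj: opt out of Dynamic PGO for an A/B benchmark run -->
<PropertyGroup>
  <TieredPGO>false</TieredPGO>
</PropertyGroup>
```

Run your load test once with and once without this property, then compare P99 latencies; remember to remove it afterwards, since the default is the faster configuration.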

Native AOT: Production-Ready for Web APIs

.NET 9 closes the gap for Native AOT with improved System.Text.Json source generators, better trimming, and reduced binary sizes. Here are the startup benchmarks:

| Metric                | .NET 9 JIT  | .NET 9 AOT  | Improvement |
|-----------------------|-------------|-------------|-------------|
| Cold start            | 287ms       | 38ms        | 86% faster  |
| Memory (startup)      | 48 MB       | 14 MB       | 71% less    |
| Binary size           | N/A         | 18 MB       | -           |
| First request latency | 12ms        | 3ms         | 75% faster  |

For containerized microservices where cold start matters — think Kubernetes pod scaling — Native AOT is now a strong default choice.
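Trying Native AOT is mostly a one-property change plus a runtime-specific publish. A minimal sketch for a Linux container target (`InvariantGlobalization` is optional but commonly paired with AOT to shrink the binary further):

```xml
<PropertyGroup>
  <PublishAot>true</PublishAot>
  <InvariantGlobalization>true</InvariantGlobalization>
</PropertyGroup>
```

Then publish for a concrete runtime identifier, e.g. `dotnet publish -r linux-x64 -c Release`. Note that AOT requires trim- and AOT-compatible dependencies, so audit the publish warnings before shipping.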

Frozen Collections and New LINQ Methods

FrozenDictionary<TKey, TValue> received major optimizations in .NET 9. These immutable collections analyze your keys at construction time and select the optimal hashing strategy, outperforming Dictionary by 30-60% for read-heavy lookups:

public class FeatureFlagService
{
    private readonly FrozenDictionary<string, bool> _flags;

    public FeatureFlagService(IConfiguration config)
    {
        _flags = config.GetSection("FeatureFlags").GetChildren()
            .ToFrozenDictionary(
                x => x.Key,
                x => bool.Parse(x.Value ?? "false"),
                StringComparer.OrdinalIgnoreCase);
    }

    public bool IsEnabled(string featureName)
        => _flags.TryGetValue(featureName, out var enabled) && enabled;
}

New LINQ methods like CountBy and AggregateBy eliminate GroupBy allocations in hot paths, cutting memory allocations by up to 40% in data processing pipelines.
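The difference is easy to see in a small sketch (the order-status data here is illustrative): `GroupBy` materializes a grouping per key just to count it, while .NET 9's `CountBy` yields `KeyValuePair<TKey, int>` results directly.

```csharp
using System;
using System.Linq;

// Hypothetical data: order statuses flowing through a pipeline
string[] statuses = { "pending", "shipped", "pending", "delivered", "pending" };

// Before (.NET 8 and earlier): GroupBy allocates a group per key
// only to immediately count and discard it
var viaGroupBy = statuses
    .GroupBy(s => s)
    .ToDictionary(g => g.Key, g => g.Count());

// .NET 9: CountBy counts per key without materializing groupings
var viaCountBy = statuses
    .CountBy(s => s)
    .ToDictionary(kv => kv.Key, kv => kv.Value);

Console.WriteLine(viaCountBy["pending"]); // 3
```

`AggregateBy` follows the same pattern for arbitrary per-key accumulation, so most counting and summing `GroupBy` chains in hot paths can be migrated mechanically.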

Key Takeaways

  1. Upgrade and measure first. Dynamic PGO alone gives most applications a measurable boost with zero code changes.
  2. Adopt FrozenDictionary for read-heavy lookups populated at startup.
  3. Evaluate Native AOT for new containerized microservices — the startup and memory characteristics are compelling.
  4. Replace GroupBy with CountBy/AggregateBy in hot paths for significant allocation reduction.
  5. Use SearchValues for repeated string scanning — input validation, log parsing, and protocol handling.
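SearchValues is only name-checked above, so here is a minimal sketch of that last takeaway in practice (the header-validation scenario is illustrative): `SearchValues.Create` precomputes an optimized, often vectorized matcher for a fixed character set, which you then reuse across every scan.

```csharp
using System;
using System.Buffers;

// Precompute the matcher once; reuse it for every validation call
SearchValues<char> forbidden = SearchValues.Create("\r\n\0");

// Illustrative validator: reject CR, LF, and NUL in a header value
bool IsValidHeaderValue(ReadOnlySpan<char> value) => !value.ContainsAny(forbidden);

Console.WriteLine(IsValidHeaderValue("text/html"));      // True
Console.WriteLine(IsValidHeaderValue("evil\r\nInject")); // False
```

The key is creating the `SearchValues` instance once (typically in a static field) rather than per call; the construction cost is what buys the fast repeated scans.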


Ajit Gangurde

Software Engineer II at Microsoft | 15+ years in .NET & Azure