Azure Container Apps in 2025: A Comprehensive Guide to the Latest Features and Innovations
Azure Container Apps (ACA) has emerged as one of Microsoft’s most rapidly evolving cloud services, solidifying its position as the preferred platform for hosting AI workloads, intelligent agents, and modernized enterprise applications. Throughout 2025, Microsoft has delivered a remarkable series of enhancements at both Build and Ignite conferences, transforming ACA into a comprehensive serverless container platform that balances developer simplicity with enterprise-grade capabilities.
Serverless GPUs Now Generally Available in 11+ Regions
Azure Container Apps serverless GPU support has reached general availability and expanded to 11 additional regions, making GPU-powered AI workloads more accessible globally. Serverless GPUs offer capabilities such as automatic scaling with NVIDIA A100 or T4 GPUs, per-second billing, and strict data isolation within container boundaries. This feature allows development teams to focus more on innovation and less on infrastructure management.
The serverless GPU architecture provides several key benefits. It offers scale-to-zero capabilities where resources dynamically scale based on demand, reducing idle costs. Per-second billing ensures you only pay for compute time actually used, while built-in data governance keeps your data within container boundaries. You also get flexible compute options between NVIDIA A100 for compute-intensive ML scenarios and T4 for real-time inferencing.
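As a rough sketch of how this looks in practice, the commands below add a consumption-based GPU workload profile to an existing environment and deploy a scale-to-zero inference app onto it. All resource names are placeholders, and the profile type string is an assumption; run az containerapp env workload-profile list-supported --location <region> to see the values actually available in your region.

```shell
# Add a serverless GPU workload profile to an existing environment.
# The profile type name below is illustrative; verify it for your region.
az containerapp env workload-profile add \
  --name my-environment \
  --resource-group my-rg \
  --workload-profile-name gpu-t4 \
  --workload-profile-type Consumption-GPU-NC8as-T4

# Deploy an inference container onto that profile.
# With --min-replicas 0 the app scales to zero and you pay nothing while idle.
az containerapp create \
  --name my-inference-app \
  --resource-group my-rg \
  --environment my-environment \
  --image myregistry.azurecr.io/my-model:latest \
  --workload-profile-name gpu-t4 \
  --min-replicas 0 --max-replicas 3
```

The same pattern applies to A100 profiles for compute-intensive scenarios; only the workload profile type changes.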
NVIDIA NIM Integration
ACA now provides seamless integration with NVIDIA Inference Microservices (NIMs), which are optimized, containerized AI inference microservices that simplify and accelerate AI application development. These models are pre-packaged, scalable, and performance-tuned for direct deployment as secure endpoints. When combined with serverless GPUs, you can run NIMs efficiently without managing underlying infrastructure.
Dedicated GPUs Generally Available
Dedicated GPUs have reached general availability at Build 2025, simplifying AI application development and deployment by reducing management overhead. This offering includes built-in support for key components like the latest CUDA driver, turnkey networking, and security features, allowing developers to focus exclusively on their AI application code.
Serverless GPU in Dynamic Sessions (Early Access)
A groundbreaking feature now in early access enables running untrusted AI-generated code at scale within compute sandboxes protected by Hyper-V isolation. This feature supports a GPU-powered Python code interpreter specifically designed for AI workloads.
Azure AI Foundry Models Integration
Azure Container Apps now provides an integration with Foundry Models, allowing you to deploy ready-to-use AI models directly during container app creation. This integration supports serverless APIs with pay-as-you-go billing, managed compute with pay-per-GPU pricing, and flexible deployment options for Foundry models.
Docker Compose for Agents (Public Preview)
One of the most exciting announcements at Ignite 2025 is Docker Compose for Agents support, now in public preview. This feature makes it remarkably easy for developers to define agentic applications in a stack-agnostic manner with MCP and custom model support.
To deploy using Docker Compose for agents, you can use the az containerapp compose create command, which translates agent-focused compose elements into appropriate Azure Container Apps resources.
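A minimal sketch of the workflow, assuming a hypothetical agent image and model name: the compose file uses the Compose specification's models element, and az containerapp compose create translates it into Container Apps resources.

```shell
# compose.yaml (sketch): one agent service bound to one model.
# The image, model identifier, and all names are illustrative placeholders.
cat > compose.yaml <<'EOF'
services:
  agent:
    image: myregistry.azurecr.io/my-agent:latest
    models:
      - chat-model
models:
  chat-model:
    model: ai/phi4
EOF

# Translate the compose file into Azure Container Apps resources.
az containerapp compose create \
  --resource-group my-rg \
  --environment my-environment \
  --compose-file-path compose.yaml
```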
Dynamic Sessions with Shell Environment and MCP Support (Public Preview)
ACA now supports dynamic shell sessions with integrated MCP server capabilities. These shell sessions are platform-managed built-in containers designed to execute common shell commands within an isolated, sandboxed environment.
Rule-Based Routing Now Generally Available
Rule-based routing has reached general availability at Ignite 2025. This feature allows you to direct incoming HTTP traffic to different apps within your Container Apps environment based on the requested hostname or path.
With rule-based routing, you can create a fully qualified domain name (FQDN) on your container apps environment and route requests to different container apps depending on the path of each request. The feature supports custom domains alongside path-based routing and eliminates the need for a separate reverse proxy like NGINX. This capability simplifies architectures for microservice applications, A/B testing, and blue-green deployments without requiring additional infrastructure.
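The shape of a route configuration looks roughly like the following; the YAML schema is abbreviated here and the exact CLI command names may differ, so treat this as a sketch and consult the Container Apps routing documentation for the authoritative schema.

```shell
# routes.yaml (sketch): send /api traffic to one app and everything
# else to a frontend app, rewriting the matched prefix away.
cat > routes.yaml <<'EOF'
rules:
  - description: API traffic
    routes:
      - match:
          prefix: /api
        action:
          prefixRewrite: /
    targets:
      - containerApp: my-api
  - description: Default traffic
    routes:
      - match:
          prefix: /
    targets:
      - containerApp: my-frontend
EOF

# Apply the route configuration to the environment (command shape assumed;
# see `az containerapp env http-route-config --help` in the CLI extension).
az containerapp env http-route-config create \
  --resource-group my-rg \
  --name my-environment \
  --http-route-config-name my-routes \
  --yaml routes.yaml
```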
Premium Ingress Now Generally Available
Premium Ingress support is now generally available, introducing environment-level ingress configuration options. The primary highlight is customizable ingress scaling, which supports scaling of the ingress proxy to better handle higher demand workloads like large performance tests.
Private Endpoints Generally Available
Private Endpoints for Azure Container Apps, now generally available, allow customers to connect to their Container Apps environment using a private IP address in their Azure Virtual Network. This eliminates exposure to the public internet and secures access to applications. Additionally, customers can connect directly from Azure Front Door to their workload profile environments over a private link instead of the public internet.
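Setting this up follows the standard Azure private endpoint pattern, sketched below with placeholder resource names; the managedEnvironments group ID targets the environment as the private link sub-resource.

```shell
# Look up the resource ID of the Container Apps environment.
ENV_ID=$(az containerapp env show \
  --name my-environment --resource-group my-rg \
  --query id --output tsv)

# Create a private endpoint for the environment in an existing VNet subnet,
# so the environment is reachable only via a private IP.
az network private-endpoint create \
  --name my-environment-pe \
  --resource-group my-rg \
  --vnet-name my-vnet \
  --subnet my-pe-subnet \
  --private-connection-resource-id "$ENV_ID" \
  --group-id managedEnvironments \
  --connection-name my-environment-pe-conn
```

You would typically pair this with a private DNS zone so the environment's FQDN resolves to the private IP inside the VNet.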
Aspire Dashboard Now Generally Available
The .NET Aspire Dashboard in Azure Container Apps is now generally available. This dashboard provides live data about your projects and containers in the cloud to evaluate performance and debug errors with comprehensive logs, metrics, and traces.
The latest update supports Aspire version 13, which includes new visualization features for app resources, the ability to pause and resume telemetry, and enhanced performance monitoring capabilities. The Aspire Dashboard displays data from two sources: OpenTelemetry (traces, metrics, logs) and the Kubernetes API for pod information.
OpenTelemetry Collector Generally Available
The OpenTelemetry agent in Azure Container Apps is now generally available. This managed agent allows developers to use open-source standards to send app data without setting up the collector themselves.
The managed agent collects and exports telemetry data to Azure Monitor Application Insights, Datadog, and any generic OTLP-configured endpoint.
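For example, pointing the managed agent at Application Insights can be done at the environment level, roughly as follows; the telemetry subcommands live in the containerapp CLI extension, and the connection string variable here is a placeholder.

```shell
# Configure the managed OpenTelemetry agent to export traces and logs
# to Application Insights (sketch; requires the containerapp CLI extension).
az containerapp env telemetry app-insights set \
  --name my-environment \
  --resource-group my-rg \
  --connection-string "$APP_INSIGHTS_CONNECTION_STRING" \
  --enable-open-telemetry-traces true \
  --enable-open-telemetry-logs true
```

Analogous subcommands exist for Datadog and generic OTLP endpoints, so apps only need to emit standard OpenTelemetry signals.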
New Diagnose and Solve Dashboard
The new Diagnose and Solve dashboard provides a comprehensive overview of app health, performance, and resource utilization. It offers insights into apps, jobs, replicas, node count, and CPU usage over time, along with new detectors to diagnose and resolve issues such as container create failures, health probe failures, and image pull failures.
Azure SRE Agent Integration
Azure Container Apps integrates seamlessly with the Azure SRE agent to enhance operational efficiency and application uptime. The SRE agent continuously monitors application health and performance, providing valuable insights and autonomously responding to production alerts. This integration enables monitoring of Container Apps resources from environments to apps to revisions to replicas.
Azure Container Apps on Arc-Enabled Kubernetes (GA)
The ability to run Azure Container Apps on your own Azure Arc-enabled Kubernetes clusters (AKS and AKS-HCI) is now generally available. This allows developers to leverage Container Apps features while IT administrators maintain corporate compliance by hosting applications in hybrid environments.
Supported scenarios include running ACA on on-premises or cloud Kubernetes clusters, maintaining corporate compliance through hybrid hosting, and using the same developer experience across environments.
Planned Maintenance Now Generally Available
Planned Maintenance for Azure Container Apps is now generally available. This feature allows you to control when non-critical updates are applied to your environment, helping minimize downtime and impact on applications. Critical updates are still applied as needed to ensure security and reliability compliance.
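Configuring the window is a one-line environment setting, sketched here with placeholder names: a weekly eight-hour window starting at 01:00 UTC on Sundays.

```shell
# Define the weekly maintenance window for an environment.
# Non-critical platform updates are deferred to this window.
az containerapp env maintenance-config add \
  --environment my-environment \
  --resource-group my-rg \
  --weekday Sunday \
  --start-hour-utc 1 \
  --duration 8
```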
Durable Task Scheduler (GA)
Durable Task Scheduler support is now generally available on Azure Container Apps. This provides a robust pro-code workflow solution where you can define reliable, containerized workflows as code with built-in state persistence and fault-tolerant execution.
Key benefits include Azure-managed dedicated resources with orchestration and entity state management built in, along with high performance for orchestration and task scheduling. The scheduler also auto-scales container app replicas using a built-in scaler and simplifies monitoring with a built-in dashboard for tracking workflow progress and activity durations.
Flexible Workload Profile (Public Preview)
Azure Container Apps introduced a Flexible workload profile in public preview. This new option combines the simplicity of serverless Consumption with the performance and control found in Dedicated profiles.
The Flexible workload profile offers a familiar pay-per-use model like Consumption profiles, alongside enhanced features including scheduled maintenance and dedicated networking. It supports larger replicas for demanding applications and provides a dedicated compute pool for better predictability and isolation, all without adding operational complexity.
Confidential Computing (Public Preview)
Confidential computing support is now available in public preview. This feature offers hardware-based Trusted Execution Environments (TEEs) to secure data in use by encrypting memory and verifying the cloud environment before processing.
Confidential computing helps protect sensitive data from unauthorized access, including access from cloud operators, making it particularly useful for organizations with high security and compliance requirements.
Deployment Labels (Public Preview)
Deployment labels are now available in public preview. This feature allows you to assign meaningful, stable names like dev, staging, or prod to container revisions, either manually or automatically as new revisions are created.
With deployment labels, you can simplify environment management, support advanced deployment strategies such as A/B testing and blue-green deployments, route traffic based on labels instead of changing revision names, promote or demote revisions by reassigning labels, and roll back to previous revisions using label history.
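A blue-green cutover with labels might look like the sketch below; the app and revision names are placeholders.

```shell
# Attach a label to a specific revision.
az containerapp revision label add \
  --name my-app --resource-group my-rg \
  --label prod --revision my-app--v2

# Route all traffic to whichever revision currently holds the "prod" label,
# so future promotions never need to reference revision names.
az containerapp ingress traffic set \
  --name my-app --resource-group my-rg \
  --label-weight prod=100

# Promote or roll back by swapping labels between two revisions.
az containerapp revision label swap \
  --name my-app --resource-group my-rg \
  --source prod --target staging
```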
Native Azure Functions Support
A new, streamlined method for running Azure Functions natively in Azure Container Apps allows customers to leverage the full features and capabilities of Container Apps while benefiting from auto-scaling provided by Azure Functions.
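In CLI terms, this amounts to creating a container app from a Functions container image and marking it as a function app, roughly as sketched below with placeholder names; the --kind flag is what opts the app into the Functions-style programming model and scaling.

```shell
# Deploy a Functions container image as a native Container Apps resource.
# The image and all resource names are illustrative placeholders.
az containerapp create \
  --name my-functions-app \
  --resource-group my-rg \
  --environment my-environment \
  --image myregistry.azurecr.io/my-functions:latest \
  --kind functionapp \
  --ingress external --target-port 80
```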
The Road Ahead
Azure Container Apps continues to redefine how developers build and deploy intelligent agents and cloud-native applications. As Microsoft stated at Ignite 2025, agents deployed to Azure Container Apps with Microsoft Agent Framework and OpenTelemetry can be plugged directly into Microsoft Foundry, providing teams a single pane of glass for their agents in Azure.
The platform’s commitment to serverless scale, GPU-on-demand, and enterprise-grade isolation positions it as the ideal foundation for hosting AI agents securely and cost-effectively. With ongoing enhancements to developer tools, networking capabilities, and AI integrations, Azure Container Apps remains at the forefront of the containerized application landscape.
For developers and architects looking to modernize workloads or build new cloud-native applications, Azure Container Apps offers a comprehensive, fully-managed platform that removes infrastructure complexity while providing the flexibility and power needed for modern AI-driven applications. Whether you’re migrating legacy Java and .NET applications, building multi-agent AI systems, or deploying GPU-accelerated workloads, ACA provides the tools and capabilities to succeed in 2025 and beyond.
Last modified on 2025-11-26