A quiet revolution is reshaping business operations—the cloud-native journey. This transformative shift, powered by the rise of artificial intelligence (AI) and containerised applications, is redefining infrastructure operations and helping businesses to innovate in new ways.
Gartner predicts that by 2025, more than 95% of new digital workloads will be deployed on cloud-native platforms. This underlines the urgency for organisations to embrace cloud-native strategies.
The 2025 Enterprise Cloud Index (ECI) report by Nutanix echoes this, revealing that 94% of global respondents agree that their organisation benefits from adopting cloud-native applications or containers. In the Asia-Pacific and Japan region, 80% of organisations have already containerised some of their applications, while a further 16% are in the process of doing so, a clear sign of momentum toward modern infrastructure adoption.
However, as organisations race to adopt cloud-native strategies, they encounter a host of challenges that require careful navigation. AI workloads demand robust infrastructure uptime, and businesses must overcome the difficulties of managing containerised applications at scale and in real time.
How can businesses successfully navigate the dual demands of cloud-native adoption and AI integration? Unlocking the potential of cloud-native solutions and AI capabilities seamlessly can seem unrealistic, even idealistic.
The twin imperatives: Mastering cloud-native and AI complexity
Infrastructure modernisation projects struggle to support the scale of modern workloads, creating bottlenecks. The broader industry shift to integrate AI adds further demands such as high-performance computing, real-time inference, and large-scale data processing. Legacy infrastructure cannot meet these emerging needs while also satisfying critical security and compliance requirements.
These challenges are interdependent. AI workloads amplify the need for secure and scalable Kubernetes environments, while hybrid deployments demand infrastructure flexibility. Gartner estimates that 90% of organisations will run containerised applications in production by 2027. Businesses must act quickly to modernise, yet complexity saps the incentive and impetus to do so. Addressing these dual needs requires platforms designed to simplify operations, enhance security, and support seamless hybrid multi-cloud integration.
Unifying modern and legacy infrastructure
The platform strategy of the future must ensure application and data portability across computing environments, giving organisations the flexibility to run workloads wherever they fit best. This is particularly relevant in the Asia Pacific region, where nearly 90% of enterprises run workloads across multiple public cloud providers.
Another significant challenge organisations face is maintaining virtual machine workloads, especially for legacy applications, while simultaneously building modern application infrastructure on Kubernetes and cloud-native foundations. As such, simplifying operations is non-negotiable. Kubernetes itself is a massive undertaking: managing it and containerised applications at scale requires robust automation and lifecycle management tools that reduce operational overhead without stretching teams thin.
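The kind of automation such tools rely on can be illustrated with a toy reconciliation loop, the core control pattern behind Kubernetes controllers: compare a desired state with an observed state and compute the actions needed to converge. This is a minimal sketch in Python; the application names and state representation are invented for illustration, and a real controller would read state from the cluster API rather than plain dictionaries.

```python
# Toy reconciliation loop: compares desired vs observed replica counts
# and emits the scaling actions needed to converge. All names are
# hypothetical; real Kubernetes controllers follow the same pattern
# against the cluster API.

def reconcile(desired: dict, observed: dict) -> list:
    """Return the actions that move `observed` toward `desired`."""
    actions = []
    for app, want in desired.items():
        have = observed.get(app, 0)
        if have < want:
            actions.append(f"scale-up {app}: {have} -> {want}")
        elif have > want:
            actions.append(f"scale-down {app}: {have} -> {want}")
    # Anything observed but not desired should be removed.
    for app in observed:
        if app not in desired:
            actions.append(f"delete {app}")
    return actions

desired = {"inference-api": 3, "etl-worker": 2}
observed = {"inference-api": 1, "legacy-batch": 1}

for action in reconcile(desired, observed):
    print(action)
```

Running the loop repeatedly, rather than scripting one-off changes, is what lets platforms absorb drift and failures automatically instead of stretching operations teams.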
By 2027, two-thirds of cloud applications are expected to use AI, and up to 80% of organisations in Asia are anticipated to struggle to find skilled professionals to manage and develop them. This forecast labour crunch and skills gap underscores the importance of automation and intuitive platforms that simplify AI and cloud-native computing, helping businesses scale without relying solely on scarce expertise.
To successfully navigate this transition, three dissimilar infrastructures need to come together: virtual machines, Kubernetes, and cloud-native services. Value is created when all three are delivered from one integrated platform, simplifying the cloud-native journey. By enhancing the developer experience through self-service capabilities, platform engineers can ensure efficient operations even in complex cloud-native environments.
Architecting your transformation journey
A clear roadmap that aligns technology initiatives with strategic business goals is key to successfully navigating the cloud-native journey. Organisations should start by focusing on high-impact workloads to achieve quick wins and build stakeholder confidence. Containerising specific AI applications, for example, can shorten development cycles and improve scalability. An iterative approach allows businesses to refine strategies based on near-term outcomes, maintaining momentum for smoother transitions.
Stable scalability is critical. Platforms that support seamless scaling options for both cloud-native and AI workloads give the flexibility needed to adapt to changing market demands. Investing in scalable infrastructure helps businesses stay competitive in their markets.
Relatedly, adopting hybrid cloud strategies creates new options, enabling organisations to deploy each workload in its optimal environment. Data-intensive AI tasks may benefit from on-premises resources, while less critical applications can take advantage of public cloud services for cost efficiency. The Asia-Pacific and Japan region's leadership in cloud-native services underscores the value of hybrid platform strategies.
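The placement logic described above can be sketched as a simple heuristic: workloads with heavy data gravity or tight latency needs stay on-premises, and the rest go to public cloud. The thresholds and workload attributes below are invented for illustration, not drawn from any real platform's policy engine.

```python
# Minimal hybrid-cloud placement heuristic. Data-intensive or
# latency-sensitive workloads are kept on-premises; others go to
# public cloud for cost efficiency. Thresholds are illustrative
# assumptions, not real policy values.

from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    data_gb: int       # volume of data the workload must sit close to
    latency_ms: int    # tolerable response latency

def place(w: Workload) -> str:
    """Pick an environment for a workload based on data gravity and latency."""
    if w.data_gb > 500 or w.latency_ms < 10:
        return "on-prem"       # keep heavy data and tight latency local
    return "public-cloud"      # cost-efficient default for tolerant workloads

jobs = [
    Workload("ai-training", data_gb=2000, latency_ms=100),
    Workload("web-frontend", data_gb=5, latency_ms=50),
]
for j in jobs:
    print(j.name, "->", place(j))   # ai-training -> on-prem, web-frontend -> public-cloud
```

A real policy would also weigh cost, compliance, and data-residency rules, but the decision structure is the same: score each workload against the strengths of each environment rather than defaulting everything to one cloud.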
Modern platforms with integrated security, automation, and data services streamline the transition to cloud-native architectures, optimising operations and enabling sustained innovation. By using tested frameworks and vendor-verified solutions, businesses can minimise risks and cut down deployment timelines.
Seizing the cloud-native advantage
The cloud-native journey, accelerated by the rise of AI in everyday computing, offers a pivotal opportunity for businesses to innovate and scale. However, success requires more than technology adoption alone: the advancements must align with the broader strategic goals of the business and address existing infrastructure limitations, operational complexity, and the unique demands of AI workloads.
By embracing platforms that simplify operations, enhance security, and support hybrid multi-cloud environments, organisations can unlock the full potential of cloud-native architectures. Those who act now will not only transform their workloads but also future-proof their operations, gaining a competitive advantage in today's dynamic digital economy.
Daryush Ashjari is the chief technology officer and vice president - Solution Engineering, APJ at Nutanix