It was the most-watched earnings announcement of the year. In Manhattan, retail investors gathered in bars to cheer the performance of one of the world's largest companies.
On Feb 26, chip giant Nvidia Corp unveiled its earnings for the quarter and the fiscal year ended January. The artificial intelligence (AI) chip pioneer handily beat expectations and raised earnings guidance for its current fiscal year. The chip firm beat its quarterly revenue estimates by over US$1.2 billion ($1.6 billion), with revenue up 12% over the previous quarter, or 78% year on year, to US$39.3 billion, fuelled by insatiable AI demand. Yet, Nvidia has been growing at such a dizzying pace — sales have grown fivefold over the past two years to US$130 billion — that investors fretted that the huge beat was its smallest since early 2023. Nvidia's shares fell nearly 16% over the next three trading days before recovering slightly. They are still down 22% from their early January peak.
Revenues from data centres, which now account for 90% of its revenues, grew 16% over the past quarter to US$35.6 billion, or a US$140 billion annual run rate. Automotive sector sales — mostly to makers of semi-autonomous or driverless vehicles — grew 27% over the previous quarter to US$600 million. Revenues from chips used in professional visualisation, including augmented and virtual reality applications, grew 5% over the quarter to over US$500 million, while sales to other sectors grew 30% in the quarter to US$100 million. The only laggard was its legacy video-gaming chips business, which saw sales decline by 22% over the quarter to US$2.5 billion. Nvidia's business — mainly chips, embedded CUDA software and services — had gross margins of 73% in the last quarter, down from 75% in the previous quarter.
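As a rough sanity check, the segment figures quoted above can be tallied in a few lines of Python. The numbers are the rounded figures from the text, so the totals are approximate:

```python
# Back-of-envelope check of the quarterly figures quoted above
# (revenue in US$ billions, rounded as reported in the text).
total = 39.3
segments = {
    "data centre": 35.6,
    "gaming": 2.5,
    "automotive": 0.6,
    "professional visualisation": 0.5,
    "other": 0.1,
}

data_centre_share = segments["data centre"] / total
annual_run_rate = segments["data centre"] * 4  # simple quarterly x 4 run rate

print(f"Data centre share of revenue: {data_centre_share:.0%}")   # ~91%
print(f"Data centre annual run rate: US${annual_run_rate:.0f}b")  # ~US$142b
```

The rounded inputs land close to the round numbers in the text: a roughly 90% data-centre revenue share and an annual run rate in the region of US$140 billion.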
Main growth drivers
The main driver of Nvidia's profit machine until the middle of last year was its Hopper graphics processing unit, or GPU, used by hyperscalers like software supremo Microsoft, search giant Google's parent Alphabet, e-commerce pioneer and cloud services behemoth Amazon.com, and social media leader Meta Platforms to train large language models or LLMs. Over the past eight months or so, AI has been moving from merely training LLMs to inference and, more recently, reasoning. In January, Chinese AI start-up DeepSeek unveiled an AI reasoning model ostensibly using a fraction of the AI chips that large US hyperscalers needed to do the task. Detractors say DeepSeek used far more chips than it has been willing to admit, many of them Nvidia's top-of-the-line semiconductors, whose export to China is restricted. Since then, it has been revealed that many Chinese companies, including DeepSeek, may have gained access to Nvidia chips that were exported to Singapore, Malaysia and other Southeast Asian countries.
On Feb 27, Singapore charged three men with fraud — two Singaporeans, Alan Wei Zhaolun, 48, and Aaron Woon Guo Jie, 40, both top executives of Aperia, and Li Miang, 53, a Chinese national. The men are suspected of exporting Nvidia chips installed in Dell Technologies Inc and Super Micro Computer Inc servers to Malaysia, potentially breaching US export controls. Although more than 20% of Nvidia's total AI chip sales are invoiced through its Singapore office, less than 1% of the chips are sold to Singapore firms.
DeepSeek's revelations came just as Nvidia was ramping up production of its new Blackwell chips. Nvidia sold US$11 billion of Blackwell chips in the last quarter. Investors have also been worried that AI development might be moving away from training and inferencing on Nvidia's Hopper and Blackwell chips to other cheaper AI models. Whether DeepSeek used Nvidia's state-of-the-art chips or relied on the firm's lower-end chips, including some of its gaming chips, for its reasoning model is moot. What is clear, however, is that Chinese start-ups and other AI firms have been able to find cheaper workarounds.
On March 7, two Chinese firms unveiled new AI breakthroughs. Alibaba Group Holding released a new open-source AI reasoning model, QwQ-32B, that rivals DeepSeek's R1 model released in January. Another Chinese firm, an AI start-up called Manus, released a general AI agent that it claimed was capable of outperforming similar offerings from ChatGPT creator OpenAI. Unlike conventional AI models that focus on chat-based workflows, Manus operates autonomously, executing tasks across various domains, its co-founder Ji Yichao claimed in a YouTube video posted by his firm.
At last week's earnings call, Nvidia's CEO Jensen Huang argued that DeepSeek's approach will only increase long-term demand for Nvidia chips. As AI development moves from training to inference to reasoning, its use will only spread, unleashing new demand for AI chips. "No technology has ever had the opportunity to address a larger part of the world's GDP than AI," Huang said last week. "No software tool ever has." DeepSeek's R1, he said, has "ignited global enthusiasm" and will push reasoning AI into even more compute-intensive applications.
Huang believes AI will herald the biggest economic shift in history. Unlike software revolutions of the last 20 years, which digitised existing workflows, the chip giant's founder notes that AI creates entirely new industries while reshaping every sector, from manufacturing to healthcare. "We've really only tapped consumer AI and search and some amount of consumer generative AI, advertising, recommenders, kind of the early days of software," the Nvidia CEO said during the earnings call. "Future reasoning models can consume much more compute."
As he sees it, new demand for AI chips will outgrow hyperscaler demand as AI transforms both digital and physical industries. The Nvidia CEO laid out a three-layer AI transformation that has been unfolding across industries. First, there is agentic AI or enterprise AI, including Microsoft AI copilots and automation tools that software giants like Salesforce Inc use to enhance employee productivity in finance, healthcare, automotive and other industrial sectors.
Think of agentic AI as a proactive, AI-powered agent that can act autonomously, making decisions and taking actions to achieve a goal without explicit instructions. It represents a new generation of increasingly powerful foundation models that act as operating systems for autonomous, action-taking digital agents capable of enhanced reasoning and decision-making.
Second, there is physical AI, or artificial intelligence for machines, which will interact with the physical environment and enable the next generation of automation: AI-powered training systems for physical objects, from Amazon's robotic warehouses to driverless cars or robotaxis like Google's Waymo, already on the roads in several US cities, and Tesla's Cybercab, which will begin commercial operations later this year. Nvidia's AI chip sales to the automotive sector are expected to grow to over US$5 billion this year. Lastly, there is robotic AI: AI-powered systems in the real world that interact with and navigate physical environments, such as self-driving cars, humanoid robots, and commercial and industrial robots. The next five years are likely to be breakthrough years for robotics, thanks to AI.
Shortages and tariffs
For now, investors are focusing on the rising risks in the AI ecosystem. While Nvidia continues to ride growing AI demand, the cyclical semiconductor sector has traditionally swung from feast to famine. Chip shortages have historically led to overbuilding of capacity and oversupply. More than two years after OpenAI first unveiled its ChatGPT chatbot, there is still an acute shortage of AI chips, but the current tight market could eventually lead to a glut. Nvidia's biggest customers — Amazon, Microsoft, Google and Meta — are building their own customised ASICs (application-specific integrated circuits), or AI chips. Broadcom and Marvell Technology are helping hyperscalers build accelerator chips designed for a specific use — AI. While hyperscalers will still rely on Nvidia for most of their AI chip demand, a shift to internal solutions could impact long-term supply-demand dynamics and depress Nvidia's margins. There are also geopolitical issues. Due to export restrictions, Nvidia's China sales have been cut by 50%. President Donald Trump's tariffs on semiconductors produced by Taiwan Semiconductor Manufacturing Co, or TSMC, are also likely to disrupt supply chains and squeeze margins.
As the White House moved to unveil tariffs on its three largest trading partners — Canada, Mexico and China — and the focus turned to tariffs on semiconductors as well as tighter export controls on AI chips, Nvidia's co-founder Huang laid out a new framework for AI's growing computing needs that highlights three key scaling laws. At the forefront is pre-training scaling, where AI models grow smarter by consuming more data. Multimodal learning, integrating sensory inputs like speech, vision and touch as well as reasoning-based data, is now enhancing this phase. There is also post-training scaling, where AI refines itself using reinforcement learning from both human and AI feedback. Increasingly, post-training requires even more computation than pre-training, as AI models generate vast amounts of synthetic training data. Finally, there is inference and reasoning scaling, where AI performs "long thinking" through techniques like chain-of-thought reasoning and search. Test-time computing, or inference, already demands 100 times more computing than early LLMs — and could eventually require millions of times more.
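Why "long thinking" multiplies compute can be illustrated with a toy back-of-envelope sketch. Everything in it is an illustrative assumption rather than an Nvidia figure: the common rule of thumb of roughly 2 × parameters FLOPs per generated token for transformer inference, a hypothetical 70-billion-parameter model, and the token counts.

```python
# Toy illustration of test-time ("long thinking") compute scaling.
# Assumptions, not Nvidia figures: ~2 x parameters FLOPs per generated
# token, a hypothetical 70B-parameter model, illustrative token counts.
PARAMS = 70e9                 # hypothetical model size
FLOPS_PER_TOKEN = 2 * PARAMS  # rough transformer inference cost per token

def inference_flops(tokens: int, chains: int = 1) -> float:
    """Estimate FLOPs to generate `tokens` across `chains` sampled answers."""
    return FLOPS_PER_TOKEN * tokens * chains

short_answer = inference_flops(100)                # direct one-shot reply
long_thinking = inference_flops(10_000, chains=8)  # chain-of-thought + search

print(f"~{long_thinking / short_answer:.0f}x more compute for reasoning")
# prints ~800x more compute for reasoning
```

Even with these modest assumptions, a reasoning-style answer costs hundreds of times more compute than a one-shot reply, which is the dynamic behind Huang's claim that reasoning could eventually demand orders of magnitude more.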
Nvidia has positioned Blackwell as the first GPU designed for this new AI paradigm. Blackwell's architecture is built to handle pre-training, post-training and inference in a unified, flexible data centre environment. AI isn't just getting bigger — it is also thinking deeper. Each phase of AI development demands exponentially more computing, reinforcing why Nvidia hardware would still be in demand even as Chinese start-ups release new AI models and competition from Broadcom and Marvell, which are building customised AI chips for the hyperscalers, continues to grow.
Even as AI enters a new phase of growth, Nvidia's shares remain under pressure. While Blackwell's production ramp and AI's shift to reasoning models are strong tailwinds, the chip giant faces margin pressure, competition and geopolitical risks. Tariff uncertainties are likely to impact future sales, with potential new US trade restrictions on AI chip exports. While the previous administration had curbed AI chip sales to China, Trump wants further tariffs on foreign-made semiconductors. Since Nvidia gets all its chips made in Taiwan, its supply chain could be severely impacted. Moreover, billionaire Elon Musk's Department of Government Efficiency, or DOGE, cuts, as well as the deportation of illegal immigrants, are likely to hurt consumption and growth, which might push some of its customers to delay chip purchases.
Still, consensus expectations for Nvidia's net profits in the current year have been revised up to US$133 billion, from the US$74.3 billion the company made last year. The AI chip behemoth currently has net cash and cash equivalents of US$43.2 billion. Analysts expect its total net cash to grow to US$104.6 billion by the end of the current fiscal year and to a whopping US$221 billion by the end of next year. That's three times the net cash that tech giants Apple and Alphabet currently have on their respective balance sheets.
At just under 26 times the current fiscal year's earnings, Nvidia's stock is now actually cheaper than giant US retailers like Walmart, which trades at 36.3 times 2025 earnings, or Costco, whose shares trade at 52 times this year's earnings, according to Koyfin data. Indeed, Nvidia is cheaper than five of the other six Magnificent Seven tech giants: Apple trades at 31.2 times forward earnings, Microsoft at 29 times this year's earnings, Amazon at 33 times, Meta Platforms at 26.1 times and Tesla at 99 times. Only Google's owner, Alphabet, whose stock trades at 22 times this year's earnings, is cheaper. Tariffs, DOGE cuts and slightly slower economic growth are unlikely to derail AI's march in the near future.
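For what it's worth, the multiples quoted above can be lined up programmatically. The figures are the ones cited from Koyfin, with Nvidia pegged at an assumed 25.9 times to stand in for "just under 26":

```python
# Forward P/E multiples as quoted in the text (Koyfin data);
# Nvidia's 25.9 is an assumed stand-in for "just under 26 times".
forward_pe = {
    "Nvidia": 25.9,
    "Apple": 31.2,
    "Microsoft": 29.0,
    "Amazon": 33.0,
    "Meta Platforms": 26.1,
    "Tesla": 99.0,
    "Alphabet": 22.0,
}

cheaper_than_nvidia = [name for name, pe in forward_pe.items()
                       if name != "Nvidia" and pe < forward_pe["Nvidia"]]
print(cheaper_than_nvidia)  # only Alphabet trades on a lower multiple
```

On these numbers, Alphabet is indeed the only Magnificent Seven name on a lower forward multiple than Nvidia.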
Assif Shameen is a technology and business writer based in North America