Morningstar analyst Brian Colello has kept his "three-star" rating on Nvidia with an unchanged fair value estimate of US$130 ($173.16) after attending the tech company's GPU Technology Conference (GTC) earlier this week. GPU stands for graphics processing unit.
“[We] remain impressed with the company’s march toward artificial intelligence supremacy in hardware, software, and networking, all with physical artificial intelligence (AI) via robotics and autonomous driving on the horizon,” Colello writes in his March 19 report.
He adds that the company’s three-year AI GPU roadmap is also impressive. Nvidia is slated to release the Blackwell Ultra (GB300 series) later this year, Vera Rubin in the second half of 2026, and Rubin Ultra in the second half of 2027.
“Rubin Ultra is expected to have 576 GPU die within a single NVLink data centre rack and should emerge as a workhorse with significant AI inference processing advantages versus prior generations,” he continues.
At the conference, the analyst notes, there was “little pessimism” around AI data centres, with the Nvidia team pointing to the “massive” AI factories set to be built by governments as well as tech and consumer internet leaders.
Nvidia’s CEO Jensen Huang also stated that the “vast majority” of AI inference runs on Nvidia today. As such, despite rising competition from custom ASICs designed by hyperscalers, Colello believes there will still be “tremendous demand” for Nvidia’s solutions in the years ahead. ASICs, or application-specific integrated circuits, are chips custom-built to handle specific tasks, such as particular AI workloads.
With Nvidia’s medium-term revenue relying on an ongoing increase in AI capital expenditure among tech leaders, the analyst believes the company’s “impressive roadmap” should keep customers coming back to spend on AI and replace their legacy GPUs.
Another takeaway is that Nvidia sees robotics and physical AI as an opportunity for data centres.
In his report, Colello notes that Nvidia “needs to build the AI to build the robots”. The company also sees the demand for robotics supporting ongoing data centre spending.
At the conference, Nvidia unveiled its Isaac GROOT N1 robotics foundation model, although the decision to open-source the model came as a surprise to Colello.
“However, we think Nvidia is trying to seed the robotics ecosystem and instead profit on the back end via cloud data and workloads,” says the analyst.
Other points of note include Huang’s suggestion that Dell’Oro’s estimate that AI infrastructure spending will surpass US$1 trillion by 2028 might be “conservative”, since it does not include robotics or autonomous driving.
“Also, given Nvidia’s share gains within this bucket of spending in recent years, it’s quite possible that Nvidia’s portion of data centre spending may rise, in addition to the ongoing rising total addressable market (TAM),” says Colello.
“One illuminating slide disclosed that, at its peak (presumably within calendar 2024), Nvidia shipped 1.3 million Hopper (H100/H200) products to the four largest cloud service providers in the US (i.e., Microsoft Azure, Google Cloud, Amazon AWS, Oracle Cloud),” he adds. “Yet for the next 12 months, these customers have already ordered 1.8 million Blackwell products (3.6 million GPU die in total, since two die are now packaged within one Blackwell product).”
“This disclosure bodes well for ongoing GPU spending in calendar 2025 (i.e., most of Nvidia’s fiscal 2026), but we believe we’ve captured such spending in our estimates to date,” he continues.
With all this in mind, including anecdotes from Nvidia’s customers that are just beginning their AI journeys, Colello believes that Nvidia “sits at the heart of the AI ecosystem and has good visibility into the trends where AI is headed”.
“Combined with the two-year-plus lead times to build out AI data centres, we think Nvidia has a particularly keen sense of where AI is headed,” he says. “In turn, we think this wide-moat company with competitive advantages in hardware, software, and networking should remain the lynchpin of the AI ecosystem.”
As at the date of Colello’s report, Nvidia’s shares, which last traded at US$115.43 on March 18, are deemed “fairly valued”.
On March 20, Huang told the Financial Times that the company intends to invest hundreds of billions of dollars in US-made chips and electronics over the next four years. According to the article, Nvidia estimates it will spend some US$500 billion on electronics over the same period.
Shares in Nvidia closed US$2.09, or 1.81%, higher at US$117.82 on March 19, and were up further at US$118.79 in after-hours trading.