Shares of Alphabet Inc, Google’s owner, gained as much as 2.7% in late trading on Monday, while Nvidia slumped 2.7% at one point.
An agreement would help establish TPUs as an alternative to Nvidia’s chips, the gold standard for big tech firms and start-ups from Meta to OpenAI that need computing power to develop and run AI platforms. Google previously sealed a deal to supply up to one million of the chips to Anthropic PBC. Still, Nvidia remains dominant in the market.
After the Anthropic deal was announced, Seaport analyst Jay Goldberg called it a “really powerful validation” for TPUs. “A lot of people were already thinking about it, and a lot more people are probably thinking about it now,” he said.
Representatives of Meta declined to comment, while Google didn’t immediately respond to a request for comment.
Alphabet-related stocks surged in early trading in Asia on Tuesday. In South Korea, IsuPetasys Co, which supplies multilayered boards to Alphabet, jumped 18% to an intraday record. In Taiwan, MediaTek Inc shares rose almost 5%.
A deal with Meta — one of the biggest spenders globally on data centres and AI development — would mark a win for Google. But much depends on whether the tensor chips can demonstrate the power efficiency and computing muscle necessary to become a viable option in the long run.
The tensor chip, first developed more than 10 years ago specifically for AI tasks, is gaining momentum outside its home company as a way to train and run complex AI models. Its allure as an alternative has grown at a time when companies around the world worry about an overreliance on Nvidia, in a market where even Advanced Micro Devices Inc is a distant runner-up.
Graphics processing units, or GPUs, the part of the chip market dominated by Nvidia, were created to speed the rendering of graphics, mainly in video games and other visual-effects applications, but turned out to be well suited to training AI models because they can handle large amounts of data and computation. TPUs, on the other hand, belong to a category of specialised products known as application-specific integrated circuits, or ASICs: microchips designed for a single, dedicated purpose.
The tensor chips were also adapted as accelerators for AI and machine learning tasks in Google’s own applications. Because Google and its DeepMind unit develop cutting-edge AI models like Gemini, the company has been able to take lessons from those teams back to the chip designers. At the same time, the ability to customise the chips has benefited the AI teams.
