Google Cloud’s Tensor Processing Units (TPUs) have established a significant lead over competitors in the AI hardware sector, according to a new report by Omdia. The research report, ‘Checking in with hyperscalers’ AI chips: Spring 2024,’ indicates that Google is set to ship around $6 billion worth of TPUs to its data centers in 2024. These TPUs support Google’s internal projects, such as Gemini, Gemma, and Search, as well as customer workloads on the Google Cloud Platform.

Although all of the major cloud hyperscalers – Google Cloud, Microsoft Azure, Amazon Web Services, and Meta Platforms – are developing custom AI accelerators, Google Cloud’s TPUs appear to outperform the others.

Profitability of the Google Cloud Platform

Omdia’s Principal Analyst for Advanced Computing, Alexander Harrowell, suggests that the success of Google’s TPUs may be contributing to the profitability of the Google Cloud Platform. He also notes the expanding ecosystem of semi-custom chip providers, such as Broadcom, Marvell, Alchip, and Arm plc’s Neoverse CSS service, which is bolstering the industry trend toward custom silicon.

However, the report also points to an intriguing development: the emergence of ‘Customer C,’ a U.S.-based cloud computing company with a new AI chip expected to debut in 2026. Harrowell notes that the lengthy lead time for this chip project suggests it represents a novel innovation.

Omdia, a part of Informa Tech, provides technology research and advisory services. Its insights help organizations make informed decisions to drive growth in the tech market.