Google has announced new Tensor processors for its data centers, claiming significant reductions in energy usage.
The Chosun Ilbo
Google's specialized 8th-gen TPUs challenge NVIDIA
Google has unveiled its new in-house artificial intelligence (AI) chip, the 8th-generation Tensor Processing Unit (TPU). For the first time, the company has split the chip into two dedicated versions, one for AI training and one for inference, and released both simultaneously.
Most of the companies that have fully committed to building AI models are buying up every Nvidia AI accelerator they can get, but Google has taken a different approach: most of its cloud AI infrastructure runs on its own line of custom Tensor Processing Units (TPUs).
Google is packing ample static random-access memory (SRAM) into a dedicated chip for running artificial intelligence models, following Nvidia's lead.
At Google Cloud Next '26, the company unveiled two AI chips, one tailored for training and the other for inference.