Google launches AI chip to deliver faster LLM training

Google has introduced Cloud TPU v5e, which it calls its most cost-efficient and versatile TPU to date. The new tensor processing unit is aimed at the growing demand for computing infrastructure that can handle workloads such as generative AI and large language models (LLMs). Google states that Cloud TPU v5e delivers up to 2x higher training performance per dollar and up to 2.5x higher inference performance per dollar than its predecessor.

from Gadgets News – Latest Technology News, Mobile News & Updates https://ift.tt/PTf4Bnc
