Amazon is reportedly developing custom artificial intelligence (AI) chips to reduce its dependence on NVIDIA and lower costs, according to the Financial Times. This move is part of a broader trend in big tech to lessen reliance on NVIDIA's dominant and expensive GPUs.
Amazon's chip-design subsidiary, Annapurna Labs, acquired in 2015, is leading the effort. Annapurna already designs Amazon's Graviton processors for data centers, and its Trainium chips are designed for training large language models. Trainium2, unveiled in 2023, is reportedly used by Anthropic, Amazon's AI partner and the provider of the Claude AI model.
Reducing Costs and Dependence
Developing in-house AI chips allows Amazon to reduce both its reliance on NVIDIA and its expenses. NVIDIA's GPUs, while market-leading, are in high demand and command premium prices. Amazon's strategy mirrors that of other tech giants such as Google (with its Tensor Processing Units) and Meta (with its Meta Training and Inference Accelerator), which are also developing custom AI hardware.
Amazon is expected to reveal more details about its custom AI chips next month at an event focused on its Trainium lineup. While Trainium2 was launched last year, supply constraints have limited its availability.
The chips are designed using technology from Alchip and manufactured by TSMC, the same foundry that produces chips for many leading tech companies. Amazon has stated that over 50,000 AWS customers already use its Graviton chips, showcasing the potential of its in-house silicon.