Amazon is intensifying its efforts to develop custom artificial intelligence (AI) chips that offer more cost-effective and efficient alternatives to Nvidia's hardware. The initiative is intended to reduce the company's reliance on a single supplier and to meet growing demand for affordable AI processing.

In his 2025 annual shareholder letter, Amazon CEO Andy Jassy highlighted the high cost of AI chips, attributing much of it to the industry's dependence on a single provider, implicitly Nvidia. He emphasized Amazon's commitment to making AI more accessible and affordable through investments in custom chips such as Trainium and Inferentia. These chips are designed to deliver better price-performance, in some cases reducing costs by 40-50% compared with Nvidia-based alternatives.

Amazon's in-house chip development is led by Annapurna Labs, which the company acquired in 2015. Amazon has been building its Graviton processors for nearly a decade, and they are now in their fourth generation. The newer AI-focused chips, Trainium and Inferentia, are part of the same strategy of giving customers more affordable options for AI processing.

Demand for AI processing power remains robust, as evidenced by Taiwan Semiconductor Manufacturing Company's (TSMC) 46.5% year-over-year sales increase in March. Companies such as Amazon and Alphabet are reaffirming their financial commitments to AI, with Amazon investing aggressively in AI capabilities, data centers, and, through Project Kuiper, satellite connectivity.

By developing its own AI chips, Amazon aims to lower costs and improve performance for Amazon Web Services (AWS) customers. The move not only reduces dependence on Nvidia but also positions the company competitively in a rapidly evolving AI landscape. As AI continues to transform industries, Amazon's focus on cost-effective solutions is poised to drive broader adoption and innovation.