NVIDIA still leads the AI chip race, but its recent global AI infrastructure push may reflect concerns that hardware growth will slow—and that CSPs ramping up ASIC development could become serious challengers. According to Commercial Times, as Google, Microsoft, and Meta accelerate in-house ASIC development, NVIDIA’s grip on the AI accelerator market could reach a turning point by 2027.
Notably, Taiwanese ASIC firms like Alchip and TSMC-affiliated GUC are riding the wave of growing demand, collaborating with AWS and Microsoft on custom chip designs, as highlighted by the report. Here’s a quick look at the latest ASIC moves by cloud giants and their key design partners.
Google reportedly leads with TPU v6 Trillium
Among U.S. CSP giants, Google leads with its TPU v6 Trillium, which offers improved energy efficiency and performance for large-scale AI models, according to TrendForce. Google has also expanded from a single-supplier model (Broadcom) to a dual-sourcing strategy by partnering with MediaTek, as per TrendForce.
Commercial Times notes that Google deployed a massive cluster of 100,000 TPU v6 chips to run its Gemini 2.0 model—and in terms of overall cost and performance, it reportedly matches or even beats NVIDIA’s GPU-based solutions.
AWS runs in-house AI chips at Anthropic
Meanwhile, AWS is making strides in custom chips too—CNBC reports that Anthropic’s Claude Opus 4 runs on AWS’s Trainium2, with Project Rainier using over 500,000 of them. These are workloads that once would’ve gone to NVIDIA, the report adds.
According to CNBC, AWS is set to launch Trainium3 later this year, promising twice the performance of Trainium2 and 50% greater energy efficiency.
Notably, at its 2025 AI Investor Day on June 17, AWS ASIC partner Marvell revealed that custom chips made up over 25% of its Q4 FY25 data center revenue—and are expected to surpass 50% in the future.
As per TrendForce, AWS continues to focus on Trainium v2, co-developed with Marvell, which is designed for generative AI and LLM training. The company is also working with Taiwan’s Alchip on Trainium v3, as noted by TrendForce.
Meta, Microsoft, and OpenAI’s roadmaps through 2026
The 2027 timeline cited by Commercial Times appears to roughly align with CSPs’ product roadmaps. According to supply chain sources cited by the report, Meta plans to launch its in-house MTIA T1 training chip in early 2026, built on TSMC’s 3nm process with HBM3e, and developed with support from Broadcom.
Meanwhile, Commercial Times reports that Microsoft’s Maia 200, like Meta’s chip, is expected to use TSMC’s 3nm process, and is reportedly co-developed with TSMC-affiliated GUC. The chip is rumored to hit the market around 2026.
Even ChatGPT maker OpenAI is betting on custom silicon: the company is developing its own AI training processor, expected to debut in Q4, according to Commercial Times. Reuters has reported that OpenAI is on track to begin mass production at TSMC by 2026.
Source: TrendForce, Taiwan.