Introduction
In a recent announcement, Foxconn revealed that its servers featuring Nvidia’s much-anticipated Blackwell GPUs will be available in limited quantities this year.
Foxconn, a key manufacturer of servers built around Nvidia’s high-end GPUs, has officially confirmed that the rollout of Blackwell-based systems in 2024 will be limited. This news comes amidst an unprecedented surge in demand for AI servers, which heavily rely on these powerful graphics processors.
The news underscores ongoing challenges in the semiconductor industry, particularly concerning the latest advancements in GPU technology.
While Foxconn’s initial rollout will be constrained, the company remains optimistic about meeting its sales targets, buoyed by strong demand for AI servers and its existing Nvidia GPU-based products.
Key Takeaways
Limited Initial Supply: Foxconn will ship servers featuring Nvidia Blackwell GPUs in limited quantities this year, with production expected to ramp up in early 2025.
Strong Market Demand: AI servers now account for 40% of the company’s overall server business, and strong demand for Nvidia’s H100 and H200 GPUs supports its revenue targets.
Technical Challenges: Nvidia faces technical issues with Blackwell GPUs, particularly related to the new CoWoS-L packaging technology, potentially causing delays in production.
Future Plans: Nvidia is developing the B200A GPU for lower- and mid-range AI systems, expected to launch in Q2 2025 with up to 144 GB of HBM3E memory and 4 TB/s of memory bandwidth.
Foxconn Announces Limited Initial Supply of Nvidia Blackwell GPUs
Foxconn confirmed it will release its first batch of servers with Nvidia’s Blackwell GPUs this year, though only in small quantities.
The company will start shipping these new AI servers in the last quarter of 2024. Production volumes are expected to increase in early 2025.
James Wu, Foxconn’s spokesperson, shared this information during an earnings call with analysts and investors, as reported by Nikkei.
High Demand for AI Servers Boosts Foxconn’s Confidence
Foxconn’s investment in the AI server market is substantial, with this sector now representing 40% of its overall server operations.
Foxconn holds a 40% share of the global AI server market and predicts this sector will soon be worth a trillion dollars.
This confidence comes from the strong demand for Nvidia’s H100 and H200 GPUs, which are performing well in the market.
Even with the limited availability of Blackwell GPUs, Foxconn believes it will hit its revenue goals for the second half of the year.
The company plans to leverage its existing Nvidia Hopper-powered servers to meet the high demand for AI infrastructure.
Technical Challenges Affect Nvidia Blackwell GPU Supply
The constraints on Blackwell GPU availability are attributed to several technical challenges. Nvidia’s Blackwell GPUs, including the B100 and B200 models, utilize TSMC’s innovative CoWoS-L packaging technology.
This packaging involves a redistribution layer (RDL) interposer with local silicon interconnect (LSI) bridges.
Accurate placement of these bridges is crucial for maintaining the 10 TB/s interconnect speed between compute dies.
However, Nvidia is reportedly facing design issues. There are concerns about mismatches in the coefficient of thermal expansion (CTE) among the GPU chiplets, LSI bridges, RDL interposer, and motherboard substrate.
These issues may lead to warping and failure of the system-in-package. Analysts suggest that Nvidia may need to redesign the top metal layers and bumps of the Blackwell GPU silicon, potentially causing further delays.
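To see why a CTE mismatch is such a serious concern, consider a rough back-of-the-envelope estimate of how much the materials expand relative to one another. The material values, package span, and temperature swing below are generic illustrative assumptions, not Nvidia or TSMC figures:

```python
# Rough, illustrative estimate of thermal-expansion mismatch in a large GPU package.
# All values are generic textbook-style assumptions, not Nvidia/TSMC data.

alpha_silicon   = 2.6e-6   # CTE of silicon, 1/°C (typical literature value)
alpha_substrate = 17.0e-6  # CTE of an organic package substrate, 1/°C (assumed)
package_span_mm = 70.0     # assumed lateral span of the package, mm
delta_t_c       = 200.0    # assumed temperature swing during assembly/reflow, °C

# Linear thermal expansion: dL = alpha * L * dT
expansion_si_um  = alpha_silicon * package_span_mm * delta_t_c * 1000
expansion_sub_um = alpha_substrate * package_span_mm * delta_t_c * 1000
mismatch_um      = expansion_sub_um - expansion_si_um

print(f"Silicon expands by   {expansion_si_um:.0f} um")
print(f"Substrate expands by {expansion_sub_um:.0f} um")
print(f"Mismatch:            {mismatch_um:.0f} um")  # roughly 200 um
```

Even under these rough assumptions, the mismatch comes to a couple of hundred micrometres across the package, which is why the interposer, bridges, and bump structures must be designed to absorb significant mechanical stress.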
Expected Production and Availability Timeline
On the manufacturing side, Nvidia will produce only a limited number of 1,000 W B200 GPUs in Q4 2024, and these will power HGX servers. GB200-based NVL36 and NVL72 servers will also see constrained availability, with Foxconn still negotiating the release schedule for these systems.
To address demand for more cost-effective AI systems, Nvidia is developing the B200A GPU. This model will be based on monolithic B102 silicon and will use the more established CoWoS-S packaging. Scheduled for release in Q2 2025, the B200A is expected to offer up to 144 GB of HBM3E memory and 4 TB/s of memory bandwidth.
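As a quick sanity check, the reported capacity and bandwidth figures are consistent with a small number of HBM3E stacks. The stack count and per-pin data rate in the sketch below are illustrative assumptions rather than confirmed specifications:

```python
# Back-of-the-envelope check of the reported B200A memory figures.
# Stack count and per-pin data rate are assumptions, not confirmed specs.

stacks         = 4      # assumed number of HBM3E stacks
gb_per_stack   = 36     # assumed 12-high HBM3E stack capacity, GB
bus_width_bits = 1024   # HBM interface width per stack, bits
pin_speed_gbps = 8.0    # assumed data rate per pin, Gbit/s

total_capacity_gb = stacks * gb_per_stack                  # 4 * 36 = 144 GB
per_stack_gbs     = bus_width_bits * pin_speed_gbps / 8    # bits -> bytes, GB/s
total_bw_tbs      = stacks * per_stack_gbs / 1000          # ~4 TB/s

print(f"Capacity:  {total_capacity_gb} GB")
print(f"Bandwidth: {total_bw_tbs:.1f} TB/s")
```

With those assumptions, the totals land at 144 GB and roughly 4 TB/s, in line with the figures reported for the B200A.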
Conclusion
Foxconn’s announcement highlights the ongoing challenges and complexities in the semiconductor industry, especially with cutting-edge GPU technologies.
Nvidia’s Blackwell GPUs will have limited initial availability, but Foxconn remains focused on meeting AI server demand, and Nvidia is developing alternative solutions such as the B200A GPU. These steps highlight the industry’s adaptability and resilience.
For further updates on Nvidia’s Blackwell GPUs and Foxconn’s advancements in AI server technology, stay tuned to our tech coverage.