Foxconn, a leading electronics manufacturer and primary server supplier for Nvidia, reported a 29.7% year-over-year increase in first-quarter revenue, attributing the growth to robust demand for artificial intelligence hardware. The company's March revenue alone soared 45.6%, setting a new monthly record. Foxconn anticipates this momentum for AI server-rack systems will continue into the second quarter, a tangible signal of sustained infrastructure investment in the AI sector.
Market Seeks Validation Amid Volatility
This update arrives as investors actively seek confirmation that the massive capital expenditure directed toward AI infrastructure is translating into concrete orders, especially following recent market fluctuations and growing scrutiny over potential returns. By the end of March, Nvidia's forward price-to-earnings ratio had declined to its lowest level since 2019. Despite this, analysts maintain expectations for the chipmaker's earnings to grow more than 70% in the current fiscal year. Foxconn concurrently highlighted the "volatile" political and economic landscape as a factor in its operations.
Nvidia's Record Performance and Guidance
Nvidia recently reported record fourth-quarter revenue of $68.1 billion, with data-center sales contributing $62.3 billion. For the full fiscal year, the company's sales reached $215.9 billion. Looking ahead, Nvidia provided guidance for first-quarter revenue of $78 billion, explicitly noting that this figure excludes any sales from data-center compute products in China—a reminder of the ongoing market restrictions there. CEO Jensen Huang remarked, "Enterprise adoption of [AI] agents is skyrocketing," underscoring the broadening application of the technology.
Positive signals extend across Nvidia's supply chain. Samsung is poised for a record profit in the first quarter, driven by soaring memory prices fueled by AI demand. Analysts also note strength in its contract chipmaking unit, bolstered by a new partnership with Nvidia to produce AI inference chips. During the GTC conference in March, Huang revealed that Samsung has already commenced manufacturing Nvidia's Groq LP30 inference processor using its 4-nanometer process, with shipments expected in the second half of the year. Regarding the memory market, Daol Investment & Securities analyst Ko Yeongmin stated succinctly, "You couldn't ask for things to be better."
The Battle for AI Inference Heats Up
AI inference—the phase where a trained model generates responses to user prompts—has become a critical new front in the semiconductor competition. This shift partly explains Nvidia's recent $2 billion strategic investment in Marvell Technology, aimed at simplifying the integration of custom client chips into data centers built around Nvidia's architecture. eMarketer analyst Jacob Bourne suggested this move should help solidify Nvidia's position as a "key access point" for a wider array of AI workloads.
Competitive pressure is mounting from multiple directions. Broadcom expects its AI-chip revenue to exceed $100 billion next year, propelled by surging demand for custom silicon. Meta Platforms, while continuing to place substantial orders with Nvidia and Advanced Micro Devices, has also outlined plans to develop its own training and inference chips.
China Emerges as a Key Competitive Stress Point
The Chinese market represents a significant area of tension. According to a recent report, DeepSeek's upcoming V4 model is slated to use chips from Huawei. Data indicates that Chinese suppliers captured 41% of China's AI accelerator server market in 2025, reducing Nvidia's share to 55%. Among domestic players, Huawei has pulled ahead of rivals such as Alibaba's T-Head, Baidu's Kunlunxin, and Cambricon.
The competitive landscape could shift further. A bipartisan group in the U.S. Congress is advocating for stricter export controls on chipmaking equipment destined for China, a move that would deepen the fissures in the global semiconductor supply chain. Summit Insights analyst Kinngai Chan observes that Nvidia is "definitely going to see more competition" as inference workloads increasingly migrate to application-specific integrated circuits (ASICs), which are designed for efficient, large-scale execution of singular tasks.
For now, Foxconn's strong financial results offer a clearer gauge of near-term demand than the recent nervousness surrounding Nvidia's stock. However, the more complex, longer-term challenges—including questions about investment returns, the rise of formidable rivals, and the potential for custom chipmakers to capture market share—remain on the horizon for the industry.