What We Learned From Earnings So Far

by Brad Gastwirth, Global Head of Research and Market Intelligence

Overview: The Q3 2025 earnings season is just beginning, but early disclosures from TSMC and Intel already signal that the AI hardware cycle remains structurally tight. Lead times across advanced nodes, packaging, and high-bandwidth memory remain extended, while pre-earnings channel checks from memory and component suppliers suggest continued firmness heading into year-end.

A wave of earnings reports rolls out over the next two weeks, from Samsung and SK Hynix to Microsoft, Amazon, Meta, and Alphabet (among others).

Foundry and Packaging: Tightness Persists at the Leading Edge

  • TSMC (reported Oct 16) reinforced that AI-related demand now represents roughly one-quarter of wafer starts, up meaningfully from the first half of the year. Despite incremental capacity additions, the company signaled that CoWoS (Chip-on-Wafer-on-Substrate) packaging remains fully booked through at least mid-2026.
  • This aligns with supply-chain checks indicating that advanced interposer and substrate availability is still a gating factor for the next wave of GPU and accelerator shipments.
  • Intel (reported Oct 23) offered a more mixed tone: AI server volumes are improving, but enterprise compute remains uneven. Intel noted incremental traction in its Gaudi 3 accelerator line yet acknowledged that meaningful AI revenue contribution is still a 2026 story.
  • Together, these early reports suggest that the structural bottlenecks remain unchanged: wafer starts, packaging, and substrate capacity continue to define supply ceilings for the broader AI ecosystem.

Memory: Market Still Firm Ahead of Samsung and Hynix Results

  • While Samsung and SK Hynix have yet to release official Q3 results (scheduled Oct 29–30), pricing data and distributor feedback indicate that the memory upcycle is intact.
  • HBM3E and DDR5 demand continues to drive bit growth, with AI servers absorbing the majority of incremental output.
  • NAND pricing for Q4 appears to be up low double digits sequentially, supported by lean channel inventories and limited wafer-start growth (estimated ~6%).
  • Supply remains disciplined, with both Korean majors signaling no near-term greenfield expansions.
  • If those signals hold through earnings, the memory market looks positioned for sustained strength into 2026 under base-case AI server growth scenarios of 25–30%.

Hyperscalers: Watching for Signs of Pause or Persistence

  • The large cloud operators (Microsoft, Amazon, Meta, and Alphabet) will report next week, and their capital-spending commentary will set the tone for Q4 and early-2026 component demand.
  • So far, supplier feedback implies that AI infrastructure projects remain on schedule, though deployment sequencing may vary by region. Some North American integrators note minor delays in GPU rack completions tied to interposer and power-component shortages.
  • In short, hyperscaler appetite still appears strong, but the rate of incremental expansion, rather than the absolute level, will determine whether component lead times tighten further or plateau near current levels.

Broader Component Landscape

Outside semiconductors, MLCC, power ICs, and server connectors remain on allocation for AI-grade builds, while commodity compute components continue to track flat. HDD lead times hover around 30–35 weeks for 32 TB units, with pricing stable to firm as enterprise restocking begins slowly.

Key Takeaways

  1. Advanced Packaging Still the Bottleneck: CoWoS and interposer capacity remain fully utilized into 2026.
  2. Memory Market Strength Likely to Extend: Early indicators suggest firmer pricing and tight inventories ahead of official reports.
  3. Non-AI Segments Lag: Consumer and legacy compute markets remain subdued, widening the bifurcation in the supply chain.

Outlook: Early earnings and supply-chain checks point to a disciplined yet capacity-constrained environment. Unless hyperscalers materially slow deployments, pricing across HBM, NAND, and advanced substrates is likely to remain firm through mid-2026.

The overarching message for Circular customers: secure allocation early, diversify sourcing, and plan for elevated lead-time risk across AI-linked components well into next year.