High Bandwidth Memory is the defining investment question for Micron. The company went from zero HBM revenue to over $5 billion per quarter in roughly two years, transforming from a cyclical commodity supplier into an AI infrastructure oligopolist. Only three companies in the world can manufacture HBM -- SK Hynix, Micron, and Samsung -- and all supply is sold out through calendar 2026. The central debate: does this oligopoly structure make HBM structurally different from the boom-bust memory cycles of the past, or are today's unprecedented margins simply the prelude to another capacity-fueled bust?
The memory cycle question: 75-81% margins have NEVER been sustained
Every previous memory supercycle has ended in margin compression, and gross margins of 75-81% are unprecedented in Micron's 45-year history. Samsung has announced aggressive HBM capacity expansion and is expected to recapture share above 30% by 2027. If HBM margins revert to 35-40%, the hit to per-share earnings would be severe.
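The scale of that reversion risk can be sized with back-of-envelope arithmetic, using the Q2 FY2026 figures discussed later in this note ($23.86B quarterly revenue at a 75% gross margin) against the 40% bear-case margin. This is a sketch of the gross-profit swing only; opex, taxes, and share count are deliberately omitted, so it is not an EPS estimate.

```python
# Illustrative gross-margin sensitivity using figures quoted in this note.
# Only sizes the gross-profit swing -- not earnings per share.
revenue_b = 23.86   # Q2 FY2026 quarterly revenue, $B
gm_current = 0.75   # reported gross margin
gm_bear = 0.40      # bear-case reversion scenario (top of the 35-40% range)

gp_current = revenue_b * gm_current  # gross profit at current margin, ~$17.9B
gp_bear = revenue_b * gm_bear        # gross profit in bear case, ~$9.5B

print(f"Gross profit at 75% GM: ${gp_current:.1f}B")
print(f"Gross profit at 40% GM: ${gp_bear:.1f}B")
print(f"Quarterly gross-profit hit: ${gp_current - gp_bear:.1f}B")  # ~$8.4B
```

Even at the optimistic end of the bear scenario, roughly $8B of quarterly gross profit disappears, which is why the cycle question dominates the valuation debate.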
The HBM market has exploded from a niche product into the fastest-growing segment in semiconductors. Total HBM TAM grew from roughly $4B in 2023 to $35B in 2025 and is projected to reach $100B by 2028 -- a ~40% CAGR that would make HBM alone larger than the entire DRAM market was in 2024. Demand is driven almost entirely by AI accelerators: NVIDIA's B200 requires 8 HBM3E stacks (192GB), the H200 carries 141GB across 6 stacks, and the B300 will require even more. AMD's MI300X uses 192GB of HBM3, and the MI400 will move to HBM3E/HBM4. The market is supply-constrained through at least CY2026, with all three manufacturers (SK Hynix, Micron, Samsung) confirming sold-out status. Bank of America estimates the 2026 HBM market at $54.6B (+58% YoY). Key demand drivers beyond 2026 include inference scaling (which consumes even more memory than training per dollar of compute), multi-modal AI models requiring larger context windows, and sovereign AI infrastructure buildouts in Europe, the Middle East, and Asia.
HBM TAM is projected to grow from ~$35B (CY2025) to $55B (CY2026) to $100B (CY2028), a ~40% CAGR. Micron pulled the $100B milestone forward by two years versus its prior estimate.
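A quick sanity check on the quoted growth rates, using only the TAM endpoints above:

```python
# Verify the ~40% CAGR implied by the quoted TAM path:
# ~$35B (CY2025) -> $55B (CY2026) -> $100B (CY2028), i.e. three years of growth.
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two endpoints."""
    return (end / start) ** (1 / years) - 1

print(f"2025->2028 implied CAGR: {cagr(35, 100, 3):.1%}")     # ~41.9%
print(f"2025->2026 step growth:  {100 * (55 / 35 - 1):.0f}%")  # ~57%
```

The implied CAGR is just under 42%, consistent with the "~40%" shorthand, and the 2025-to-2026 step of ~57% lines up with Bank of America's +58% estimate.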
The HBM market is a 3-player oligopoly with distinct competitive dynamics. SK Hynix is the dominant leader (57-62% share) with first-mover advantage in HBM3E and preferred supplier status with NVIDIA. Micron holds #2 position (21% share) after overtaking Samsung in Q2 2025, driven by superior HBM3E yields and faster qualification. Samsung (17-22% share) fell to #3 due to HBM3E yield issues and delayed NVIDIA qualification but is aggressively investing to recover share above 30% by 2027. The competitive dynamics shift with HBM4: Samsung's advanced packaging capabilities and massive capex budget ($38B+ in 2025) could enable it to leapfrog in the HBM4 generation. Counterpoint Research forecasts Samsung's position strengthening as HBM4 enters full-scale supply in 2026. The key risk for Micron is being squeezed between SK Hynix's technology leadership and Samsung's scale advantages in the HBM4 era.
Q2 2025 HBM market share: SK Hynix 62%, Micron 21%, Samsung 17%. Micron overtook Samsung as #2 HBM supplier for the first time
The HBM technology roadmap is accelerating, with each generation raising both performance and manufacturing complexity. HBM3E (the current generation, shipping since 2024) stacks 8 or 12 DRAM dies connected by through-silicon vias (TSVs) and delivers 1.2 TB/s of bandwidth per stack. HBM4 (ramping CY2026) moves to 16-high stacking, replaces micro-bumps with hybrid bonding, and delivers a 60%+ bandwidth improvement over HBM3E. HBM4E (expected CY2027-2028) will push to 24+ high stacking. Each generation sharply raises manufacturing complexity -- thermal management of 16+ stacked dies, sub-micron alignment precision for hybrid bonding, and yield challenges in advanced packaging. This escalating complexity is the bull case for margin sustainability: the barriers to entry keep rising, preventing the typical memory oversupply cycle. Micron's competitive position rests on its 1-beta DRAM node (the industry's most advanced), which provides the thinnest dies for high-stack HBM, and on its partnerships with OSATs (outsourced semiconductor assembly and test providers) for packaging.
HBM4 will ramp in volume in CY2026, delivering a 60%+ bandwidth increase over HBM3E. It uses 16-high stacking versus HBM3E's 12-high, with hybrid bonding replacing micro-bumps.
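The bandwidth claims above imply the following back-of-envelope numbers. The 8-stack accelerator configuration is illustrative, mirroring the B200-class parts discussed earlier; actual HBM4 products may exceed the 60% floor quoted here.

```python
# Per-stack and per-accelerator bandwidth implied by the roadmap above.
hbm3e_per_stack = 1.2                   # TB/s per HBM3E stack (quoted)
hbm4_per_stack = hbm3e_per_stack * 1.6  # 60% uplift floor -> ~1.92 TB/s

stacks = 8  # illustrative B200-class configuration
print(f"HBM4 per stack: {hbm4_per_stack:.2f} TB/s")
print(f"8-stack HBM3E aggregate: {hbm3e_per_stack * stacks:.1f} TB/s")  # ~9.6
print(f"8-stack HBM4 aggregate:  {hbm4_per_stack * stacks:.1f} TB/s")   # ~15.4
```

At the 60% floor, an 8-stack HBM4 accelerator would move past 15 TB/s of aggregate memory bandwidth, versus under 10 TB/s on HBM3E.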
This is the single most important question for Micron's valuation. The bull case argues HBM is structurally different from commodity DRAM: only three suppliers can make it, advanced packaging is the bottleneck (not the fab capacity that caused historical oversupply), AI demand is multi-year and deepening, and all three suppliers are showing unusual supply discipline. The bear case points to memory's unbroken 45-year track record of cyclicality: every supercycle has ended in oversupply, 75-81% gross margins are unprecedented and will attract aggressive capacity expansion, Samsung is already investing heavily to regain share, and AI efficiency improvements could reduce memory intensity. The stock market appears to lean bear -- Micron fell 4-7% on record Q2 FY2026 results ($23.86B revenue, $12.20 EPS, a ~40% beat), suggesting the market is pricing peak-cycle risk. Historical precedent: Micron's gross margin peaked at 62% in 2018 before collapsing in 2019. Current 75-81% margins are 13-19 points above ANY prior peak. TrendForce reports that Samsung and SK Hynix are being cautious on expansion, potentially extending the rally past 2028 -- but this supply discipline has never survived a full memory cycle.
Micron stock fell 4-7% after reporting record Q2 FY2026 results: $23.86B revenue (+196% YoY), EPS of $12.20 (a ~40% beat over the $8.73 consensus), and a 75% gross margin. The market is pricing peak-cycle risk.
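The headline figures can be cross-checked with simple arithmetic. Note that the implied year-ago revenue below is a derived number, not a reported one:

```python
# Cross-check the headline Q2 FY2026 figures quoted above.
eps_actual, eps_consensus = 12.20, 8.73
revenue_b, yoy_growth = 23.86, 1.96  # +196% YoY as reported

beat = eps_actual / eps_consensus - 1          # fractional EPS beat
prior_year_rev = revenue_b / (1 + yoy_growth)  # implied year-ago quarter

print(f"EPS beat: {beat:.1%}")                            # ~39.7%, the "~40%" quoted
print(f"Implied year-ago revenue: ${prior_year_rev:.1f}B")  # ~$8.1B
```

The beat works out to 39.7%, matching the rounded "40%" in the text, and the +196% growth implies a year-ago quarter of roughly $8B -- a reminder of how abruptly the HBM ramp transformed the revenue base.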