Micron Technology’s stock has risen nearly 9x over the past year, lifting its market capitalization above $800 billion, one of the largest single-year gains in the company’s roughly forty years as a public company. The rally has been fueled by surging demand for, and a shortage of, high-bandwidth memory (HBM), the specialized chips that pair with AI accelerators in data centers. Talk of a potential labor strike at Samsung has added further volatility. Micron has already pre-sold its entire HBM output through 2026 under binding contracts. Hyperscalers such as Microsoft (MSFT), Alphabet (GOOG), and Meta (META) are expected to invest roughly $700 billion in AI infrastructure this year, with memory a vital component at every stage of that buildout. The stock may also look inexpensive at just 14x projected FY ’26 earnings and 8x the following year’s earnings (see MU valuation metrics). Memory markets, however, are highly cyclical, so how has the stock historically behaved during downturns?
How Memory Cycles Typically Collapse
The memory sector has experienced several significant price crashes over the last fifteen years, all stemming from the same structural flaw. Building a new DRAM fabrication facility takes two to three years and tens of billions of dollars. Once built, the economics favor running it at full capacity regardless of prevailing prices, because the costs are largely fixed and the marginal cost of an additional wafer is low. Demand spikes, prices soar, manufacturers commit to additional capacity, and by the time that capacity comes online, the market it was built for often no longer exists.
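The full-capacity logic can be illustrated with a stylized model; every number below is hypothetical, chosen only to show the fixed-versus-marginal-cost mechanics, not to reflect any actual fab's economics:

```python
# Stylized DRAM fab economics with hypothetical numbers: fixed costs are
# sunk each quarter, so production stays worthwhile whenever price
# exceeds marginal cost -- even if the fab loses money overall.

FIXED_COST = 10_000     # $ per quarter, paid regardless of output (hypothetical)
MARGINAL_COST = 2.0     # $ per wafer (hypothetical)
CAPACITY = 5_000        # wafers per quarter (hypothetical)

def quarterly_profit(price: float, wafers: int) -> float:
    """Profit for the quarter at a given wafer price and output level."""
    return price * wafers - MARGINAL_COST * wafers - FIXED_COST

# A price below average cost ($4/wafer at capacity) but above marginal cost:
low_price = 3.0
print(quarterly_profit(low_price, CAPACITY))  # -5000.0: running loses $5,000...
print(quarterly_profit(low_price, 0))         # -10000.0: ...but idling loses $10,000
```

Because running flat-out always beats idling once the price clears marginal cost, every producer keeps shipping into a falling market, which is precisely the dynamic behind the crashes that follow.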
The 2022-2023 Memory Crash: The 2022-2023 downturn was among the most severe in dollar terms. Demand evaporated after the pandemic, and inventories across the supply chain hit 31 weeks by early 2023. Micron reported a GAAP net loss of $2.31 billion in a single quarter, its largest ever; cut its workforce by 10%; and slashed capital expenditure. The stock fell approximately 50% from its early-2022 peak, when it had traded at about 11x forward earnings.
The 2018-2019 Inventory Adjustment: Cloud operators over-bought memory in 2017, then cut orders through 2018 as inventories built up. NAND prices fell roughly 60% and DRAM prices roughly 40%. Micron peaked near $64 in May 2018 and fell to about $28 by year-end, a decline of roughly 57%. Notably, at that May 2018 peak the stock traded at only about 4.5x forward fiscal-year earnings estimates; a cheap multiple offered no protection once the earnings behind it collapsed.
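The arithmetic behind those 2018 figures is straightforward; note that the EPS figure below is back-calculated from the stated price and multiple, not an independently sourced estimate:

```python
# Peak-to-trough math for the 2018-2019 correction, using the ~$64 peak,
# ~$28 trough, and ~4.5x forward P/E cited above.

peak_price = 64.0
trough_price = 28.0
forward_pe = 4.5

drawdown = (peak_price - trough_price) / peak_price
implied_eps = peak_price / forward_pe  # EPS implied by the multiple, not a quoted figure

print(f"Peak-to-trough decline: {drawdown:.0%}")               # 56%
print(f"Implied forward EPS at the peak: ${implied_eps:.2f}")  # $14.22
```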
The 2014-2016 DRAM Decline: In the 2014-2016 crash, DRAM capacity had been expanded in anticipation of PC demand that never materialized as consumers shifted to mobile devices. Prices fell steadily throughout 2015, and Micron’s stock plunged roughly 70%, from around $37 in late summer 2014 to below $10 by February 2016.
The recurring severity of these historical price collapses highlights why a rule-based investing framework is essential for managing risk in a sector defined by such violent supply-demand swings.
What Sets This AI Cycle Apart?
Three structural elements differentiate the present landscape from previous upturns.
First, demand intensity. Memory demand per AI system is no longer growing linearly with deployments; it is compounding. Nvidia’s (NVDA) H100 carried 80GB of HBM, while its Rubin Ultra successor targets 512GB per GPU module. Growing model sizes amplify this further. HBM was initially used mostly for training large language models (LLMs), but as of 2026 the center of gravity has shifted toward inference, or running models for end users. Applications such as real-time video generation and sophisticated AI agents require the ultra-low latency that only HBM can provide.
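A quick sketch of why per-unit capacity growth compounds with deployment growth; the GPU capacity figures are the ones cited above, while the shipment-growth factor is purely hypothetical:

```python
# Total HBM demand scales with BOTH accelerator shipments and memory per
# accelerator. H100 (80 GB) and Rubin Ultra (512 GB) figures are from
# the article text; the shipment-growth multiplier is hypothetical.

h100_hbm_gb = 80
rubin_ultra_hbm_gb = 512

capacity_multiple = rubin_ultra_hbm_gb / h100_hbm_gb
print(f"HBM per accelerator: {capacity_multiple:.1f}x")  # 6.4x

shipments_growth = 2.0  # hypothetical: accelerator shipments merely double
total_demand_growth = shipments_growth * capacity_multiple
print(f"Implied total HBM demand growth: {total_demand_growth:.1f}x")  # 12.8x
```

Even modest unit growth multiplied by a 6.4x capacity step produces the kind of geometric demand curve described above.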
Second, contract structure. HBM is increasingly sold through long-term commitments with hyperscalers rather than on the spot market, which reduces the sudden order cancellations that have historically triggered sharp price declines. In March, Micron signed the industry’s first five-year HBM supply agreement, covering both volume and pricing, signaling a shift toward more visible, contracted revenue.
Third, supply limitations. HBM consumes significantly more wafer capacity per bit than standard DRAM, and its production complexity limits how quickly competitors can ramp output. This constrained supply has let incumbents like Micron gain share rapidly: Micron’s share of global HBM revenue rose from 9% in Q4 2024 to 21% in Q4 2025, while the overall HBM market roughly doubled over the same period.
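Those share figures imply an even larger jump in absolute revenue, which a normalized calculation makes explicit; the market size is indexed to 1.0, and the 2x factor reflects the article’s "approximately doubled," not an exact figure:

```python
# Implied growth in Micron's HBM revenue: 9% share of the market in
# Q4 2024 -> 21% share in Q4 2025, while the overall market roughly
# doubled. Market size is normalized; the 2x factor is approximate.

share_q4_2024, share_q4_2025 = 0.09, 0.21
market_q4_2024 = 1.0
market_q4_2025 = market_q4_2024 * 2.0  # market "approximately doubled"

revenue_growth = (share_q4_2025 * market_q4_2025) / (share_q4_2024 * market_q4_2024)
print(f"Implied HBM revenue growth: {revenue_growth:.1f}x")  # ~4.7x
```

Gaining share in a doubling market multiplies, rather than adds, so a 12-point share gain translated into nearly 5x revenue growth.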
The Risk of Oversupply Remains
None of these factors repeals the economics of semiconductor manufacturing. Micron has guided to fiscal 2026 capital expenditures above $25 billion, SK Hynix is expected to spend around KRW 40 trillion (roughly $27 billion), per S&P, and Samsung Electronics is also expanding aggressively. History suggests that when all three major memory producers add capacity simultaneously, oversupply tends to emerge within two to three years.
At the same time, the cycle still hinges on sustained AI investment by hyperscalers such as Alphabet (GOOG) and Amazon (AMZN). If they come under pressure to show returns on their massive capital expenditures, spending could slow, hitting memory demand directly.