Micron's AI Momentum: Outpacing Nvidia in the Memory Chip Market?

Artificial intelligence (AI) has transformed major industries, including healthcare, finance, retail, automotive, and manufacturing. Nvidia Corporation (NVDA) has been at the forefront of advancing AI through its graphics processing units (GPUs). These GPUs are crucial for training large language models (LLMs) such as OpenAI’s ChatGPT, leading to outstanding growth in the company’s revenue and earnings.

As a result, NVDA’s stock has surged nearly 148% over the past six months and is up more than 205% over the past year. Nvidia stock’s exceptional performance lifted its market capitalization above $3 trillion, making it the second-most valuable company in America.

However, another leading semiconductor company, Micron Technology, Inc. (MU), known for its innovative memory and storage solutions, is also experiencing remarkable growth due to rapid AI adoption.

Let’s explore how the ongoing AI boom powers Micron’s impressive growth and assess if it could outpace Nvidia in the memory chip market.

Micron’s Solid Third-Quarter Financials and Optimistic Outlook

MU posted revenue of $6.81 billion for the third quarter that ended May 30, 2024, surpassing analysts’ expectations of $6.67 billion. That compares with $5.82 billion for the previous quarter and $3.75 billion for the same period last year. Robust AI demand and strong execution enabled Micron to drive exceptional revenue growth, exceeding its guidance range for the third quarter.

Micron’s non-GAAP gross margin was $1.92 billion, compared to $1.16 billion in the prior quarter and negative $603 million in the third quarter of 2023. Its non-GAAP operating income came in at $941 million, versus $204 million in the previous quarter and negative $1.47 billion for the same period of 2023.

Furthermore, the company posted non-GAAP net income and earnings per share of $702 million and $0.62, compared to net loss and loss per share of $1.57 billion and $1.43 in the same quarter last year, respectively. Its EPS surpassed the consensus estimate of $0.53.

MU’s adjusted free cash flow was $425 million, compared to negative $29 million in the previous quarter and negative $1.36 billion for the same quarter of 2023. The company ended the quarter with cash, marketable investments, and restricted cash of $9.22 billion. 

“We are gaining share in high-margin products like High Bandwidth Memory (HBM), and our data center SSD revenue hit a record high, demonstrating the strength of our AI product portfolio across DRAM and NAND. We are excited about the expanding AI-driven opportunities ahead, and are well positioned to deliver a substantial revenue record in fiscal 2025,” said Sanjay Mehrotra, Micron Technology’s President and CEO.

For the fourth quarter of 2024, Micron expects revenue of $7.60 billion ± $200 million. The midpoint ($7.60 billion) of its revenue guidance range represents an approximately 90% rise from the same period last year. Its non-GAAP gross margin is anticipated to be 34.5% ± 1%. In addition, the company projects its non-GAAP earnings per share to be $1.08 ± $0.08, a turnaround from a loss of $1.07 per share in the previous year’s quarter.
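The guidance arithmetic above can be sanity-checked in a few lines of Python. This is a rough sketch using only the figures quoted in the text; the prior-year revenue is implied from the stated ~90% rise, not taken from Micron's filings.

```python
# Sanity-check Micron's Q4 FY2024 guidance arithmetic (figures from the text).
# Note: the prior-year base is inferred from the stated ~90% rise, not a filed number.

def guidance_band(midpoint, plus_minus):
    """Return (low, high) for a 'midpoint ± plus_minus' guidance range."""
    return midpoint - plus_minus, midpoint + plus_minus

rev_low, rev_high = guidance_band(7.60, 0.20)   # revenue, $B
eps_low, eps_high = guidance_band(1.08, 0.08)   # EPS, $/share
gm_low, gm_high = guidance_band(34.5, 1.0)      # gross margin, %

# A ~90% YoY rise at the $7.60B midpoint implies a prior-year base near $4.0B.
implied_prior_year = 7.60 / 1.90

print(f"Revenue band: ${rev_low:.2f}B to ${rev_high:.2f}B")
print(f"EPS band: ${eps_low:.2f} to ${eps_high:.2f}")
print(f"Implied prior-year revenue: ${implied_prior_year:.2f}B")
```

At the low end of the band ($7.40 billion), the implied YoY growth would still be roughly 85%, so the "approximately 90%" characterization holds across the range.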

Vital Role in the AI Ecosystem

MU’s success in the AI ecosystem is primarily driven by its high-bandwidth memory (HBM) chips, integral to high-performance computing (HPC), GPUs, AI, and other data-intensive applications. The chips provide fast and efficient memory access for processing large volumes of data quickly.

Micron sold $100 million of its HBM3E chips in the third quarter alone. Further, the company expects its HBM3E revenue to grow from “several hundred million dollars” in fiscal 2024 to “multiple billions” in fiscal 2025.

Earlier this year, the company started mass production of its HBM3E solution for use in Nvidia’s latest AI chip. Micron’s 24GB 8H HBM3E will be part of NVIDIA H200 Tensor Core GPUs.

Moreover, Micron’s dynamic random-access memory (DRAM) and NAND flash memory are critical components in AI applications. In June, MU sampled its next-gen GDDR7 graphics memory for AI, gaming, and HPC workloads. Leveraging Micron’s 1β (1-beta) DRAM technology and advanced architecture, the GDDR7 delivers 32 Gb/s high-performance memory in a power-optimized design.

On May 1, the company reached an industry milestone as the first to validate and ship 128GB DDR5 32Gb server DRAM, addressing the growing speed and capacity demands of memory-intensive generative AI applications. Powered by Micron’s 1β technology, the 128GB DDR5 RDIMM memory offers over 45% greater bit density, up to 22% improved energy efficiency, and up to 16% reduced latency over competitive 3DS through-silicon via (TSV) products.

AI-Driven Demand in Smartphones, PCs, and Data Centers

AI drives strong demand for memory chips across various sectors, including smartphones, personal computers (PCs), and data centers. In its latest earnings conference call, Micron’s management pointed out that AI-enabled PCs are expected to feature 40% to 80% more DRAM content than current PCs and larger storage capacities. Similarly, AI-enabled smartphones this year carry 50% to 100% more DRAM than last year’s flagship models.

These trends suggest a bright future for the global memory chips market. According to a report by The Business Research Company, the market is expected to reach $130.42 billion by 2028, growing at a CAGR of 6.9%.
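Given that CAGR, one can back out the market size the projection implies today. This is a rough sketch: the four-year compounding horizon (2024 to 2028) is an assumption, since the report's base year isn't stated here.

```python
# Back out the implied current memory-chip market size from the cited
# projection ($130.42B by 2028 at a 6.9% CAGR), assuming a 2024 base year.

def implied_base(future_value, cagr, years):
    """Present size consistent with future_value after `years` of growth at `cagr`."""
    return future_value / (1 + cagr) ** years

base = implied_base(130.42, 0.069, 4)
print(f"Implied 2024 market size: ~${base:.1f}B")  # roughly $100B
```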

Micron’s Competitive Edge Over Nvidia and Attractive Valuation

Despite NVDA’s expected revenue jump from $60.90 billion in fiscal 2024 to around $120 billion this fiscal year, MU is projected to outpace Nvidia’s growth in the following year. Micron’s revenue could increase by another 50% year-over-year in its next fiscal year, outperforming Nvidia’s forecasted growth of 33.7%.

In terms of non-GAAP P/E (FY2), MU is currently trading at 13.76x, 60.9% lower than NVDA, which is trading at 35.18x. MU’s forward EV/Sales and EV/EBITDA of 5.98x and 16.44x are lower than NVDA’s 26.04x and 40.56x, respectively. Also, MU’s trailing-12-month Price to Book multiple of 3.28 is significantly lower than NVDA’s 64.15.
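These discounts follow directly from the quoted multiples; a short Python check using only the figures above:

```python
# Verify the stated valuation discounts from the multiples quoted in the text.

def discount(mu_multiple, nvda_multiple):
    """MU's percentage discount to NVDA on a given multiple."""
    return (1 - mu_multiple / nvda_multiple) * 100

print(f"Forward P/E (FY2) discount: {discount(13.76, 35.18):.1f}%")  # ~60.9%, as stated
print(f"Forward EV/Sales discount:  {discount(5.98, 26.04):.1f}%")
print(f"Forward EV/EBITDA discount: {discount(16.44, 40.56):.1f}%")
```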

Thus, Micron is a compelling investment opportunity for those seeking exposure to the AI-driven memory chip market at a more reasonable price.

Bottom Line

MU is experiencing significant growth driven by the AI boom, with impressive third-quarter financials and a strong outlook for upcoming quarters. The company’s strategic positioning in the AI-driven memory chip market, especially its HBM3E chips, is vital for high-performance computing and data-intensive applications. It has enabled Micron to capitalize on the surging AI demand across various sectors, including smartphones, PCs, and data centers.

On June 27, Goldman Sachs’ analyst Toshiya Hari maintained a Buy rating on MU shares and raised the price target to $158 from $138. Goldman Sachs’ stance indicates strong confidence in Micron’s long-term prospects, particularly with the expansion of AI computing capabilities and its strategic initiatives in the memory market.

Moreover, Rosenblatt Securities reiterated its Buy rating on Micron Technology shares, with a steady price target of $225. The firm’s optimism is fueled by expectations of solid financial performance surpassing analysts’ estimates, propelled by advancements in AI and HBM developments.

Compared to Nvidia, Micron offers solid growth potential at a more reasonable valuation. Despite Nvidia’s dominant position in the AI and data center segment and exceptional stock performance, Micron’s revenue growth rate is projected to outpace Nvidia’s in the following year, driven by its expanding AI product portfolio and increasing market share in high-margin memory products.

For investors seeking exposure to the AI revolution, Micron presents a compelling opportunity with its solid financial performance, innovative product offerings, and competitive edge in the memory chip market.

How Micron Technology Is Poised to Benefit from AI Investments

Artificial Intelligence (AI) continues revolutionizing industries worldwide, including healthcare, retail, finance, automotive, manufacturing, and logistics, driving demand for advanced technology and infrastructure. Among the companies set to benefit significantly from this AI boom is Micron Technology, Inc. (MU), a prominent manufacturer of memory and storage solutions.

MU’s shares have surged more than 70% over the past six months and nearly 104% over the past year. Moreover, the stock is up approximately 12% over the past month.

This piece delves into the broader market dynamics of AI investments and how MU is strategically positioned to capitalize on these trends, offering insights into how investors might act now.

Broader Market Dynamics of AI Investments

According to Grand View Research, the AI market is expected to exceed $1.81 trillion by 2030, growing at a CAGR of 36.6% from 2024 to 2030. This robust market growth is propelled by the rapid adoption of advanced technologies in numerous industry verticals, increased generation of data, developments in machine learning and deep learning, the introduction of big data, and substantial investments from government and private enterprises.
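To put the projection in perspective, the cited CAGR can be translated into a total growth multiple and an implied starting market size. This is a back-of-the-envelope sketch; the six-year compounding horizon is an assumption drawn from the stated 2024-2030 date range.

```python
# Translate the cited Grand View Research projection (>$1.81T by 2030 at a
# 36.6% CAGR over 2024-2030) into a growth multiple and an implied 2024 base.

growth_multiple = (1 + 0.366) ** 6          # total growth over 2024 -> 2030
implied_2024_base = 1810 / growth_multiple  # $B

print(f"Growth multiple over six years: {growth_multiple:.1f}x")
print(f"Implied 2024 market size: ~${implied_2024_base:.0f}B")
```

In other words, the projection assumes the AI market roughly sextuples in six years from a base near $280 billion.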

AI has emerged as a pivotal force in the modern digital era. Tech giants such as Amazon.com, Inc. (AMZN), Alphabet Inc. (GOOGL), Apple Inc. (AAPL), Meta Platforms, Inc. (META), and Microsoft Corporation (MSFT) are heavily investing in research and development (R&D), thereby making AI more accessible for enterprise use cases.

Moreover, several companies have adopted AI technology to enhance customer experience and strengthen their presence in the Industry 4.0 era.

Big Tech is spending billions of dollars on the AI revolution. So far in 2024, Microsoft and Amazon have collectively allocated over $40 billion for investments in AI-related initiatives and data center projects worldwide.

DA Davidson analyst Gil Luria anticipates these companies will spend over $100 billion this year on AI infrastructure. According to Luria, spending will continue to rise in response to growing demand. Meanwhile, Wedbush analyst Daniel Ives projects continued investment in AI infrastructure by leading tech firms: “This is a $1 trillion spending jump ball over the next decade.”

Micron Technology’s Strategic Position

With a $156.54 billion market cap, MU is a crucial player in the AI ecosystem because it focuses on providing cutting-edge memory and storage products globally. The company operates through four segments: Compute and Networking Business Unit; Mobile Business Unit; Embedded Business Unit; and Storage Business Unit.

Micron’s dynamic random-access memory (DRAM) and NAND flash memory are critical components in AI applications, offering the speed and efficiency required for high-performance computing. The company has consistently introduced innovative products, such as HBM3E, the industry’s fastest, highest-capacity high-bandwidth memory (HBM), designed to advance generative AI innovation.

This month, MU announced it is sampling its next-generation GDDR7 graphics memory with the industry’s highest bit density. With more than 1.5 TB/s of system bandwidth and four independent channels to optimize workloads, Micron GDDR7 memory allows faster response times, smoother gameplay, and reduced processing times. These best-in-class capabilities will optimize AI, gaming, and high-performance computing workloads.

Notably, Micron recently reached an industry milestone as the first to validate and ship 128GB DDR5 32Gb server DRAM, addressing the growing speed and capacity demands of memory-intensive generative AI applications.

Furthermore, MU has forged strategic partnerships with prominent tech companies like NVIDIA Corporation (NVDA) and Intel Corporation (INTC), positioning the company at the forefront of AI technology advancements. In February this year, Micron started mass production of its HBM3E solution for use in Nvidia’s latest AI chip. Micron’s 24GB 8H HBM3E will be part of NVIDIA H200 Tensor Core GPUs, expected to begin shipping in the second quarter.

Also, Micron's 128GB RDIMMs are ready for deployment on the 4th and 5th Gen Intel® Xeon® platforms. In addition to Intel, Micron’s 128GB DDR5 RDIMM memory will be supported by a robust ecosystem, including Advanced Micro Devices, Inc. (AMD), Hewlett Packard Enterprise Company (HPE), and Supermicro, among many others.

Further, in April, MU qualified a full suite of its automotive-grade memory and storage solutions for Qualcomm Technologies Inc.’s Snapdragon Digital Chassis, a comprehensive set of cloud-connected platforms designed to power data-rich, intelligent automotive services. This partnership is aimed at helping the ecosystem build next-generation intelligent vehicles powered by sophisticated AI.

Robust Second-Quarter Financials and Upbeat Outlook

Solid AI demand and constrained supply accelerated Micron’s return to profitability in the second quarter of fiscal 2024, which ended February 29, 2024. MU reported revenue of $5.82 billion, beating analysts’ estimate of $5.35 billion. That compares with $4.74 billion for the previous quarter and $3.69 billion for the same period in 2023.

The company’s non-GAAP gross margin was $1.16 billion, versus $37 million in the prior quarter and negative $1.16 billion for the previous year’s quarter. Micron’s non-GAAP operating income came in at $204 million, compared to an operating loss of $955 million and $2.08 billion for the prior quarter and the same period last year, respectively.

MU posted non-GAAP net income and earnings per share of $476 million and $0.42 for the second quarter, compared to non-GAAP net loss and loss per share of $2.08 billion and $1.91 a year ago, respectively. The company’s EPS also surpassed the consensus loss per share estimate of $0.24. During the quarter, its operating cash flow was $1.22 billion versus $343 million for the same quarter of 2023.

“Micron delivered fiscal Q2 results with revenue, gross margin and EPS well above the high-end of our guidance range — a testament to our team’s excellent execution on pricing, products and operations,” said Sanjay Mehrotra, MU’s President and CEO. “Our preeminent product portfolio positions us well to deliver a strong fiscal second half of 2024. We believe Micron is one of the biggest beneficiaries in the semiconductor industry of the multi-year opportunity enabled by AI.”

For the third quarter of 2024, the company expects revenue of $6.60 billion ± $200 million, and its gross margin is projected to be 26.5% ± 1.5%. Also, Micron expects its non-GAAP earnings per share to be $0.45 ± $0.07.
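Since the first section of this document reports the actual third-quarter result ($6.81 billion in revenue), the guidance above can be cross-checked in a couple of lines. All figures come from this document.

```python
# Cross-check the Q3 FY2024 revenue guidance ($6.60B ± $200M) against the
# actual Q3 revenue of $6.81B reported when results were later released.

guide_mid, guide_pm = 6.60, 0.20
guide_high = guide_mid + guide_pm   # high end of guidance: $6.80B
actual_q3 = 6.81

beat_vs_high_end = actual_q3 - guide_high
print(f"Actual revenue exceeded the high end of guidance by ${beat_vs_high_end:.2f}B")
```

The reported $6.81 billion thus landed just above even the top of this guidance range, consistent with the "exceeding its guidance range" language earlier in the document.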

Bottom Line

MU is strategically positioned to benefit from the burgeoning AI market, driven by its diversified portfolio of advanced memory and storage solutions, strategic partnerships and investments, robust financial health characterized by solid revenue growth and profitability, and expanding market presence.

The company’s recent innovations, including HBM3E and DDR5 RDIMM memory, underscore its commitment to advancing its capabilities across AI and high-performance computing applications.

Moreover, the company’s second-quarter 2024 earnings beat analysts' expectations, supported by the AI boom. Also, Micron offered a rosy guidance for the third quarter of fiscal 2024. Investors eagerly await insights into MU’s financial performance, strategic updates, and outlook during the third-quarter earnings conference call scheduled for June 26, 2024.

Baird Senior Research Analyst Tristan Gerra upgraded MU stock from “Neutral” to “Outperform” and raised the price target from $115 to $150, citing meaningful upside opportunities. Gerra noted that DRAM chip pricing has been rising while supply growth is expected to slow. Also, Morgan Stanley raised its rating on Micron from “Underweight” to “Equal-Weight.”

As AI investments from numerous sectors continue to grow, Micron stands to capture significant market share, making it an attractive option for investors seeking long-term growth in the semiconductor sector.