Broadcom (AVGO) and Micron (MU): Top Picks for Data Center Investment Surge

The expected record spending on infrastructure by cloud computing leaders such as Microsoft Corporation (MSFT) and Amazon.com, Inc. (AMZN) this year highlights the escalating investments in artificial intelligence (AI) data centers, a trend likely to benefit chipmakers significantly.

Bank of America (BofA) analysts forecast that cloud service provider capital expenditures will reach $121 billion in the second half of 2024, bringing the total to a record $227 billion in 2024. This figure marks a 39% increase compared to the previous year.

Amazon, Microsoft, and Meta Platforms, Inc. (META) are predicted to more than double their spending compared to 2020 levels, while Oracle Corporation (ORCL) is expected to increase its capital expenditure nearly sixfold. The proportion of this spending allocated to data centers is already around 55% and is anticipated to rise further, reflecting the critical role of data centers in supporting advanced AI applications.

While NVIDIA Corporation (NVDA) stands out as the dominant player in the AI GPU market, BofA analysts have highlighted Broadcom Inc. (AVGO) and Micron Technology, Inc. (MU) as compelling alternatives for investors seeking to benefit from this trend.

In this article, we will delve into why Broadcom and Micron are well-positioned to capitalize on growing investments by cloud service providers in AI data centers, evaluate their financial health and recent performance, and explore the potential headwinds and tailwinds they may encounter in the near future.

Broadcom Inc. (AVGO)

Valued at a $732.45 billion market cap, Broadcom Inc. (AVGO) is a global tech leader that designs, develops, and supplies semiconductor and infrastructure software solutions. Broadcom’s extensive portfolio of semiconductor solutions, including networking chips, storage adapters, and advanced optical components, makes it a critical supplier for data centers.

Moreover, Broadcom’s leadership in networking solutions, exemplified by its Tomahawk and Trident series of Ethernet switches, positions it as a critical beneficiary of increased AI data center spending.

In May, AVGO revolutionized the data center ecosystem with its latest portfolio of highly scalable, high-performing, low-power 400G PCIe Gen 5.0 Ethernet adapters. The latest products provide an improved, open, standards-based Ethernet NIC and switching solution to address connectivity bottlenecks caused by the rapid growth in XPU bandwidth and cluster sizes in AI data centers.

Further, Broadcom’s strategic acquisitions, such as the recent purchase of VMware, Inc., enhance its data center and cloud computing capabilities. With this acquisition, AVGO will bring together its engineering-first, innovation-centric teams as it takes another significant step forward in building the world’s leading infrastructure technology company. 

Broadcom’s solid second-quarter performance was primarily driven by AI demand and VMware. AVGO’s net revenue increased 43% year-over-year to $12.49 billion in the quarter that ended May 5, 2024. That exceeded the consensus revenue estimate of $12.01 billion. Revenue from its AI products hit a record of $3.10 billion for the quarter.

AVGO reported triple-digit revenue growth in the Infrastructure Software segment to $5.29 billion as enterprises increasingly adopted the VMware software stack to build their private clouds. Its gross margin rose 27.2% year-over-year to $7.78 billion. Its non-GAAP operating income grew 32% from the year-ago value to $7.15 billion. Its adjusted EBITDA was $7.43 billion, up 30.6% year-over-year.

Further, the company’s non-GAAP net income was $5.39 billion or $10.96 per share, up 20.2% and 6.2% from the prior year’s quarter, respectively. Cash from operations of $4.58 billion for the quarter, less capital expenditures of $132 million, resulted in free cash flow of $4.45 billion, or 36% of revenue.

When it posted solid earnings for its second quarter, Broadcom announced a ten-for-one stock split, which took effect on July 12, making stock ownership more affordable and accessible to investors.

Moreover, AVGO raised its fiscal year 2024 guidance. The tech company expects full-year revenue of nearly $51 billion. Broadcom anticipates $10 billion in revenue from chips related to AI this year. Its adjusted EBITDA is expected to be approximately 61% of projected revenue.

Analysts expect AVGO’s revenue for the third quarter (ending July 2024) to grow 45.9% year-over-year to $12.95 billion. The consensus EPS estimate of $1.20 for the ongoing quarter indicates a 14% year-over-year increase. Also, the company has surpassed the consensus revenue and EPS estimates in each of the trailing four quarters.

In addition, the company’s revenue and EPS for the fiscal year ending October 2024 are expected to increase 43.6% and 12.4% from the previous year to $51.44 billion and $4.75, respectively.

AVGO’s shares have gained more than 29% over the past six months and around 74% over the past year. Moreover, the stock is up nearly 40% year-to-date.

Micron Technology, Inc. (MU)

Another chipmaker that is well-poised to benefit from significant data center spending among enterprises is Micron Technology, Inc. (MU). With a $126.70 billion market cap, MU provides cutting-edge memory and storage products globally. The company operates through four segments: Compute and Networking Business Unit; Mobile Business Unit; Embedded Business Unit; and Storage Business Unit.

Micron’s role as a leading provider of DRAM and NAND flash memory positions it to capitalize on the surging demand for high-performance memory solutions. The need for advanced memory products grows as data centers expand to support AI and machine learning workloads. The company’s innovation in memory technologies, such as HBM3E, aligns well with the performance requirements of modern data centers.

Also, MU recently announced it is sampling its next-generation GDDR7 graphics memory with the industry’s highest bit density. The best-in-class capabilities of Micron GDDR7 will optimize AI, gaming, and high-performance computing workloads. Notably, Micron reached an industry milestone as the first to validate and ship 128GB DDR5 32Gb server DRAM to address the increasing speed and capacity demands of memory-intensive Gen AI applications.

Further, MU’s strategic partnerships with leading tech companies like Nvidia and Intel Corporation (INTC) position the chipmaker at the forefront of technology advancements. In February, Micron started mass production of its HBM3E solution for use in Nvidia’s latest AI chip. Micron’s 24GB 8H HBM3E will be part of NVIDIA H200 Tensor Core GPUs, expected to begin shipping in the second quarter.

For the third quarter, which ended May 30, 2024, MU posted revenue of $6.81 billion, surpassing analysts’ expectations of $6.67 billion. That compared to $5.82 billion in the prior quarter and $3.75 billion for the same period last year. Moreover, AI demand drove 50% sequential data center revenue growth and record-high data center revenue mix.

MU’s non-GAAP gross margin was $1.92 billion, versus $1.16 billion in the prior quarter and negative $603 million for the previous year’s quarter. Its non-GAAP operating income came in at $941 million, compared to $204 million in the prior quarter and negative $1.47 billion for the same period in 2023.

Additionally, the chip company reported non-GAAP net income and earnings per share of $702 million and $0.62 for the third quarter, compared to non-GAAP net loss and loss per share of $1.57 billion and $1.43 a year ago, respectively. Its EPS beat the consensus estimate of $0.53. Its adjusted free cash flow was $425 million during the quarter, compared to a negative $1.36 billion in the prior year’s quarter.

For the fourth quarter of fiscal 2024, Micron expects revenue of $7.60 billion ± $200 million, and its non-GAAP gross margin is anticipated to be 34.5% ± 1%. Also, the company expects its non-GAAP earnings per share to be $1.08 ± $0.08.

Analysts expect MU’s revenue for the fourth quarter (ending August 2024) to increase 91.4% year-over-year to $7.68 billion. The company is expected to report an EPS of $1.14 for the current quarter, compared to a loss per share of $1.07 in the prior year’s quarter. Further, the company has surpassed the consensus revenue and EPS estimates in each of the trailing four quarters.

MU’s shares have surged over 30% over the past six months and approximately 75% over the past year.

Bottom Line

The substantial surge in capital expenditures by cloud computing giants like Microsoft, Amazon, and Alphabet highlights the importance of AI and data centers in the tech industry’s landscape. Broadcom and Micron emerge as two of the most promising chip stocks for investors seeking to benefit from this trend. Both companies offer solid financial health, significant market positions, and exposure to the expanding data center and AI markets.

While Broadcom’s diverse semiconductor solutions and Micron’s leadership in memory technology make them attractive investment opportunities, investors must remain mindful of potential headwinds, including market competition and geopolitical risks. By evaluating these factors and understanding the growth potential of these companies, investors can make informed decisions in the rapidly evolving technology sector.

Is Intel a Buy? Deep Dive into Software Expansion and AI Aspirations

Intel Corporation (INTC), a global leader in designing and manufacturing semiconductor products, is making headlines with its ambitious goals for software expansion. Chief Technology Officer (CTO) Greg Lavender told Reuters that Intel’s push into software is progressing well, with the company potentially achieving cumulative software revenue of $1 billion by the end of 2027.

Progress in Building a Software Business

INTC has been steadily growing its software capabilities. The company generated over $100 million in software revenue in 2021, the year Greg Lavender was brought in from cloud computing firm VMware, Inc. (VMW) by CEO Pat Gelsinger to lead Intel’s software strategy. Since then, the chipmaker has acquired three software companies. It highlights Intel’s strategic pivot towards becoming a significant player in the software market, complementing its traditional hardware dominance.

Intel, which reported $54 billion in revenue in 2023, offers a variety of software services and tools, ranging from cloud computing to artificial intelligence (AI). Lavender stated that his strategy is centered on providing services in AI, performance, and security, with the company making significant investments in all three areas.

The chipmaker's investment in AI is particularly noteworthy. INTC’s upcoming Gaudi 3 chip is expected to generate significant demand, potentially positioning the company as a major contender in the AI chip market. Intel said it expected over $500 million in sales from its Gaudi 3 chips in the second half of the year.

Powered by the high-efficiency Intel® Gaudi® platform and boasting proven MLPerf benchmark performance, Intel® Gaudi® 3 AI accelerators are designed to tackle demanding training and inference tasks. Recently, Intel announced pricing for Intel® Gaudi® 2 and Intel® Gaudi® 3 AI accelerator kits, which redefine power, performance, and affordability.

A standard AI kit, including Intel Gaudi 2 accelerators with a universal baseboard (UBB), is offered to system providers at $65,000, estimated to be one-third the cost of comparable competitive platforms. Also, a kit including eight Intel Gaudi 3 accelerators with a UBB will cost $125,000, expected to be two-thirds the cost of comparable competitive platforms.

NVIDIA Corporation (NVDA) currently dominates this space, controlling about 83% of the data center chip market in 2023. However, INTC’s focus on developing versatile and efficient AI processors could challenge NVDA’s dominance.

Positioning as a Leader in the Tech Industry

Intel’s comprehensive approach to AI software development could significantly enhance its position in the technology industry. CTO Greg Lavender mentioned that Intel is backing open-source initiatives to create software and tools capable of powering a diverse array of AI chips, with further breakthroughs anticipated in the upcoming months.

A crucial part of NVDA’s success is attributed to its proprietary software, CUDA, which binds developers to Nvidia chips. However, France’s antitrust regulator is preparing to charge Nvidia with suspected anti-competitive practices. The regulatory body voiced concerns about the generative AI sector’s reliance on CUDA.

Intel is a part of the UXL Foundation, a consortium of technology companies working on an open-source project that aims to make computer code run on any machine, regardless of the underlying chip and hardware. Other notable members of this consortium include Qualcomm Inc. (QCOM), Samsung Electronics, and Arm Holdings plc (ARM).

Furthermore, INTC is actively contributing to Triton, an initiative led by OpenAI to develop an open-source programming language designed to improve code efficiency across AI chips. This project is also supported by Advanced Micro Devices, Inc. (AMD) and Meta Platforms, Inc. (META). Triton is already operational on Intel’s existing graphics processing units and will be compatible with the company's next generation of AI chips.

“Triton is going to level the playing field,” Lavender said, emphasizing the potential impact of this initiative.

By contributing to open-source projects like Triton and the UXL Foundation, Intel aims to create a more inclusive and competitive AI ecosystem. This strategy boosts INTC’s technological capabilities and strengthens its reputation as a forward-thinking company willing to invest in the broader tech community.

Robust First-Quarter Performance but Weak Second-Quarter Forecast

For the first quarter that ended March 30, 2024, INTC’s net revenue increased 8.6% year-over-year to $12.72 billion, primarily driven by growth in its personal computing, data center, and AI business. Revenue from the company’s biggest business, Client Computing Group (CCG), which is responsible for chips for PCs and laptops, grew 31% year-over-year to $7.50 billion.

Intel’s Data Center and AI business, which makes central processors and other parts and software for servers, reported sales of $3 billion, up 5% year-over-year. The company continues to compete for server market share against well-established chipmakers like Nvidia.

Further, the company’s gross margin rose 30.2% from the prior year’s quarter to $5.22 billion. INTC’s non-GAAP operating income came in at $723 million, compared to an operating loss of $294 million in the previous year’s quarter. Its non-GAAP net income and earnings per share were $759 million and $0.18, compared to a net loss and loss per share of $169 million and $0.04, respectively, in the same period of 2023.

The chipmaker gave weak guidance for the second quarter. For the quarter that ended June 2024, Intel expects revenue of between $12.50 billion and $13.50 billion, and its non-GAAP earnings per share are anticipated to be $0.10.

Meanwhile, analysts expect INTC’s revenue for the second quarter to increase marginally year-over-year to $12.99 billion. The company’s EPS is expected to decline 21.6% year-over-year to $0.10 for the same period.

Bottom Line

Intel’s strategic shift towards expanding its software capabilities, primarily focusing on AI and cybersecurity, is setting the stage for substantial future revenue growth. The company’s progress in building a robust software business, evidenced by the significant revenue surge and strategic acquisitions over the years, highlights a promising growth trajectory.

By focusing on AI, performance, and security and making significant investments in all three areas, Intel is diversifying its revenue streams and positioning itself as a formidable player in the tech industry. The company’s executives hinted at robust demand for its upcoming Gaudi 3 chip, which could help Intel secure second place in the AI chip market.

While INTC’s involvement in open-source initiatives like Triton and the UXL Foundation, collaboration with industry leaders, and continuous innovation underscores its commitment to fostering a competitive and inclusive AI ecosystem, Nvidia’s dominance in the data center chip market is pronounced and presents a significant challenge.

Intel’s solid first-quarter performance reflects the effectiveness of its strategic initiatives, but its dim second-quarter guidance indicates some short-term challenges. Analysts predict a slight year-over-year revenue increase but a notable EPS decline for the second quarter. While it may face hurdles in the immediate future, INTC’s long-term prospects appear promising, driven by its software expansion and strategic investments in AI.

Cantor Fitzgerald reiterated a Neutral rating on INTC stock while maintaining a price target of $40. Also, TD Cowen reiterated a Neutral rating on Intel and lowered its price target to $40 from $45. Given this backdrop, it seems wise to wait for a better entry point in INTC.

Are Nvidia’s GPUs a Game-Changer for Investors?

NVIDIA Corporation (NVDA), a tech giant advancing AI through its cutting-edge graphics processing units (GPUs), became the third U.S. company to exceed a staggering market capitalization of $3 trillion in June, after Microsoft Corporation (MSFT) and Apple Inc. (AAPL). This significant milestone marks nearly a doubling of its value since the start of the year. Nvidia’s stock has surged more than 159% year-to-date and around 176% over the past year.

What drives the company’s exceptional growth, and how do Nvidia GPUs translate into significant financial benefits for cloud providers and investors? This piece will explore the financial implications of investing in NVIDIA GPUs, the impressive ROI metrics for cloud providers, and the company’s growth prospects in the AI GPU market.

Financial Benefits of NVDA’s GPUs for Cloud Providers

During the Bank of America Securities 2024 Global Technology Conference, Ian Buck, Vice President and General Manager of NVDA’s hyperscale and HPC business, highlighted the substantial financial benefits for cloud providers by investing in NVIDIA GPUs.

Buck illustrated that for every dollar spent on NVIDIA GPUs, cloud providers can generate five dollars over four years. This return on investment (ROI) becomes even more impressive for inferencing tasks, where the profitability rises to seven dollars per dollar invested over the same period, with this figure continuing to increase.
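To put Buck’s figures in perspective, a quick back-of-envelope sketch converts the cumulative multiples he cited ($5 and $7 of revenue per $1 of GPU spend over four years) into implied annualized growth rates. This is purely illustrative arithmetic, not a figure NVIDIA itself has published:

```python
def annualized_rate(cumulative_multiple: float, years: int) -> float:
    """Annualized growth rate implied by a cumulative return multiple."""
    return cumulative_multiple ** (1 / years) - 1

# Buck's figures: $5 (general) and $7 (inference) of revenue
# generated per $1 of GPU spend over four years.
general = annualized_rate(5, 4)
inference = annualized_rate(7, 4)
print(f"General workloads: {general:.1%} annualized")    # ~49.5%
print(f"Inference:         {inference:.1%} annualized")  # ~62.7%
```

In other words, a 5x cumulative return over four years corresponds to roughly 50% compounded annual growth on the dollars invested, and the 7x inference figure to over 60%.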

This compelling ROI is driven by the superior performance and efficiency of Nvidia’s GPUs, which enable cloud providers to offer enhanced services and handle more complex workloads, particularly in the realm of AI. As AI applications expand across various industries, the demand for high-performance inference solutions escalates, further boosting the financial benefits for cloud providers that utilize NVIDIA’s technology.

NVDA’s Progress in AI and GPU Innovations

NVIDIA’s commitment to addressing the surging demand for AI inference is evident in its continuous innovation and product development. The company introduced cutting-edge products like NVIDIA Inference Microservices (NIMs), designed to support popular AI models such as Llama, Mistral, and Gemma.

These optimized inference microservices for deploying AI models at scale facilitate seamless integration of AI capabilities into cloud infrastructures, enhancing efficiency and scalability for cloud providers.

In addition to NIMs, NVDA is also focusing on its new Blackwell GPU, engineered particularly for inference tasks and energy efficiency. The upcoming Blackwell model is expected to ship to customers later this year. While there may be initial shortages, Nvidia remains optimistic. Buck noted that each new technology phase brings supply and demand challenges, as they experienced with the Hopper GPU.

Furthermore, the early collaboration with cloud providers on the forthcoming Rubin GPU, slated for a 2026 release, underscores the company’s strategic foresight in aligning its innovations with industry requirements.

Nvidia’s GPUs Boost its Stock Value and Earnings

The financial returns of investing in Nvidia GPUs benefit cloud providers considerably and have significant implications for NVDA’s stock value and earnings. With a $4 trillion market cap within sight, the chip giant’s trajectory suggests continued growth and potential for substantial returns for investors.

NVDA’s first-quarter fiscal 2025 earnings topped analysts’ expectations and exceeded the high bar set by investors, as Data Center sales rose to a record high amid booming AI demand. For the quarter that ended April 28, 2024, the company posted a record revenue of $26 billion, up 262% year-over-year. That compared to the consensus revenue estimate of $24.56 billion.

The chip giant’s quarterly Data Center revenue was $22.60 billion, an increase of 427% from the prior year’s quarter. Its non-GAAP operating income rose 492% year-over-year to $18.06 billion. NVIDIA’s non-GAAP net income grew 462% from the prior year’s quarter to $15.24 billion. In addition, its non-GAAP EPS came in at $6.12, up 461% year-over-year.

“Our data center growth was fueled by strong and accelerating demand for generative AI training and inference on the Hopper platform. Beyond cloud service providers, generative AI has expanded to consumer internet companies, and enterprise, sovereign AI, automotive and healthcare customers, creating multiple multibillion-dollar vertical markets,” said Jensen Huang, CEO of NVDA.

“We are poised for our next wave of growth. The Blackwell platform is in full production and forms the foundation for trillion-parameter-scale generative AI. Spectrum-X opens a brand-new market for us to bring large-scale AI to Ethernet-only data centers. And NVIDIA NIM is our new software offering that delivers enterprise-grade, optimized generative AI to run on CUDA everywhere — from the cloud to on-prem data centers and RTX AI PCs — through our expansive network of ecosystem partners,” Huang added.

According to its outlook for the second quarter of fiscal 2025, Nvidia’s revenue is anticipated to be $28 billion, plus or minus 2%. The company expects its non-GAAP gross margins to be 75.5%. For the full year, gross margins are projected to be in the mid-70% range.

Analysts also appear highly bullish about the company’s upcoming earnings. NVDA’s revenue and EPS for the second quarter (ending July 2024) are expected to grow 110.5% and 135.5% year-over-year to $28.43 billion and $0.64, respectively. For the fiscal year ending January 2025, Street expects the chip company’s revenue and EPS to increase 97.3% and 111.1% year-over-year to $120.18 billion and $2.74, respectively.

Robust Future Growth in the AI Data Center Market

The exponential growth of AI use cases and applications across various sectors—ranging from healthcare and automotive to retail and manufacturing—highlights the critical role of GPUs in enabling these advancements. NVIDIA’s strategic investments in AI and GPU technology and its emphasis on collaboration with cloud providers position the company at the forefront of this burgeoning AI market.

As Nvidia’s high-end server GPUs are essential for training and deploying large AI models, tech giants like Microsoft and Meta Platforms, Inc. (META) have spent billions of dollars buying these chips. Meta CEO Mark Zuckerberg stated his company is “building an absolutely massive amount of infrastructure” that will include 350,000 H100 GPU graphics cards to be delivered by NVDA by the end of 2024.

NVIDIA’s GPUs are sought after by several other tech companies for superior performance, including Amazon, Microsoft Corporation (MSFT), Alphabet Inc. (GOOGL), and Tesla, Inc. (TSLA).

Notably, NVDA owns a 92% market share in data center GPUs. Led by Nvidia, U.S. tech companies dominate the burgeoning market for generative AI, with market shares of 70% to over 90% in chips and cloud services.

According to the Markets and Markets report, the data center GPU market is projected to exceed $63 billion by 2028, growing at an impressive CAGR of 34.6% during the forecast period (2024-2028). The rapidly rising adoption of data center GPUs among cloud providers should bode well for Nvidia.

Bottom Line

NVDA’s GPUs represent a game-changer for both cloud providers and investors, driven by superior performance and a compelling return on investment (ROI). The attractive financial benefits of investing in NVIDIA GPUs underscore their value, with cloud providers generating substantial profits from enhanced AI capabilities. This high ROI, particularly in AI inferencing tasks, positions Nvidia as a pivotal player in the burgeoning AI data center market, reinforcing its dominant market share and driving continued growth.

Moreover, Wall Street analysts remain bullish about this AI chipmaker’s prospects. TD Cowen analyst Matthew Ramsay increased his price target on NVDA stock from $140 to $165, while maintaining the Buy rating. “One thing remains the same: fundamental strength at Nvidia,” Ramsay said in a client note. “In fact, our checks continue to point to upside in data center (sales) as demand for Hopper/Blackwell-based AI systems continues to exceed supply.”

“Overall we see a product roadmap indicating a relentless pace of innovation across all aspects of the AI compute stack,” Ramsay added.

Meanwhile, KeyBanc Capital Markets analyst John Vinh reiterated his Overweight rating on NVIDIA stock with a price target of $180. “We expect Nvidia to deliver higher results and higher guidance” with its second-quarter 2025 report, Vinh said in a client note. He added that solid demand for generative AI will drive the upside.

As AI applications expand across various key industries, NVIDIA’s continuous strategic innovations and product developments, such as the Blackwell GPU and NVIDIA Inference Microservices, ensure the company remains at the forefront of technological advancement. With a market cap nearing $4 trillion and a solid financial outlook, NVIDIA is well-poised to deliver substantial returns for investors, solidifying its standing as a leader in the AI and GPU technology sectors.

Micron's AI Momentum: Outpacing Nvidia in the Memory Chip Market?

Artificial intelligence (AI) has transformed major industries, including healthcare, finance, retail, automotive, and manufacturing. Nvidia Corporation (NVDA) has been at the forefront of advancing AI through its graphics processing units (GPUs). These GPUs are crucial for training large language models (LLMs) such as OpenAI’s ChatGPT, leading to outstanding growth in the company’s revenue and earnings.

As a result, NVDA’s stock has surged nearly 148% over the past six months and is up more than 205% over the past year. Nvidia stock’s exceptional performance lifted its market capitalization above $3 trillion, making it the second-most valuable company in America.

However, another leading semiconductor company, Micron Technology, Inc. (MU), known for its innovative memory and storage solutions, is also experiencing remarkable growth due to rapid AI adoption.

Let’s explore how the ongoing AI boom powers Micron’s impressive growth and assess if it could outpace Nvidia in the memory chip market.

Micron’s Solid Third-Quarter Financials and Optimistic Outlook

MU posted revenue of $6.81 billion for the third quarter that ended May 30, 2024, surpassing analysts’ expectations of $6.67 billion. That compared to $5.82 billion for the previous quarter and $3.75 billion for the same period last year. Robust AI demand and strong execution enabled Micron to drive exceptional revenue growth, exceeding its guidance range for the third quarter.

Micron’s non-GAAP gross margin was $1.92 billion, compared to $1.16 billion in the prior quarter and negative $603 million in the third quarter of 2023. Its non-GAAP operating income came in at $941 million, versus $204 million in the previous quarter and negative $1.47 billion for the same period of 2023.

Furthermore, the company posted non-GAAP net income and earnings per share of $702 million and $0.62, compared to net loss and loss per share of $1.57 billion and $1.43 in the same quarter last year, respectively. Its EPS surpassed the consensus estimate of $0.53.

MU’s adjusted free cash flow was $425 million, compared to negative $29 million in the previous quarter and negative $1.36 billion for the same quarter of 2023. The company ended the quarter with cash, marketable investments, and restricted cash of $9.22 billion. 

“We are gaining share in high-margin products like High Bandwidth Memory (HBM), and our data center SSD revenue hit a record high, demonstrating the strength of our AI product portfolio across DRAM and NAND. We are excited about the expanding AI-driven opportunities ahead, and are well positioned to deliver a substantial revenue record in fiscal 2025,” said Sanjay Mehrotra, Micron Technology’s President and CEO.

For the fourth quarter of 2024, Micron expects revenue of $7.60 billion ± $200 million. The midpoint ($7.60 billion) of its revenue guidance range represents an approximately 90% rise from the same period last year. Its non-GAAP gross margin is anticipated to be 34.5% ± 1%. In addition, the company projects its non-GAAP earnings per share to be $1.08 ± 0.08, a turnaround from a loss of $1.07 per share in the previous year’s quarter.
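Micron states its guidance as midpoint-plus-or-minus bands; unpacking them into explicit low/high ranges (using only the figures quoted above) makes the outlook easier to compare against analyst estimates:

```python
# Micron's Q4 FY2024 guidance, expressed as (midpoint, plus/minus) bands,
# as quoted in the article.
guidance = {
    "revenue ($B)":     (7.60, 0.20),
    "gross margin (%)": (34.5, 1.0),
    "EPS ($)":          (1.08, 0.08),
}

for metric, (mid, tol) in guidance.items():
    print(f"{metric:18s} {mid - tol:g} to {mid + tol:g}")
# revenue ($B)       7.4 to 7.8
# gross margin (%)   33.5 to 35.5
# EPS ($)            1 to 1.16
```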

Vital Role in the AI Ecosystem

MU’s success in the AI ecosystem is primarily driven by its high-bandwidth memory (HBM) chips, integral to high-performance computing (HPC), GPUs, AI, and other data-intensive applications. The chips provide fast and efficient memory access for processing large volumes of data quickly.

Micron sold $100 million of its HBM3E chips in the third quarter alone. Further, the company anticipates its HBM3E revenue to escalate from “several hundred million dollars” in fiscal 2024 to “multiple billions” for fiscal 2025.

Earlier this year, the company started mass production of its HBM3E solution for use in Nvidia’s latest AI chip. Micron’s 24GB 8H HBM3E will be part of NVIDIA H200 Tensor Core GPUs.

Moreover, Micron’s dynamic random-access memory (DRAM) and NAND flash memory are critical components in AI applications. In June, MU sampled its next-gen GDDR7 graphics memory for AI, gaming, and HPC workloads. Leveraging Micron’s 1β (1-beta) DRAM technology and advanced architecture, the GDDR7 delivers 32 Gb/s high-performance memory in a power-optimized design.

On May 1, the company reached an industry milestone as the first to validate and ship 128GB DDR5 32Gb server DRAM to address the growing speed and capacity demands of memory-intensive Gen AI applications. Powered by Micron’s 1β technology, the 128GB DDR5 RDIMM memory offers over 45% greater bit density, up to 22% improved energy efficiency, and up to 16% reduced latency over competitive 3DS through-silicon via (TSV) products.

AI-Driven Demand in Smartphones, PCs, and Data Centers

AI drives strong demand for memory chips across various sectors, including smartphones, personal computers (PCs), and data centers. In its latest earnings conference call, Micron’s management pointed out that AI-enabled PCs are expected to feature 40% to 80% more DRAM content than current PCs and larger storage capacities. Similarly, AI-enabled smartphones this year carry 50% to 100% more DRAM than last year’s flagship models.

These trends suggest a bright future for the global memory chips market. According to the Business Research Company report, the market is expected to reach $130.42 billion by 2028, growing at a CAGR of 6.9%.

Micron’s Competitive Edge Over Nvidia and Attractive Valuation

Despite NVDA’s expected revenue jump from $60.90 billion in fiscal 2023 to around $120 billion this year, MU is projected to outpace Nvidia’s growth next year: Micron’s revenue could rise roughly 50% year-over-year in its next fiscal year, ahead of Nvidia’s forecasted growth of 33.7%.

In terms of non-GAAP P/E (FY2), MU is currently trading at 13.76x, 60.9% lower than NVDA, which is trading at 35.18x. MU’s forward EV/Sales and EV/EBITDA of 5.98x and 16.44x are lower than NVDA’s 26.04x and 40.56x, respectively. Also, MU’s trailing-12-month Price to Book multiple of 3.28 is significantly lower than NVDA’s 64.15.
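The quoted 60.9% figure follows directly from the two P/E multiples. As a quick illustrative check (using the article’s snapshot figures, which will drift as share prices move):

```python
# Relative valuation discount implied by the multiples cited above.
# These are the article's snapshot figures and change with market prices.
mu_pe, nvda_pe = 13.76, 35.18  # non-GAAP P/E (FY2) for MU and NVDA

# Discount = how far below NVDA's multiple MU trades, in percent.
discount_pct = (1 - mu_pe / nvda_pe) * 100
print(f"MU trades at a {discount_pct:.1f}% discount to NVDA on forward P/E")
# -> MU trades at a 60.9% discount to NVDA on forward P/E
```

The same arithmetic applies to the EV/Sales, EV/EBITDA, and Price to Book pairs quoted above.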

Thus, Micron is a compelling investment opportunity for those seeking exposure to the AI-driven memory chip market at a more reasonable price.

Bottom Line

MU is experiencing significant growth driven by the AI boom, with impressive third-quarter financials and a strong outlook for upcoming quarters. The company’s strategic positioning in the AI-driven memory chip market, especially its HBM3E chips, is vital for high-performance computing and data-intensive applications. It has enabled Micron to capitalize on the surging AI demand across various sectors, including smartphones, PCs, and data centers.

On June 27, Goldman Sachs analyst Toshiya Hari maintained a Buy rating on MU shares and raised his price target to $158 from $138. Goldman Sachs’ stance indicates strong confidence in Micron’s long-term prospects, particularly with the expansion of AI computing capabilities and its strategic initiatives in the memory market.

Moreover, Rosenblatt Securities reiterated its Buy rating on Micron Technology shares, with a steady price target of $225. The firm’s optimism is fueled by expectations of solid financial performance surpassing analysts’ estimates, propelled by advancements in AI and HBM developments.

Compared to Nvidia, Micron offers solid growth potential at a more reasonable valuation. Despite Nvidia’s dominant position in the AI and data center segment and exceptional stock performance, Micron’s revenue growth rate is projected to outpace Nvidia’s in the following year, driven by its expanding AI product portfolio and increasing market share in high-margin memory products.

For investors seeking exposure to the AI revolution, Micron presents a compelling opportunity with its solid financial performance, innovative product offerings, and competitive edge in the memory chip market.

Intel's $8.5 Billion Gamble: Can It Rival Nvidia?

Intel Corporation (INTC), a leading player in the semiconductor industry, is making headlines with its ambitious plans to transform its operations, spurred by a substantial $8.5 billion boost from the CHIPS and Science Act. The roughly $280 billion legislative package, signed into law by President Joe Biden in 2022, aims to bolster U.S. semiconductor manufacturing and research and development (R&D) capabilities.

CHIPS Act funding will help advance Intel’s commercial semiconductor projects at key sites in Arizona, New Mexico, Ohio, and Oregon. The company also expects to benefit from a U.S. Treasury Department Investment Tax Credit (ITC) of up to 25% on over $100 billion in qualified investments, as well as eligibility for federal loans of up to $11 billion.

Alongside the CHIPS Act funding, INTC previously announced plans to invest more than $100 billion in the U.S. over five years to expand chipmaking capacity critical to national security and the advancement of cutting-edge technologies, including artificial intelligence (AI).

Notably, Intel is the sole American company that both designs and manufactures leading-edge logic chips. Its strategy focuses on three pillars: achieving process technology leadership, constructing a more resilient and sustainable global semiconductor supply chain, and developing a world-class foundry business. These goals align with the CHIPS Act’s objectives to restore manufacturing and technological leadership to the U.S.

The federal funding represents a pivotal opportunity for INTC to reclaim its position as a chip manufacturing powerhouse, potentially rivaling giants like NVIDIA Corporation (NVDA) and Advanced Micro Devices, Inc. (AMD).

Intel’s Strategic Initiatives to Capitalize on AI Boom

At Computex 2024, INTC introduced cutting-edge technologies and architectures that are well-poised to significantly accelerate the AI ecosystem, from the data center, cloud, and network to the edge and PC.

The company launched Intel® Xeon® 6 processors with E-core (Efficient-core) and P-core (Performance-core) SKUs, delivering enhanced performance and power efficiency for high-density, scale-out workloads in the data center. The first of the Xeon 6 processors to debut is the Intel Xeon 6 E-core (code-named Sierra Forest), available as of June 4. The Xeon 6 P-core processors (code-named Granite Rapids) are expected to launch next quarter.

Beyond the data center, Intel is expanding its AI footprint in edge computing and PCs. With over 90,000 edge deployments and 200 million CPUs distributed across the ecosystem, the company has consistently enabled enterprise choice for many years. INTC revealed the architectural details of Lunar Lake, the flagship processor for the next generation of AI PCs.

Lunar Lake is set to make a significant leap in graphics and AI processing capabilities, emphasizing power-efficient compute performance tailored for the thin-and-light segment. It promises up to a 40% reduction in System-on-Chip (SoC) power and over three times the AI compute. It is scheduled for release in the third quarter of 2024, in time for the holiday shopping season.

Also, Intel unveiled pricing for Intel® Gaudi® 2 and Intel® Gaudi® 3 AI accelerator kits, providing high performance at up to one-third lower cost compared to competitive platforms. A standard AI kit, including Intel Gaudi 2 accelerators with a universal baseboard (UBB), is offered to system providers at $65,000. Integrating Xeon processors with Gaudi AI accelerators in a system presents a robust solution to make AI faster, cheaper, and more accessible.

Intel CEO Pat Gelsinger said, “Intel is one of the only companies in the world innovating across the full spectrum of the AI market opportunity – from semiconductor manufacturing to PC, network, edge and data center systems. Our latest Xeon, Gaudi and Core Ultra platforms, combined with the power of our hardware and software ecosystem, are delivering the flexible, secure, sustainable and cost-effective solutions our customers need to maximize the immense opportunities ahead.”

On May 1, INTC achieved a significant milestone of surpassing 500 AI models running optimized on new Intel® Core™ Ultra processors due to the company’s investment in client AI, the AI PC transformation, framework optimizations, and AI tools like OpenVINO™ toolkit. These processors are the industry’s leading AI PC processors, offering enhanced AI experiences, immersive graphics, and optimized battery life.

Solid First-Quarter Performance and Second-Quarter Guidance

During the first quarter that ended March 30, 2024, INTC’s net revenue increased 8.6% year-over-year to $12.72 billion, primarily driven by growth in its personal computing, data center, and AI business. Revenue from the Client Computing Group (CCG), through which Intel continues to advance its mission to bring AI everywhere, rose 31% year-over-year to $7.50 billion.

Furthermore, the company’s non-GAAP operating income was $723 million, compared to an operating loss of $294 million in the previous year’s quarter. Its non-GAAP net income and non-GAAP earnings per share came in at $759 million and $0.18, compared to a net loss and loss per share of $169 million and $0.04, respectively, in the same quarter of 2023.

For the second quarter of fiscal 2024, Intel expects its revenue to come between $12.5 billion and $13.5 billion, and its non-GAAP earnings per share is expected to be $0.10.

Despite its solid financial performance and ambitious plans, INTC’s stock has plunged more than 38% over the past six months and nearly 40% year-to-date.

Competing with Nvidia: A Daunting Task

Despite INTC’s solid financial health and strategic moves, the competition with NVDA is fierce. Nvidia’s market performance has been stellar lately, driven by its global leadership in graphics processing units (GPUs) and its foray into AI and machine learning markets. The chip giant has built strong brand loyalty among developers and enterprise customers, which could be challenging for Intel to overcome.

Over the past year, NVIDIA has experienced a significant surge in sales due to high demand from tech giants such as Amazon.com, Inc. (AMZN), Alphabet Inc. (GOOGL), Microsoft Corporation (MSFT), Meta Platforms, Inc. (META), and OpenAI, which have invested billions of dollars in its advanced GPUs essential for developing and deploying AI applications.

Shares of the prominent chipmaker surged approximately 150% over the past six months and more than 196% over the past year. Moreover, NVDA’s stock is up around 2,938% over the past five years. Notably, after Apple and Microsoft, Nvidia recently became the third U.S. company with a market value surpassing $3 trillion.

As a result, NVDA commands a dominant market share of about 92% in the data center GPU market. Nvidia’s success stems from its cutting-edge semiconductor performance and software prowess. The CUDA development platform, launched in 2006, has emerged as a pivotal tool for AI development, with a user base exceeding 4 million developers.

Bottom Line

The proposed $8.5 billion in funding, along with the investment tax credit and eligibility for CHIPS Act loans, is pivotal in Intel’s bid to regain semiconductor leadership in the face of intense competition, particularly from Nvidia. This substantial federal funding will enhance Intel’s manufacturing and R&D capabilities across its key sites in Arizona, New Mexico, Ohio, and Oregon.

While INTC possesses the resources, technological expertise, and strategic vision to challenge NVDA, the path forward is fraught with challenges. Despite Intel’s recent strides in the AI ecosystem, from the data center to edge and PC with products like Xeon 6 processors and Gaudi AI accelerators, Nvidia’s dominance in data center GPUs remains pronounced, commanding a significant market share.

Future success will depend on Intel’s ability to leverage its strengths in manufacturing, introducing innovative product lines, and cultivating a compelling ecosystem of software and developer support. As Intel advances its ambitious plans, industry experts and stakeholders will keenly watch how these developments unfold, redefining the competitive landscape in the AI and data center markets.