How Micron Technology Is Poised to Benefit from AI Investments

Artificial Intelligence (AI) continues revolutionizing industries worldwide, including healthcare, retail, finance, automotive, manufacturing, and logistics, driving demand for advanced technology and infrastructure. Among the companies set to benefit significantly from this AI boom is Micron Technology, Inc. (MU), a prominent manufacturer of memory and storage solutions.

MU’s shares have surged more than 70% over the past six months and nearly 104% over the past year. Moreover, the stock is up approximately 12% over the past month.

This piece delves into the broader market dynamics of AI investments and how MU is strategically positioned to capitalize on these trends, offering insights into how investors might act now.

Broader Market Dynamics of AI Investments

According to Grand View Research, the AI market is expected to reach $1.81 trillion by 2030, growing at a CAGR of 36.6% from 2024 to 2030. This robust growth is propelled by the rapid adoption of advanced technologies across industry verticals, surging data generation, advances in machine learning and deep learning, the rise of big data, and substantial investments from governments and private enterprises.
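To make the growth math concrete, the short Python sketch below compounds a market size at a 36.6% CAGR over the six years from 2024 to 2030. The 2024 base value is an assumption backed out of the $1.81 trillion 2030 projection, not a figure quoted from the report:

```python
def project(value: float, cagr: float, years: int) -> float:
    """Compound an initial value at a constant annual growth rate (CAGR)."""
    return value * (1 + cagr) ** years

# Assumed 2024 base, backed out of the $1.81T 2030 projection (in $ trillions).
base_2024 = 1.81 / (1 + 0.366) ** 6
print(round(base_2024, 3))                     # ~0.279, i.e., ~$279 billion
print(round(project(base_2024, 0.366, 6), 2))  # 1.81, i.e., $1.81 trillion
```

Put differently, a 36.6% CAGR means the market multiplies roughly 6.5 times over those six years.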

AI has emerged as a pivotal force in the modern digital era. Tech giants such as Amazon.com, Inc. (AMZN), Alphabet Inc. (GOOGL), Apple Inc. (AAPL), Meta Platforms, Inc. (META), and Microsoft Corporation (MSFT) are heavily investing in research and development (R&D), thereby making AI more accessible for enterprise use cases.

Moreover, several companies have adopted AI technology to enhance customer experience and strengthen their presence in Industry 4.0.

Big Tech has poured billions of dollars into the AI revolution. So far in 2024, Microsoft and Amazon have collectively allocated over $40 billion for AI-related initiatives and data center projects worldwide.

DA Davidson analyst Gil Luria anticipates these companies will spend over $100 billion this year on AI infrastructure. According to Luria, spending will continue to rise in response to growing demand. Meanwhile, Wedbush analyst Daniel Ives projects continued investment in AI infrastructure by leading tech firms: “This is a $1 trillion spending jump ball over the next decade.”

Micron Technology’s Strategic Position

With a $156.54 billion market cap, MU is a crucial player in the AI ecosystem because it focuses on providing cutting-edge memory and storage products globally. The company operates through four segments: Compute and Networking Business Unit; Mobile Business Unit; Embedded Business Unit; and Storage Business Unit.

Micron’s dynamic random-access memory (DRAM) and NAND flash memory are critical components in AI applications, offering the speed and efficiency required for high-performance computing. The company has consistently introduced innovative products, such as its HBM3E, the industry’s fastest, highest-capacity high-bandwidth memory (HBM), designed to advance generative AI innovation.

This month, MU announced it is sampling its next-generation GDDR7 graphics memory with the industry’s highest bit density. With more than 1.5 TB/s of system bandwidth and four independent channels to optimize workloads, Micron GDDR7 memory enables faster response times, smoother gameplay, and reduced processing times. The best-in-class capabilities of Micron GDDR7 will optimize AI, gaming, and high-performance computing workloads.

Notably, Micron recently reached an industry milestone as the first to validate and ship 128GB DDR5 32Gb server DRAM to address the demanding speed and capacity requirements of memory-intensive generative AI applications.

Furthermore, MU has forged strategic partnerships with prominent tech companies like NVIDIA Corporation (NVDA) and Intel Corporation (INTC), positioning the company at the forefront of AI technology advancements. In February this year, Micron started mass production of its HBM3E solution for use in Nvidia’s latest AI chip. Micron’s 24GB 8H HBM3E will be part of NVIDIA H200 Tensor Core GPUs, expected to begin shipping in the second quarter.

Also, Micron's 128GB RDIMMs are ready for deployment on the 4th and 5th Gen Intel® Xeon® platforms. In addition to Intel, Micron’s 128GB DDR5 RDIMM memory will be supported by a robust ecosystem, including Advanced Micro Devices, Inc. (AMD), Hewlett Packard Enterprise Company (HPE), and Supermicro, among many others.

Further, in April, MU qualified a full suite of its automotive-grade memory and storage solutions for Qualcomm Technologies Inc.’s Snapdragon Digital Chassis, a comprehensive set of cloud-connected platforms designed to power data-rich, intelligent automotive services. This partnership is aimed at helping the ecosystem build next-generation intelligent vehicles powered by sophisticated AI.

Robust Second-Quarter Financials and Upbeat Outlook

Solid AI demand and constrained supply accelerated Micron’s return to profitability in the second quarter of fiscal 2024, which ended February 29, 2024. MU reported revenue of $5.82 billion, beating analysts’ estimate of $5.35 billion. That compares with $4.74 billion in the previous quarter and $3.69 billion in the same period of 2023.

The company’s non-GAAP gross margin was $1.16 billion, versus $37 million in the prior quarter and negative $1.16 billion in the previous year’s quarter. Micron’s non-GAAP operating income came in at $204 million, compared to operating losses of $955 million in the prior quarter and $2.08 billion in the same period last year.

MU posted non-GAAP net income and earnings per share of $476 million and $0.42 for the second quarter, compared to non-GAAP net loss and loss per share of $2.08 billion and $1.91 a year ago, respectively. The company’s EPS also surpassed the consensus loss per share estimate of $0.24. During the quarter, its operating cash flow was $1.22 billion versus $343 million for the same quarter of 2023.

“Micron delivered fiscal Q2 results with revenue, gross margin and EPS well above the high-end of our guidance range — a testament to our team’s excellent execution on pricing, products and operations,” said Sanjay Mehrotra, MU’s President and CEO. “Our preeminent product portfolio positions us well to deliver a strong fiscal second half of 2024. We believe Micron is one of the biggest beneficiaries in the semiconductor industry of the multi-year opportunity enabled by AI.”

For the third quarter of fiscal 2024, the company expects revenue of $6.6 billion ± $200 million, with a gross margin projected at 26.5% ± 1.5%. Also, Micron expects its non-GAAP earnings per share to be $0.45 ± $0.07.
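The “±” notation translates directly into guidance bands. A minimal sketch of that arithmetic (figures in billions of dollars and dollars per share, taken from the guidance above):

```python
def guidance_range(midpoint: float, tolerance: float) -> tuple[float, float]:
    """Return (low, high) bounds for guidance stated as midpoint ± tolerance."""
    return midpoint - tolerance, midpoint + tolerance

rev_low, rev_high = guidance_range(6.6, 0.2)     # revenue, $B
eps_low, eps_high = guidance_range(0.45, 0.07)   # non-GAAP EPS, $
print(round(rev_low, 2), round(rev_high, 2))     # 6.4 6.8
print(round(eps_low, 2), round(eps_high, 2))     # 0.38 0.52
```

So the guidance implies revenue of $6.4–$6.8 billion and non-GAAP EPS of $0.38–$0.52.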

Bottom Line

MU is strategically positioned to benefit from the burgeoning AI market, driven by its diversified portfolio of advanced memory and storage solutions, strategic partnerships and investments, robust financial health characterized by solid revenue growth and profitability, and expanding market presence.

The company’s recent innovations, including HBM3E and DDR5 RDIMM memory, underscore its commitment to advancing capabilities across AI and high-performance computing applications.

Moreover, the company’s second-quarter 2024 earnings beat analysts' expectations, supported by the AI boom. Also, Micron offered a rosy guidance for the third quarter of fiscal 2024. Investors eagerly await insights into MU’s financial performance, strategic updates, and outlook during the third-quarter earnings conference call scheduled for June 26, 2024.

Baird Senior Research Analyst Tristan Gerra upgraded MU stock from “Neutral” to “Outperform” and raised his price target from $115 to $150, citing meaningful upside opportunities for the company. Gerra noted that DRAM chip pricing has been rising while supply growth is anticipated to slow. Also, Morgan Stanley raised its rating on Micron from “Underweight” to “Equal-Weight.”

As AI investments from numerous sectors continue to grow, Micron stands to capture significant market share, making it an attractive option for investors seeking long-term growth in the semiconductor sector.

The Future of NVIDIA: Post-Split Valuation and Growth Projections

NVIDIA Corporation (NVDA), a prominent force in the AI and semiconductor technology industries, announced a 10-for-1 forward stock split of the company’s issued common stock during its last earnings release in May. Shareholders of record as of June 6 received nine additional shares for each share held after the close on Friday, June 7. Trading began on a split-adjusted basis at market open on June 10.

This strategic move is poised to reshape the landscape for Nvidia investors and the broader tech market.

Post-Split Valuation

NVDA was already a leading AI stock in the market, but investor interest in the chipmaker intensified as its 10-for-1 stock split took effect after the market’s close on June 7, multiplying the share count of the hottest stock on the S&P 500 tenfold.

Moreover, NVIDIA’s stock has gained more than 158% over the past six months and nearly 222% over the past year. Notably, the stock is up over 3,222% over the past five years. During this remarkable run, Nvidia’s market cap of around $3 trillion surpassed those of Amazon (AMZN) and Alphabet Inc. (GOOGL). Before the 10-for-1 split, the stock traded at a lofty $1,209.
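Mechanically, a 10-for-1 forward split divides the share price and multiplies the share count by ten, leaving the value of any position unchanged. A quick sketch using the $1,209 pre-split price (the 100-share position is illustrative):

```python
def forward_split(price: float, shares: int, ratio: int) -> tuple[float, int]:
    """Apply an N-for-1 forward split: price divided by N, shares multiplied by N."""
    return price / ratio, shares * ratio

price, shares = 1209.0, 100            # pre-split; 100 shares is illustrative
new_price, new_shares = forward_split(price, shares, 10)
print(new_price, new_shares)           # 120.9 1000
# The split itself does not change the position's market value:
assert abs(price * shares - new_price * new_shares) < 1e-6
```

The lower per-share price is what makes the stock more accessible; no value is created or destroyed by the split itself.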

The chip giant’s strategic decision to split its stock follows a broader trend among tech giants to make stock ownership more affordable and appealing to retail investors. With more individual investors gaining access to Nvidia’s shares post-split, increased trading activity and demand could follow, potentially driving share prices higher.

According to data from BofA research, companies that announce stock splits have historically delivered total returns of about 25% in the 12 months following the split, versus 12% gains for the S&P 500. Thus, stock splits are seen as a bullish signal, often accompanied by positive investor sentiment and increased buying activity.

Solid Earnings And A Healthy Outlook

The stock split isn’t the only reason for NVDA’s latest bull run. The company also reported better-than-expected revenue and earnings in the fiscal 2025 first quarter, driven by robust demand for its AI chips. During the quarter that ended April 28, 2024, Nvidia’s revenue rose 262% year-over-year to $26.04 billion. That surpassed the consensus revenue estimate of $24.59 billion.

The company’s largest business segment, Data Center, which includes its AI chips and several additional parts required to run big AI servers, reported a record revenue of $22.60 billion, up 427% year-over-year.

“Our data center growth was fueled by strong and accelerating demand for generative AI training and inference on the Hopper platform. Beyond cloud service providers, generative AI has expanded to consumer internet companies, and enterprise, sovereign AI, automotive and healthcare customers, creating multiple multibillion-dollar vertical markets,” said Jensen Huang, founder and CEO of NVDA.

“We are poised for our next wave of growth. The Blackwell platform is in full production and forms the foundation for trillion-parameter-scale generative AI,” Huang added. During a call with analysts, the CEO mentioned that there would be significant Blackwell revenue this year and that the new chip would be deployed in data centers by the fourth quarter.

The chipmaker’s non-GAAP gross profit grew 328.2% from the previous year’s quarter to $20.56 billion. NVDA’s non-GAAP operating income was $18.06 billion, an increase of 491.7% year-over-year. Its non-GAAP net income rose 461.7% year-over-year to $15.24 billion. Also, it posted a non-GAAP EPS of $6.12, compared to analysts’ estimate of $5.58, and up 461.5% year-over-year.

Furthermore, NVIDIA’s cash, cash equivalents and marketable securities were $31.44 billion as of April 28, 2024, compared to $25.98 billion as of January 28, 2024.

According to its outlook for the second quarter of fiscal 2025, the company expects revenue to be $28 billion, plus or minus 2%. Its non-GAAP gross margin is expected to be 75.5%, plus or minus 50 basis points. NVDA’s non-GAAP operating expenses are anticipated to be approximately $2.8 billion.
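A “plus or minus 2%” revenue band and a “±50 basis points” margin band resolve to concrete bounds as follows (a basis point is one-hundredth of a percentage point); the sketch below is illustrative arithmetic on the guidance figures above:

```python
def pct_range(midpoint: float, frac_tol: float) -> tuple[float, float]:
    """Bounds for guidance stated as midpoint ± a fraction of the midpoint."""
    return midpoint * (1 - frac_tol), midpoint * (1 + frac_tol)

def bps_range(margin_pct: float, bps: float) -> tuple[float, float]:
    """Bounds for a margin (in percent) ± a tolerance given in basis points."""
    return margin_pct - bps / 100, margin_pct + bps / 100

rev_lo, rev_hi = pct_range(28.0, 0.02)     # $28B ± 2%
gm_lo, gm_hi = bps_range(75.5, 50)         # 75.5% ± 50 bps
print(round(rev_lo, 2), round(rev_hi, 2))  # 27.44 28.56
print(gm_lo, gm_hi)                        # 75.0 76.0
```

In other words, the outlook implies revenue of roughly $27.4–$28.6 billion and a gross margin between 75.0% and 76.0%.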

Raised Dividends

NVDA raised its dividend payouts to reward shareholders and demonstrate confidence in its financial strength and growth prospects. The company increased its quarterly cash dividend by 150% from $0.04 per share to $0.10 per share of common stock. The dividend is equivalent to $0.01 per share on a post-split basis and will be paid on June 28 to all shareholders of record on June 11.
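The two dividend figures are consistent: dividing the $0.10 pre-split payout by the 10-for-1 ratio gives the $0.01 post-split equivalent, and a holder’s total payout is identical either way. A minimal check (the 100-share position is illustrative):

```python
div_pre, ratio = 0.10, 10
div_post = div_pre / ratio
print(round(div_post, 4))              # 0.01 per post-split share

shares_pre = 100                       # illustrative pre-split position
payout_pre = shares_pre * div_pre
payout_post = (shares_pre * ratio) * div_post
assert abs(payout_pre - payout_post) < 1e-9   # $10.00 either way
```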

While Nvidia's dividend yield is modest compared to its tech peers, its considerable cash flow and strong balance sheet provide ample room for growth.

Dominance in AI and Data Center Markets Fuels Unprecedented Growth Opportunities

NVDA is strategically positioned at the forefront of the AI and data center markets, with high demand for its AI chips for data processing, training, and inference coming from large cloud service providers, GPU-specialized cloud providers, enterprise software firms, and consumer internet companies. In addition, vertical industries, led by automotive, financial services, and healthcare, are driving demand.

Statista projects the generative AI (GenAI) market to reach $36.06 billion in 2024, with the U.S. accounting for the largest market size of $11.66 billion. Further, the GenAI market is expected to total $356.10 billion by 2030, expanding at a CAGR of 46.5% from 2024 to 2030.

Over the past year, Nvidia has experienced a significant surge in sales due to robust demand from tech giants like Google, Microsoft Corporation (MSFT), Meta Platforms, Inc. (META), Amazon, and OpenAI, which have invested billions of dollars in Nvidia’s advanced GPUs essential for developing and deploying AI applications. In January, META said it plans to amass 350,000 of Nvidia’s high-end H100 graphics cards by the end of 2024.

As a result, NVDA holds a market share of about 92% in the data center GPU market for generative AI applications.

Bottom Line

NVDA’s recent 10-for-1 stock split has significantly impacted its valuation and market appeal. This strategic move not only made Nvidia’s shares more accessible to retail investors but also fueled increased trading activity and demand. The split took effect after Friday’s close, increasing the share count tenfold and reflecting the heightened investor interest.

NVIDIA's strong financial performance, as evidenced by the fiscal 2025 first quarter report, further solidifies its position in the AI and data center market. The company reported threefold revenue growth, driven by the massive demand for its AI processors from major tech companies, including Microsoft, Meta, Amazon, Google, and OpenAI.

The chipmaker’s remarkable growth has propelled it to the third-largest market capitalization globally, surpassing peers such as AMZN and META.

Further, the company’s revenue and EPS for the fiscal year ending January 2025 are expected to grow 97.9% and 108.9% year-over-year to $120.55 billion and $27.07, respectively. For fiscal year 2026, analysts expect its revenue and EPS to increase 32.4% and 32.6% from the prior year to $159.55 billion and $35.90, respectively. With a healthy outlook for the future, NVDA continues to attract investors looking for long-term growth opportunities.

Moreover, the recent decision to raise dividends by 150% showcases NVDA's confidence in its financial strength and growth prospects, making it more attractive to income-oriented investors. This move, coupled with the stock split, appeals to different investor demographics and reflects NVDA's commitment to rewarding shareholders while positioning itself for future growth in the AI and semiconductor sectors.

Intel's AI Ambitions: A Strategic Shift Toward Private Data Storage Solutions

Intel Corporation (INTC), a titan in the world of semiconductors, is navigating a period of transformative change that is revolutionizing its corporate culture and product development. Traditionally, Intel’s core offerings have been microprocessors that serve as the brains of desktop PCs, laptops and tablets, and servers. These processors are silicon wafers embedded with millions or billions of transistors, each acting as binary switches that form the fundamental ‘ones and zeros’ of computer operations.

Today, the thirst for enhanced processing power is insatiable. The proliferation of Artificial Intelligence (AI), which has become integral to essential business operations across almost every sector, exponentially increases the need for robust computing capabilities. AI, particularly neural networks, necessitates enormous computing power and thrives on the collaborative efforts of multiple computing systems. The scope of these AI applications extends far beyond the PCs and servers that initially cemented INTC’s status as an industry leader.

The rapid advancement of AI has prompted Intel to rethink and innovate its chip designs and functionalities. As a result, the company is developing new software and designing interoperable chips while exploring external partnerships to accelerate its adaptation to the evolving computing environment.

Strategic Pivot Toward AI Ecosystem

At Computex 2024, INTC unveiled a series of groundbreaking AI-related announcements, showcasing the latest technologies that merge cutting-edge performance with power efficiency (especially in data centers and for AI on personal computers). The company aims to make AI cheaper and more accessible for everyone.

Intel CEO Pat Gelsinger emphasized how AI is changing the game, stating, “The magic of silicon is once again enabling exponential advancements in computing that will push the boundaries of human potential and power the global economy for years to come.”

In just six months, Intel has moved from launching 5th Gen Intel® Xeon® processors to introducing the pioneering Xeon 6 series. The company also previewed Gaudi AI accelerators, offering enterprise clients a cost-effective GenAI training and inference system. Furthermore, Intel has spearheaded the AI PC revolution by integrating Intel® Core™ Ultra processors in over 8 million devices while teasing the upcoming client architecture slated for release later this year.

These strides underscore Intel's commitment to accelerating execution and driving innovation at an unprecedented pace to democratize AI and catalyze industries.

Strategic Pricing and Availability of Its Gaudi AI Accelerators

Intel is gearing up to launch the third generation of its Gaudi AI accelerators later this year, aiming to address a backlog of around $2 billion related to AI chips. However, the company anticipates generating only about $500 million in Gaudi 3 sales in 2024, possibly due to supply constraints.

To broaden the availability of Gaudi 3 systems, Intel is expanding its network of system providers. The company is now collaborating with Asus, Foxconn, Gigabyte, Inventec, Quanta, and Wistron alongside existing partners like Dell Technologies Inc. (DELL), Hewlett Packard Enterprise Co (HPE), Lenovo Group (LNVGY), and Super Micro Computer, Inc. (SMCI), to ensure Gaudi 3 systems are available far and wide once they hit the market.

But what caught attention at Intel's announcement was the company's attractive pricing strategy. Kits featuring eight Gaudi 2 AI chips and a universal baseboard will cost $65,000, while the version with eight Gaudi 3 AI chips will be priced at $125,000. These prices are estimated to be one-third and two-thirds of the cost of comparable competitive platforms, respectively.

While undercutting Nvidia Corporation (NVDA) on price, INTC expects its chips to deliver impressive performance. According to Intel’s estimates, a cluster of 8,192 Gaudi 3 chips can train AI models up to 40% faster than NVDA’s H100 chips. Additionally, Gaudi 3 offers up to double the AI inferencing performance of the H100 when running popular large language models (LLMs).

Intel Continues to Ride with 500+ Optimized Models on Core Ultra Processors

In May, INTC announced that over 500 AI models now run optimized on new Intel® Core™ Ultra processors. These processors, known for their advanced AI capabilities, immersive graphics, and optimal battery life, mark a significant milestone in Intel's AI PC transformation efforts.

This achievement stems from Intel's investments in client AI, framework optimizations, and tools like the OpenVINO™ toolkit. The 500+ AI models cover various applications, including large language models, super-resolution, object detection, and computer vision, and are available across popular industry platforms.

The Intel Core Ultra processor is the fastest-growing AI PC processor and the most robust platform for AI PC development. It supports a wide range of AI models, frameworks, and runtimes, making it ideal for AI-enhanced software features like object removal and image super-resolution. This milestone underscores Intel's commitment to advancing AI PC technology, offering users a broad range of AI-driven functionalities for enhanced computing experiences.

Robust Financial Performance and Outlook

Buoyed by solid innovation across its client, edge, and data center portfolios, the company delivered a strong financial performance, driving double-digit revenue growth in its products. Total Intel Products chalked up $11.90 billion in revenue for the first quarter of 2024 (ended March 30), a 17% increase over the prior year’s period. Revenue from the Client Computing Group (CCG) rose 31% year-over-year.

INTC’s net revenue increased 8.6% year-over-year to $12.72 billion, primarily driven by growth in its personal computing, data center, and AI businesses. Intel’s Data Center and AI (DCAI) division, which offers server chips, saw sales rise 5% to $3.04 billion.

Also, the company reported non-GAAP operating income of $723 million, compared to an operating loss of $294 million in the prior year’s quarter. Further, its non-GAAP net income and earnings per share came in at $759 million and $0.18, versus a net loss of $169 million and a loss per share of $0.04 in the same quarter last year.

For the second quarter, Intel expects its revenue to come between $12.5 billion and $13.5 billion, while its non-GAAP earnings per share is expected to be $0.10.

Bottom Line

Despite vital innovations and solid financial performance, INTC’s shares have lost nearly 40% year-to-date and more than 3% over the past 12 months. However, with over 5 million AI PCs shipped since the December 2023 launch of Intel Core Ultra processors, supported by over 100 software vendors, the company expects to exceed its forecast of 40 million AI PCs by the end of 2024.

With the growing demand for AI chips, INTC could see a significant increase in Gaudi chip sales next year as customers look for cost-effective alternatives to NVDA's market-leading products. Moreover, if Intel's reasonable pricing resonates with prospective customers, the company could capture significant market share from its competitors.

Investing in AI: Should You Bet on AMD, Broadcom, or NVIDIA?

Is NVDA the Top Player in AI Stocks?

Initially famed for gaming GPUs, NVIDIA Corporation (NVDA) has evolved into a leader in data center hardware, spearheading AI advancement. The company’s Hopper GPUs are in high demand, accelerating AI applications from recommendation engines to natural language processing and generative AI large language models like ChatGPT on NVIDIA platforms. At this point, NVDA’s dominance in AI and data center markets is undeniable.

For the first quarter that ended April 28, 2024, Nvidia’s revenue more than tripled year-over-year to a record $26.04 billion. NVIDIA’s Data Center segment (primarily connected to its AI operations) chalked up $22.60 billion in revenue, a 23% sequential gain and a massive 427% rise over the same period last year.

The chip giant’s operating income surged 690% from the year-ago value to $16.91 billion. NVIDIA’s non-GAAP net income amounted to $15.24 billion, or $6.12 per share, compared to $2.71 billion, or $1.09 per share, in the previous year’s quarter.

Buoyed by a robust financial position, NVDA increased its quarterly dividend by 150% from $0.04 per share to $0.10 per share of common stock. The increased dividend is equivalent to $0.01 per share on a post-split basis and will be paid to its shareholders on June 28, 2024.

Moving forward, the company guided for $28 billion in revenue, plus or minus 2%, for the second quarter of fiscal 2025, representing a projected 7.5% sequential gain. Its non-GAAP gross margin is expected to be 75.5%, plus or minus 50 basis points.

Analysts expect NVDA’s revenue for the fiscal 2025 second quarter (ending July 2024) to increase 109.7% year-over-year to $28.32 billion. The consensus EPS estimate of $6.35 for the current quarter indicates a 135.1% improvement year-over-year. Moreover, the company has an excellent earnings surprise history, surpassing the consensus EPS estimates in each of the trailing four quarters.

Nvidia’s comprehensive offerings, from chips to boards, systems, software, services, and supercomputing time, cater to expanding markets and diversify its revenue streams. Moreover, the chipmaker’s shares have surged more than 130% over the past six months and nearly 190% over the past year. NVIDIA's trajectory suggests an unstoppable momentum fueled by AI adoption mirroring a similar upward curve, promising a bright future.

Amid this, do AI stocks Broadcom Inc. (AVGO) and Advanced Micro Devices, Inc. (AMD) stand a chance to be as big as the industry leader, NVIDIA? Let’s fundamentally analyze them to find the answer.

Broadcom Inc. (AVGO)

Broadcom Inc. (AVGO) is emerging as one of Nvidia's toughest rivals in the race for networking revenue, especially as data centers undergo rapid transformation for the AI era. As a global tech leader, AVGO designs, develops, and supplies semiconductor and infrastructure software solutions. The company produces custom AI accelerators for major clients and recently projected $7 billion in sales from its two largest customers in 2024, who are widely believed to be Alphabet Inc. (GOOGL) and Meta Platforms, Inc. (META).

AVGO will announce its fiscal 2024 second-quarter earnings on June 12. Forecasts indicate a 37.4% year-over-year revenue surge to $12 billion, reflecting steady growth and financial resilience. Moreover, analysts expect a 5% uptick in the company’s EPS from the preceding year’s period to $10.84.

Broadcom has consistently exceeded consensus revenue and EPS estimates in each of the trailing four quarters. In the fiscal first quarter, its net revenue increased 34% year-over-year to $11.96 billion, with triple-digit revenue growth in the Infrastructure Software segment to $4.57 billion. AVGO’s gross profit grew 22.8% from the year-ago value to $7.37 billion.

On top of that, the company’s non-GAAP net income for the quarter came in at $5.25 billion, or $10.99 per share, up 17.2% and 6.4% year-over-year, respectively. Also, its adjusted EBITDA increased from the prior-year quarter to $7.16 billion.

Looking ahead, the company forecasts nearly $50 billion in revenues for fiscal year 2024, with adjusted EBITDA projected to be approximately 60% of its revenue. The company anticipates a 30% year-over-year surge in networking sales, driven by accelerated deployments of networking connectivity and the expansion of AI accelerators in hyperscalers. It also expects generative AI to account for 25% of semiconductor revenue.

The artificial intelligence megatrend is poised to significantly drive Broadcom's revenue and earnings growth in the upcoming decade. During a recent earnings call, Broadcom CEO Hock Tan emphasized, “Strong demand for our networking products in AI data centers, as well as custom AI accelerators from hyperscalers, are driving growth in our semiconductor segment.”

On May 20, 2024, AVGO announced its latest portfolio of highly scalable, high-performing, low-power 400G PCIe Gen 5.0 Ethernet adapters to revolutionize the data center ecosystem. These products offer an enhanced, open, standards-based Ethernet NIC and switching solution to resolve connectivity bottlenecks as XPU bandwidth and cluster sizes grow rapidly in AI data centers.

Patrick Moorhead, CEO & chief analyst at Moor Insights and Strategy, noted, “As the industry races to deliver generative AI at scale, the immense volumes of data that must be processed to train LLMs require even larger server clusters. Scalable high bandwidth, low latency connectivity is critical for maximizing the performance of these AI clusters.”

He added, “Ethernet presents a compelling case as the networking technology of choice for next-generation AI workloads. The 400G NICs offered by Broadcom, built on its success in delivering Ethernet at scale, offers open connectivity at an attractive TCO for power-hungry AI applications.”

With the company's expanding presence in the AI space, Broadcom stands out as a compelling alternative to major chip companies such as NVDA and AMD. Over the past six months, shares of AVGO have gained more than 42%, and nearly 63% over the past year, making it an attractive addition to your investment portfolio.

Advanced Micro Devices, Inc. (AMD)

Advanced Micro Devices, Inc. (AMD) has been at the forefront of innovation in high-performance computing, graphics, and visualization technologies for decades. While NVDA may be the first name that comes to mind in AI processor sales, AMD has established itself as a formidable competitor in the GPU space, particularly excelling in chips tailored for AI workloads.

However, AMD's influence doesn't stop at hardware; the company has also been actively expanding its AI software ecosystem. It recently unveiled the groundbreaking AMD Ryzen™ AI 300 Series processors, featuring the world’s most powerful Neural Processing Unit (NPU). These processors are designed to bring AI capabilities directly to next-gen PCs, promising a future where AI-infused computing is seamlessly integrated into everyday tasks.

Additionally, the next-gen AMD Ryzen™ 9000 Series processors for desktops solidify AMD’s position as a leader in performance and efficiency for gamers, content creators, and prosumers alike.

Moreover, the company’s comprehensive roadmap for the Instinct accelerator series promises an annual cadence of cutting-edge AI performance and memory capabilities across each generation. Beginning with the imminent release of the AMD Instinct MI325X accelerator in Q4 2024, followed by the anticipated launch of the AMD Instinct MI350 series powered by the new AMD CDNA™ 4 architecture in 2025, AMD is poised to deliver up to a 35x increase in AI inference performance compared to its previous iterations.

In the first quarter that ended March 30, 2024, AMD’s non-GAAP revenue increased 2.2% year-over-year to $5.47 billion. Both its Data Center and Client segments experienced substantial growth, each exceeding 80% year-over-year, fueled by the uptake of MI300 AI accelerators and the popularity of Ryzen and EPYC processors.

Moreover, the company’s non-GAAP operating income grew 3.2% from the year-ago value to $1.13 billion. Its non-GAAP net income and earnings per share rose 4.4% and 3.3% from the prior-year quarter to $1.01 billion and $0.62, respectively.

AMD expects its revenue in the second quarter of 2024 to be around $5.7 billion, with a projected growth of 6% year-over-year and 4% sequentially. Meanwhile, its non-GAAP gross margin is expected to be around 53%.

Street expects AMD’s revenue for the second quarter (ending June 2024) to increase 6.7% year-over-year to $5.72 billion. Its EPS for the ongoing quarter is projected to reach $0.68, registering a 17% year-over-year growth. Moreover, the company surpassed the consensus revenue estimates in each of the trailing four quarters.

While Nvidia’s Data Center segment posted quarterly sales last quarter that imply an annualized run rate of roughly $90 billion, experts predict the company could surpass the $100 billion mark in Data Center sales at this momentum. In contrast, AMD's recent guidance forecasts sales of $3.5 billion for its MI300 AI chips in 2024. There’s still a sizable gap between NVIDIA and AMD in AI revenue. To put things into perspective, NVDA's networking revenue alone is approximately four times larger than AMD's total AI chip sales.

Nonetheless, AMD is poised to drive AI innovation across various domains with a diverse portfolio spanning cloud, edge, client, and beyond. The stock has gained more than 55% and 39% over the past nine months and a year, respectively.

Bottom Line

With the global artificial intelligence (AI) market projected to soar from $214.6 billion in 2024 to $1.34 trillion by 2030 (exhibiting a CAGR of 35.7%), leading chip companies, including NVIDIA, Broadcom, and Advanced Micro Devices, are rapidly expanding their market presence, vying for a piece of the pie.

Given their solid fundamentals and promising long-term outlooks, NVDA, AVGO, and AMD appear in good shape to thrive in the foreseeable future. Thus, investors can place their bets on these stocks to garner profitable returns and capitalize on the upward curve of AI.

Why Nvidia’s Stock Split Could Drive Further Market Gains

NVIDIA Corporation (NVDA) shares topped $1,000, a record high, in a post-earnings rally. Last week, the company reported fiscal 2025 first-quarter results that beat analyst expectations for revenue and earnings, reinforcing investor confidence in the AI-driven boom in chip demand. Moreover, the stock has surged nearly 120% over the past six months and more than 245% over the past year.

Meanwhile, the chipmaker announced a 10-for-1 forward split of its issued common stock, making stock ownership more accessible to employees and investors.
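The mechanics of a forward split are simple arithmetic: the share count is multiplied by the split ratio and the per-share price is divided by it, leaving market capitalization unchanged. A minimal sketch (the pre-split price and share count below are hypothetical placeholders, not NVIDIA's actual figures):

```python
# A 10-for-1 forward split multiplies the share count by the split ratio
# and divides the per-share price by it; market cap is unchanged.
price_pre = 1000.0            # hypothetical pre-split price (USD)
shares_pre = 2_460_000_000    # hypothetical share count

split_ratio = 10
price_post = price_pre / split_ratio    # 100.0
shares_post = shares_pre * split_ratio

# Market capitalization before and after is identical by construction.
assert price_post * shares_post == price_pre * shares_pre
print(f"Post-split price: ${price_post:.2f}")
```

This is why a split, by itself, creates no value; any price effect comes from broader accessibility and investor behavior, not from the arithmetic.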

Let's delve deeper into how NVIDIA’s stock split decision could attract more investors and propel future gains.

The AI Chip Leader

NVDA’s prowess in AI and semiconductor technology has been nothing short of remarkable. Its GPUs (Graphics Processing Units) have become synonymous with cutting-edge AI applications, from powering self-driving cars and training and deploying large language models (LLMs) to revolutionizing healthcare diagnostics and e-commerce recommendation systems.

Amid a rapidly evolving technological landscape, NVIDIA has consistently remained at the forefront, driving innovation and redefining industry standards. Led by Nvidia, the U.S. dominates the generative AI tech market. ChatGPT’s launch in November 2022 played a pivotal role in catalyzing the “AI boom.”

NVDA holds a market share of about 92% in the data center GPU market for generative AI applications. Its chips, prized for their high performance across diverse applications, are sought after by tech giants including Amazon (AMZN), Meta Platforms, Inc. (META), Microsoft Corporation (MSFT), Alphabet Inc. (GOOGL), and Tesla, Inc. (TSLA).

Nvidia surpassed analyst estimates for revenue and earnings in the first quarter of fiscal 2025, driven by robust demand for its AI chips. In the first quarter that ended April 28, 2024, NVIDIA’s revenue rose 262% year-over-year to $26.04 billion. That topped analysts’ revenue expectations of $24.59 billion. The company reported a record revenue from its Data Center segment of $22.60 billion, up 427% from the prior year’s quarter.

“Our data center growth was fueled by strong and accelerating demand for generative AI training and inference on the Hopper platform. Beyond cloud service providers, generative AI has expanded to consumer internet companies, and enterprise, sovereign AI, automotive and healthcare customers, creating multiple multibillion-dollar vertical markets,” said Jensen Huang, founder and CEO of NVDA.

“We are poised for our next wave of growth. The Blackwell platform is in full production and forms the foundation for trillion-parameter-scale generative AI,” Huang added. 

NVDA’s non-GAAP gross profit grew 328.2% from the year-ago value to $20.56 billion. The company’s non-GAAP operating income was $18.06 billion, an increase of 491.7% from the prior year’s quarter. Its non-GAAP net income rose 461.7% year-over-year to $15.24 billion.

Furthermore, the chipmaker reported non-GAAP EPS of $6.12, compared to the consensus estimate of $5.58, and up 461.5% year-over-year.

Nvidia’s Stock Split: A Strategic Move

Alongside its outstanding fiscal 2025 first-quarter earnings, NVDA announced a 10-for-1 stock split of its issued common stock. Nvidia’s decision to split its stock aligns with a broader trend among tech giants to make their shares more appealing to a wider range of investors, particularly retail investors. The chipmaker aims to democratize ownership and attract a vast investor base by breaking down the barrier of high share prices.

As more individual investors gain access to Nvidia’s shares post-stock split, we could see heightened trading activity and increased demand, potentially exerting upward pressure on its share prices. This strategic move reflects the confidence of NVIDIA’s management in its future growth trajectory and underscores its commitment to inclusivity in the investment landscape.

Bank of America analysts, led by Jared Woodard, head of the bank’s research investment committee, described the share split as “another large-cap tech pursuing shareholder-friendly policies” in a note to clients.

NVIDIA marks the fourth Magnificent Seven big tech company to announce a stock split since 2022, following Google, Amazon, and Tesla’s efforts to make shares more accessible, according to Woodard and his team.

In recent years, as the share prices of several Big Tech companies surged past the $500 mark, buying shares became challenging for retail investors. Consequently, these companies have been exploring ways to make it simpler for nonprofessional investors to buy in. BofA added that “Big Tech is going bite-sized” to lure retail investors, which might signal more market-beating returns.

Historical Data Suggests That Stock Splits Indicate a Bullish Outlook

Examining historical data on stock splits reveals a generally positive picture. While immediate post-split gains aren’t guaranteed, companies like Apple Inc. (AAPL) and Google have witnessed substantial appreciation in their share prices following splits. AAPL’s 4-for-1 stock split, which took effect in August 2020, primarily influenced investor sentiment and trading dynamics.

Following the split, Apple’s stock continued its upward trajectory, driven by solid performance in its core businesses, including iPhone sales, services revenue, and wearables. Throughout the latter half of 2020 and into 2021, its share price experienced significant appreciation, reaching new all-time highs.

Given NVIDIA’s robust fundamentals and leadership in AI and semiconductor technology, there’s reason to believe that its recent stock split could lead to similar outcomes.

BofA’s sell-side analysts have consistently been bullish on Nvidia shares, and following the first-quarter earnings release, they raised their lofty 12-month price target for the chip giant from $1,100 to $1,320. If the outlook proves accurate, Nvidia shares could surge by another 26%, and the stock split could support that bullish move, as per Bank of America’s reading of history.

“Splits have boosted returns in every decade, including the early 2000s when the S&P 500 struggled,” noted Woodard and his team. BofA’s research indicates that, historically, stocks have delivered 25% total returns in the 12 months following a stock split, compared to 12% for the S&P 500.

Further, the bank highlighted that stock splits often ignite bullish runs, even in stocks that have been underperforming. For example, both Advanced Micro Devices, Inc. (AMD) and Valero Energy Corporation (VLO) experienced significant share price increases after announcing stock splits despite their prior poor performance. According to analysts, “Since gains are more common and larger than losses on average, splits appear to introduce upside potential into markets.”

However, it's essential to heed the standard caveat provided by the Securities and Exchange Commission (SEC): “Past performance is not indicative of future results.” In line with this, Bank of America emphasized that “outperformance is no guarantee” after a stock split. Companies still see negative returns 30% of the time following a split, with an average decline of 22% over the subsequent 12 months.

The analysts noted, “While splits could be an indication of strong momentum, companies can struggle in a challenging macro environment.” They pointed to companies like Amazon, Google, and Tesla that faced difficulties in the 12 months following their stock splits in 2022 due to a high interest-rate environment.

Bottom Line

NVDA plays a significant role as a global leader in AI and semiconductor technology, with its GPUs driving innovations across numerous industries, including tech, automotive, healthcare, and e-commerce. Nvidia’s fiscal 2025 first-quarter results suggest that demand for its AI chips remains robust.

Statista projects the global generative AI market to reach $36.06 billion in 2024. This year, the U.S. is expected to maintain its position as the leader in AI market share, with a total of $11.66 billion. Further, the market is estimated to grow at a CAGR of 46.5%, resulting in a market volume of $356.10 billion by 2030. The AI market’s bright outlook should bode well for NVDA.
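As a quick arithmetic check, the Statista figures above are internally consistent: compounding the 2024 base at the stated CAGR for six years lands very close to the 2030 projection. A sketch for verification only (the variable names are ours, not from the source):

```python
# Sanity-check the Statista projection: $36.06B in 2024 compounding at a
# 46.5% CAGR through 2030 should land near the stated $356.10B.
base_2024 = 36.06        # generative AI market size, USD billions (2024)
cagr = 0.465             # stated compound annual growth rate
years = 2030 - 2024      # six compounding periods

projected_2030 = base_2024 * (1 + cagr) ** years
print(f"Projected 2030 market size: ~${projected_2030:.0f}B")  # ≈ $356B
```

The small residual versus the published $356.10B is just rounding in the quoted CAGR.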

The company also recently made headlines with its announcement of a 10-for-1 stock split. While stock splits generally do not change the fundamental value of a company, they make its shares more accessible and attractive to retail investors. So, the recent split could significantly increase retail participation, driving heightened trading activity and potentially exerting upward pressure on Nvidia’s share prices.

Historically, stock splits have generally had a positive impact on stock performance. Companies like AAPL, GOOGL, and AMD experienced substantial price appreciation after their splits, with enhanced accessibility for retail investors driving higher demand and liquidity.

However, it is crucial to acknowledge that past performance is not indicative of future results. While stock splits can signal strong price momentum, they do not guarantee outperformance.

In conclusion, Nvidia’s stock split will likely attract more retail investors, potentially boosting trading activity and supporting stock price appreciation. Coupled with the company’s strong position in the AI and semiconductor markets, the split could facilitate further growth, aligning with historical trends of positive post-split performance.