Are Nvidia’s GPUs a Game-Changer for Investors?

NVIDIA Corporation (NVDA), a tech giant advancing AI through its cutting-edge graphics processing units (GPUs), became the third U.S. company to exceed a staggering market capitalization of $3 trillion in June, after Microsoft Corporation (MSFT) and Apple Inc. (AAPL). This significant milestone marks nearly a doubling of its value since the start of the year. Nvidia’s stock has surged more than 159% year-to-date and around 176% over the past year.

What drives the company’s exceptional growth, and how do Nvidia GPUs translate into significant financial benefits for cloud providers and investors? This piece will explore the financial implications of investing in NVIDIA GPUs, the impressive ROI metrics for cloud providers, and the company’s growth prospects in the AI GPU market.

Financial Benefits of NVDA’s GPUs for Cloud Providers

During the Bank of America Securities 2024 Global Technology Conference, Ian Buck, Vice President and General Manager of NVDA’s hyperscale and HPC business, highlighted the substantial financial benefits cloud providers reap by investing in NVIDIA GPUs.

Buck illustrated that for every dollar spent on NVIDIA GPUs, cloud providers can generate five dollars over four years. This return on investment (ROI) becomes even more impressive for inferencing tasks, where the profitability rises to seven dollars per dollar invested over the same period, with this figure continuing to increase.
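As a back-of-the-envelope check (a reader's illustration, not a figure from NVIDIA), those payback multiples can be converted into implied annualized returns:

```python
def annualized_return(multiple: float, years: float) -> float:
    """Convert a total return multiple earned over `years` into an annualized rate (CAGR)."""
    return multiple ** (1 / years) - 1

# $5 back per $1 spent over four years, and $7 per $1 for inferencing:
compute = annualized_return(5, 4)    # ~0.495, i.e., roughly 49.5% per year
inference = annualized_return(7, 4)  # ~0.627, i.e., roughly 62.7% per year
```

On these numbers, even the baseline fivefold payback implies an annualized return near 50%, which helps explain cloud providers' willingness to keep buying.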

This compelling ROI is driven by the superior performance and efficiency of Nvidia’s GPUs, which enable cloud providers to offer enhanced services and handle more complex workloads, particularly in the realm of AI. As AI applications expand across various industries, the demand for high-performance inference solutions escalates, further boosting the financial benefits for cloud providers utilizing NVIDIA’s technology.

NVDA’s Progress in AI and GPU Innovations

NVIDIA’s commitment to addressing the surging demand for AI inference is evident in its continuous innovation and product development. The company introduced cutting-edge products like NVIDIA Inference Microservices (NIMs), designed to support popular AI models such as Llama, Mistral, and Gemma.

These optimized inference microservices for deploying AI models at scale facilitate seamless integration of AI capabilities into cloud infrastructures, enhancing efficiency and scalability for cloud providers.

In addition to NIMs, NVDA is also focusing on its new Blackwell GPU, engineered particularly for inference tasks and energy efficiency. The upcoming Blackwell model is expected to ship to customers later this year. While there may be initial shortages, Nvidia remains optimistic. Buck noted that each new technology phase brings supply and demand challenges, as they experienced with the Hopper GPU.

Furthermore, the early collaboration with cloud providers on the forthcoming Rubin GPU, slated for a 2026 release, underscores the company’s strategic foresight in aligning its innovations with industry requirements.

Nvidia’s GPUs Boost its Stock Value and Earnings

The financial returns of investing in Nvidia GPUs benefit cloud providers considerably and have significant implications for NVDA’s stock value and earnings. With a $4 trillion market cap within sight, the chip giant’s trajectory suggests continued growth and potential for substantial returns for investors.

NVDA’s fiscal 2025 first-quarter earnings topped analysts’ expectations and exceeded the high bar set by investors, as Data Center sales rose to a record high amid booming AI demand. For the quarter that ended April 28, 2024, the company posted record revenue of $26 billion, up 262% year-over-year. That compared to the consensus revenue estimate of $24.56 billion.

The chip giant’s quarterly Data Center revenue was $22.60 billion, an increase of 427% from the prior year’s quarter. Its non-GAAP operating income rose 492% year-over-year to $18.06 billion. NVIDIA’s non-GAAP net income grew 462% from the prior year’s quarter to $15.24 billion. In addition, its non-GAAP EPS came in at $6.12, up 461% year-over-year.

“Our data center growth was fueled by strong and accelerating demand for generative AI training and inference on the Hopper platform. Beyond cloud service providers, generative AI has expanded to consumer internet companies, and enterprise, sovereign AI, automotive and healthcare customers, creating multiple multibillion-dollar vertical markets,” said Jensen Huang, CEO of NVDA.

“We are poised for our next wave of growth. The Blackwell platform is in full production and forms the foundation for trillion-parameter-scale generative AI. Spectrum-X opens a brand-new market for us to bring large-scale AI to Ethernet-only data centers. And NVIDIA NIM is our new software offering that delivers enterprise-grade, optimized generative AI to run on CUDA everywhere — from the cloud to on-prem data centers and RTX AI PCs — through our expansive network of ecosystem partners,” Huang added.

According to its outlook for the second quarter of fiscal 2025, Nvidia’s revenue is anticipated to be $28 billion, plus or minus 2%. The company expects its non-GAAP gross margins to be 75.5%. For the full year, gross margins are projected to be in the mid-70% range.

Analysts also appear highly bullish about the company’s upcoming earnings. NVDA’s revenue and EPS for the second quarter (ending July 2024) are expected to grow 110.5% and 135.5% year-over-year to $28.43 billion and $0.64, respectively. For the fiscal year ending January 2025, Street expects the chip company’s revenue and EPS to increase 97.3% and 111.1% year-over-year to $120.18 billion and $2.74, respectively.
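These growth rates can be sanity-checked by backing out the prior-year figures they imply (derived by the reader from the estimates above, not quoted from the article):

```python
def implied_prior_year(estimate: float, yoy_growth_pct: float) -> float:
    """Back out the year-ago figure implied by a forward estimate and its YoY growth rate."""
    return estimate / (1 + yoy_growth_pct / 100)

# A $28.43B Q2 revenue estimate at 110.5% growth implies a year-ago base of ~$13.51B
q2_base = implied_prior_year(28.43, 110.5)
# A $120.18B full-year estimate at 97.3% growth implies ~$60.91B for the prior fiscal year
fy_base = implied_prior_year(120.18, 97.3)
```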

Robust Future Growth in the AI Data Center Market

The exponential growth of AI use cases and applications across various sectors—ranging from healthcare and automotive to retail and manufacturing—highlights the critical role of GPUs in enabling these advancements. NVIDIA’s strategic investments in AI and GPU technology and its emphasis on collaboration with cloud providers position the company at the forefront of this burgeoning AI market.

As Nvidia’s high-end server GPUs are essential for training and deploying large AI models, tech giants like Microsoft and Meta Platforms, Inc. (META) have spent billions of dollars buying these chips. Meta CEO Mark Zuckerberg stated his company is “building an absolutely massive amount of infrastructure” that will include 350,000 H100 GPU graphics cards to be delivered by NVDA by the end of 2024.

NVIDIA’s GPUs are sought after by several other tech companies for superior performance, including Amazon, Microsoft Corporation (MSFT), Alphabet Inc. (GOOGL), and Tesla, Inc. (TSLA).

Notably, NVDA owns a 92% market share in data center GPUs. Led by Nvidia, U.S. tech companies dominate the burgeoning market for generative AI, with market shares of 70% to over 90% in chips and cloud services.

According to a MarketsandMarkets report, the data center GPU market is projected to be worth more than $63 billion by 2028, growing at an impressive CAGR of 34.6% during the forecast period (2024-2028). The rapidly rising adoption of data center GPUs across cloud providers should bode well for Nvidia.
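For readers who want to reproduce that projection, compounding at the report's 34.6% CAGR works as follows (the 2024 base below is derived from the report's endpoints, not quoted from it):

```python
def project(base: float, cagr: float, years: int) -> float:
    """Compound a starting value at rate `cagr` for `years` periods."""
    return base * (1 + cagr) ** years

# A 2028 value of $63B at a 34.6% CAGR over 2024-2028 implies a ~$19.2B base in 2024:
base_2024 = 63 / (1 + 0.346) ** 4
```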

Bottom Line

NVDA’s GPUs represent a game-changer for both cloud providers and investors, driven by superior performance and a compelling return on investment (ROI). The attractive financial benefits of investing in NVIDIA GPUs underscore their value, with cloud providers generating substantial profits from enhanced AI capabilities. This high ROI, particularly in AI inferencing tasks, positions Nvidia as a pivotal player in the burgeoning AI data center market, reinforcing its dominant market share and driving continued growth.

Moreover, Wall Street analysts remain bullish about this AI chipmaker’s prospects. TD Cowen analyst Matthew Ramsay increased his price target on NVDA stock from $140 to $165, while maintaining the Buy rating. “One thing remains the same: fundamental strength at Nvidia,” Ramsay said in a client note. “In fact, our checks continue to point to upside in data center (sales) as demand for Hopper/Blackwell-based AI systems continues to exceed supply.”

“Overall we see a product roadmap indicating a relentless pace of innovation across all aspects of the AI compute stack,” Ramsay added.

Meanwhile, KeyBanc Capital Markets analyst John Vinh reiterated his Overweight rating on NVIDIA stock with a price target of $180. “We expect Nvidia to deliver higher results and higher guidance” with its second-quarter 2025 report, Vinh said in a client note. He added that solid demand for generative AI will drive the upside.

As AI applications expand across various key industries, NVIDIA’s continuous strategic innovations and product developments, such as the Blackwell GPU and NVIDIA Inference Microservices, ensure the company remains at the forefront of technological advancement. With a market cap nearing $4 trillion and a solid financial outlook, NVIDIA is well-poised to deliver substantial returns for investors, solidifying its standing as a leader in the AI and GPU technology sectors.

Intel's $8.5 Billion Gamble: Can It Rival Nvidia?

Intel Corporation (INTC), a leading player in the semiconductor industry, is making headlines with its ambitious plans to transform its operations, spurred by a substantial $8.5 billion boost from the CHIPS and Science Act. The roughly $280 billion legislative package, signed into law by President Joe Biden in 2022, aims to bolster U.S. semiconductor manufacturing and research and development (R&D) capabilities.

CHIPS Act funding will help advance Intel’s commercial semiconductor projects at key sites in Arizona, New Mexico, Ohio, and Oregon. Also, the company expects to benefit from a U.S. Treasury Department Investment Tax Credit (ITC) of up to 25% on over $100 billion in qualified investments and eligibility for federal loans up to $11 billion.

In addition to CHIPS Act funding, INTC has announced plans to invest more than $100 billion in the U.S. over five years to expand chipmaking capacity critical to national security and the advancement of cutting-edge technologies, including artificial intelligence (AI).

Notably, Intel is the sole American company that both designs and manufactures leading-edge logic chips. Its strategy focuses on three pillars: achieving process technology leadership, constructing a more resilient and sustainable global semiconductor supply chain, and developing a world-class foundry business. These goals align with the CHIPS Act’s objectives to restore manufacturing and technological leadership to the U.S.

The federal funding represents a pivotal opportunity for INTC to reclaim its position as a chip manufacturing powerhouse, potentially rivaling giants like NVIDIA Corporation (NVDA) and Advanced Micro Devices, Inc. (AMD).

Intel’s Strategic Initiatives to Capitalize on AI Boom

At Computex 2024, INTC introduced cutting-edge technologies and architectures poised to significantly accelerate the AI ecosystem, from the data center, cloud, and network to the edge and PC.

The company launched Intel® Xeon® 6 processors with E-core (Efficient-core) and P-core (Performance-core) SKUs, delivering enhanced performance and power efficiency for high-density, scale-out workloads in the data center. The first Xeon 6 processor to debut is the Intel Xeon 6 E-core (code-named Sierra Forest), available beginning June 4. The Xeon 6 P-core (code-named Granite Rapids) is expected to launch next quarter.

Beyond the data center, Intel is expanding its AI footprint in edge computing and PCs. With over 90,000 edge deployments and 200 million CPUs distributed across the ecosystem, the company has consistently enabled enterprise choice for many years. INTC revealed the architectural details of Lunar Lake, the flagship processor for the next generation of AI PCs.

Lunar Lake is set to make a significant leap in graphics and AI processing capabilities, emphasizing power-efficient compute performance tailored for the thin-and-light segment. It promises up to a 40% reduction in System-on-Chip (SoC) power and over three times the AI compute. It is scheduled for release in the third quarter of 2024, in time for the holiday shopping season.

Also, Intel unveiled pricing for Intel® Gaudi® 2 and Intel® Gaudi® 3 AI accelerator kits, providing high performance at up to one-third lower cost compared to competitive platforms. A standard AI kit, including Intel Gaudi 2 accelerators with a UBB, is offered to system providers at $65,000. Integrating Xeon processors with Gaudi AI accelerators in a system presents a robust solution to make AI faster, cheaper, and more accessible.

Intel CEO Pat Gelsinger said, “Intel is one of the only companies in the world innovating across the full spectrum of the AI market opportunity – from semiconductor manufacturing to PC, network, edge and data center systems. Our latest Xeon, Gaudi and Core Ultra platforms, combined with the power of our hardware and software ecosystem, are delivering the flexible, secure, sustainable and cost-effective solutions our customers need to maximize the immense opportunities ahead.”

On May 1, INTC achieved a significant milestone of surpassing 500 AI models running optimized on new Intel® Core™ Ultra processors due to the company’s investment in client AI, the AI PC transformation, framework optimizations, and AI tools like OpenVINO™ toolkit. These processors are the industry’s leading AI PC processors, offering enhanced AI experiences, immersive graphics, and optimized battery life.

Solid First-Quarter Performance and Second-Quarter Guidance

During the first quarter that ended March 30, 2024, INTC’s net revenue increased 8.6% year-over-year to $12.72 billion, primarily driven by growth in its personal computing, data center, and AI business. Revenue from the Client Computing Group (CCG), through which Intel continues to advance its mission to bring AI everywhere, rose 31% year-over-year to $7.50 billion.

Furthermore, the company’s non-GAAP operating income was $723 million, compared to an operating loss of $294 million in the previous year’s quarter. Its non-GAAP net income and non-GAAP earnings per share came in at $759 million and $0.18, compared to a net loss and loss per share of $169 million and $0.04, respectively, in the same quarter of 2023.

For the second quarter of fiscal 2024, Intel expects its revenue to come between $12.5 billion and $13.5 billion, and its non-GAAP earnings per share is expected to be $0.10.

Despite its improving financial performance and ambitious plans, INTC’s stock has plunged more than 38% over the past six months and nearly 40% year-to-date.

Competing with Nvidia: A Daunting Task

Despite INTC’s solid financial health and strategic moves, the competition with NVDA is fierce. Nvidia’s market performance has been stellar lately, driven by its global leadership in graphics processing units (GPUs) and its foray into AI and machine learning markets. The chip giant has built strong brand loyalty among developers and enterprise customers, which could be challenging for Intel to overcome.

Over the past year, NVIDIA has experienced a significant surge in sales due to high demand from tech giants such as Amazon.com, Inc. (AMZN), Alphabet Inc. (GOOGL), Microsoft Corporation (MSFT), Meta Platforms, Inc. (META), and OpenAI, which have invested billions of dollars in its advanced GPUs essential for developing and deploying AI applications.

Shares of the prominent chipmaker surged approximately 150% over the past six months and more than 196% over the past year. Moreover, NVDA’s stock is up around 2,938% over the past five years. Notably, after Microsoft and Apple, Nvidia recently became the third U.S. company with a market value surpassing $3 trillion.

As a result, NVDA commands a dominant market share of about 92% in the data center GPU market. Nvidia’s success stems from its cutting-edge semiconductor performance and software prowess. The CUDA development platform, launched in 2006, has emerged as a pivotal tool for AI development, with a user base exceeding 4 million developers.

Bottom Line

Proposed funding of $8.5 billion, along with an investment tax credit and eligibility for CHIPS Act loans, is pivotal in Intel’s bid to regain semiconductor leadership in the face of intense competition, particularly from Nvidia. This substantial federal funding will enhance Intel’s manufacturing and R&D capabilities across its key sites in Arizona, New Mexico, Ohio, and Oregon.

While INTC possesses the resources, technological expertise, and strategic vision to challenge NVDA, the path forward is fraught with challenges. Despite Intel’s recent strides in the AI ecosystem, from the data center to edge and PC with products like Xeon 6 processors and Gaudi AI accelerators, Nvidia’s dominance in data center GPUs remains pronounced, commanding a significant market share.

Future success will depend on Intel’s ability to leverage its strengths in manufacturing, introducing innovative product lines, and cultivating a compelling ecosystem of software and developer support. As Intel advances its ambitious plans, industry experts and stakeholders will keenly watch how these developments unfold, redefining the competitive landscape in the AI and data center markets.

How Micron Technology Is Poised to Benefit from AI Investments

Artificial Intelligence (AI) continues revolutionizing industries worldwide, including healthcare, retail, finance, automotive, manufacturing, and logistics, driving demand for advanced technology and infrastructure. Among the companies set to benefit significantly from this AI boom is Micron Technology, Inc. (MU), a prominent manufacturer of memory and storage solutions.

MU’s shares have surged more than 70% over the past six months and nearly 104% over the past year. Moreover, the stock is up approximately 12% over the past month.

This piece delves into the broader market dynamics of AI investments and how MU is strategically positioned to capitalize on these trends, offering insights into how investors might act now.

Broader Market Dynamics of AI Investments

According to Grand View Research, the AI market is expected to exceed $1.81 trillion by 2030, growing at a CAGR of 36.6% from 2024 to 2030. This robust market growth is propelled by the rapid adoption of advanced technologies in numerous industry verticals, increased generation of data, developments in machine learning and deep learning, the introduction of big data, and substantial investments from government and private enterprises.

AI has emerged as a pivotal force in the modern digital era. Tech giants such as Amazon.com, Inc. (AMZN), Alphabet Inc. (GOOGL), Apple Inc. (AAPL), Meta Platforms, Inc. (META), and Microsoft Corporation (MSFT) are heavily investing in research and development (R&D), thereby making AI more accessible for enterprise use cases.

Moreover, several companies have adopted AI technology to enhance customer experience and strengthen their presence in Industry 4.0.

Big Tech has spent billions of dollars in the AI revolution. So far, in 2024, Microsoft and Amazon have collectively allocated over $40 billion for investments in AI-related initiatives and data center projects worldwide.

DA Davidson analyst Gil Luria anticipates these companies will spend over $100 billion this year on AI infrastructure. According to Luria, spending will continue to rise in response to growing demand. Meanwhile, Wedbush analyst Daniel Ives projects continued investment in AI infrastructure by leading tech firms: “This is a $1 trillion spending jump ball over the next decade.”

Micron Technology’s Strategic Position

With a $156.54 billion market cap, MU is a crucial player in the AI ecosystem because it focuses on providing cutting-edge memory and storage products globally. The company operates through four segments: Compute and Networking Business Unit; Mobile Business Unit; Embedded Business Unit; and Storage Business Unit.

Micron’s dynamic random-access memory (DRAM) and NAND flash memory are critical components in AI applications, offering the speed and efficiency required for high-performance computing. The company has consistently introduced innovative products, such as the HBM3E with the industry’s fastest, highest-capacity high-bandwidth memory (HBM), designed to advance generative AI innovation.

This month, MU announced it is sampling its next-generation GDDR7 graphics memory with the industry’s highest bit density. With more than 1.5 TB/s of system bandwidth and four independent channels to optimize workloads, Micron GDDR7 memory allows faster response times, smoother gameplay, and reduced processing times. The best-in-class capabilities of Micron GDDR7 will optimize AI, gaming, and high-performance computing workloads.

Notably, Micron recently reached an industry milestone as the first to validate and ship 128GB DDR5 32Gb server DRAM to address the increasing demands for rigorous speed and capacity of memory-intensive Gen AI applications.

Furthermore, MU has forged strategic partnerships with prominent tech companies like NVIDIA Corporation (NVDA) and Intel Corporation (INTC), positioning the company at the forefront of AI technology advancements. In February this year, Micron started mass production of its HBM3E solution for use in Nvidia’s latest AI chip. Micron’s 24GB 8H HBM3E will be part of NVIDIA H200 Tensor Core GPUs, expected to begin shipping in the second quarter.

Also, Micron's 128GB RDIMMs are ready for deployment on the 4th and 5th Gen Intel® Xeon® platforms. In addition to Intel, Micron’s 128GB DDR5 RDIMM memory will be supported by a robust ecosystem, including Advanced Micro Devices, Inc. (AMD), Hewlett Packard Enterprise Company (HPE), and Supermicro, among many others.

Further, in April, MU qualified a full suite of its automotive-grade memory and storage solutions for Qualcomm Technologies Inc.’s Snapdragon Digital Chassis, a comprehensive set of cloud-connected platforms designed to power data-rich, intelligent automotive services. This partnership is aimed at helping the ecosystem build next-generation intelligent vehicles powered by sophisticated AI.

Robust Second-Quarter Financials and Upbeat Outlook

Solid AI demand and constrained supply accelerated Micron’s return to profitability in the second quarter of fiscal 2024, which ended February 29, 2024. MU reported revenue of $5.82 billion, beating analysts’ estimate of $5.35 billion. That compares to $4.74 billion for the previous quarter and $3.69 billion for the same period in 2023.

The company’s non-GAAP gross profit was $1.16 billion, versus $37 million in the prior quarter and negative $1.16 billion for the previous year’s quarter. Micron’s non-GAAP operating income came in at $204 million, compared to an operating loss of $955 million and $2.08 billion for the prior quarter and the same period last year, respectively.

MU posted non-GAAP net income and earnings per share of $476 million and $0.42 for the second quarter, compared to non-GAAP net loss and loss per share of $2.08 billion and $1.91 a year ago, respectively. The company’s EPS also surpassed the consensus loss per share estimate of $0.24. During the quarter, its operating cash flow was $1.22 billion versus $343 million for the same quarter of 2023.

“Micron delivered fiscal Q2 results with revenue, gross margin and EPS well above the high-end of our guidance range — a testament to our team’s excellent execution on pricing, products and operations,” said Sanjay Mehrotra, MU’s President and CEO. “Our preeminent product portfolio positions us well to deliver a strong fiscal second half of 2024. We believe Micron is one of the biggest beneficiaries in the semiconductor industry of the multi-year opportunity enabled by AI.”

For the third quarter of 2024, the company expects revenue of $6.6 billion ± $200 million, and its gross margin is projected to be 26.5% ± 1.5%. Also, Micron expects its non-GAAP earnings per share to be $0.45 ± $0.07.

Bottom Line

MU is strategically positioned to benefit from the burgeoning AI market, driven by its diversified portfolio of advanced memory and storage solutions, strategic partnerships and investments, robust financial health characterized by solid revenue growth and profitability, and expanding market presence.

The company’s recent innovations, including HBM3E and DDR5 RDIMM memory, underscore its commitment to advancing its capabilities across AI and high-performance computing applications.

Moreover, the company’s second-quarter 2024 earnings beat analysts' expectations, supported by the AI boom. Also, Micron offered a rosy guidance for the third quarter of fiscal 2024. Investors eagerly await insights into MU’s financial performance, strategic updates, and outlook during the third-quarter earnings conference call scheduled for June 26, 2024.

Baird Senior Research Analyst Tristan Gerra upgraded MU stock from “Neutral” to “Outperform” and increased the price target from $115 to $150, citing meaningful upside opportunities for the company. Gerra stated that DRAM chip pricing has been rising while supply is anticipated to slow. Also, Morgan Stanley raised its rating on Micron from “Underweight” to “Equal-Weight.”

As AI investments from numerous sectors continue to grow, Micron stands to capture significant market share, making it an attractive option for investors seeking long-term growth in the semiconductor sector.

The Future of NVIDIA: Post-Split Valuation and Growth Projections

NVIDIA Corporation (NVDA), a prominent force in the AI and semiconductor technology industries, announced a 10-for-1 forward stock split of the company’s issued common stock during its last earnings release in May. Shareholders of record as of June 6 received nine additional shares for each share held after the close on Friday, June 7. Trading will commence on a split-adjusted basis at market open on June 10.

This strategic move is poised to reshape the landscape for Nvidia investors and the broader tech market.

Post-Split Valuation

NVDA was already a leading AI stock in the market, but investor interest in the chipmaker skyrocketed as its 10-for-1 stock split took effect after the market’s close on June 7. The split multiplied the share count of the hottest stock on the S&P 500 tenfold, with the per-share price adjusting proportionally.

Moreover, NVIDIA’s stock has gained more than 158% over the past six months and nearly 222% over the past year. Notably, the stock is up over 3,222% over the past five years. During this remarkable run, Nvidia’s market cap of around $3 trillion surpassed those of Amazon (AMZN) and Alphabet Inc. (GOOGL). Before the 10-for-1 split, the stock traded at a lofty $1,209.
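The mechanics of a forward split are purely arithmetic. A quick sketch using the pre-split price quoted above (the position size is hypothetical):

```python
split_ratio = 10
pre_split_price = 1209.00   # pre-split share price cited above
shares_held = 100           # hypothetical position

post_split_price = pre_split_price / split_ratio   # $120.90 per share
post_split_shares = shares_held * split_ratio      # 1,000 shares

# A split changes the share count and price, not the value of the position:
assert shares_held * pre_split_price == post_split_shares * post_split_price
```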

The chip giant’s strategic decision to split its stock follows a broader trend among tech giants to make stock ownership more affordable and appealing to retail investors. With more individual investors gaining access to Nvidia’s shares post-split, increased trading activity and demand could drive share prices higher.

According to data from BofA research, companies announcing stock splits have historically delivered total returns of about 25% in the 12 months following the split, versus 12% gains for the S&P 500. Thus, stock splits are seen as a bullish signal, often accompanied by positive investor sentiment and increased buying activity.

Solid Earnings And A Healthy Outlook

The stock split isn’t the only reason for NVDA’s latest bull run. The company also reported better-than-expected revenue and earnings in the fiscal 2025 first quarter, driven by robust demand for its AI chips. During the quarter that ended April 28, 2024, Nvidia’s revenue rose 262% year-over-year to $26.04 billion. That surpassed the consensus revenue estimate of $24.59 billion.

The company’s largest business segment, Data Center, which includes its AI chips and several additional parts required to run big AI servers, reported a record revenue of $22.60 billion, up 427% year-over-year.

“Our data center growth was fueled by strong and accelerating demand for generative AI training and inference on the Hopper platform. Beyond cloud service providers, generative AI has expanded to consumer internet companies, and enterprise, sovereign AI, automotive and healthcare customers, creating multiple multibillion-dollar vertical markets,” said Jensen Huang, founder and CEO of NVDA.

“We are poised for our next wave of growth. The Blackwell platform is in full production and forms the foundation for trillion-parameter-scale generative AI,” Huang added. During a call with analysts, the CEO mentioned that there would be significant Blackwell revenue this year and that the new chip would be deployed in data centers by the fourth quarter.

The chipmaker’s non-GAAP gross profit grew 328.2% from the previous year’s quarter to $20.56 billion. NVDA’s non-GAAP operating income was $18.06 billion, an increase of 491.7% year-over-year. Its non-GAAP net income rose 461.7% year-over-year to $15.24 billion. Also, it posted a non-GAAP EPS of $6.12, compared to analysts’ estimate of $5.58, and up 461.5% year-over-year.

Furthermore, NVIDIA’s cash, cash equivalents and marketable securities were $31.44 billion as of April 28, 2024, compared to $25.98 billion as of January 28, 2024.

According to its outlook for the second quarter of 2025, the company expects revenue to be $28 billion, plus or minus 2%. Its non-GAAP gross margin is expected to be 75.5%, plus or minus 50 basis points. NVDA’s non-GAAP operating expenses are anticipated to be approximately $2.8 billion.

Raised Dividends

NVDA raised its dividend payouts to reward shareholders and demonstrate confidence in its financial strength and growth prospects. The company increased its quarterly cash dividend by 150% from $0.04 per share to $0.10 per share of common stock. The dividend is equivalent to $0.01 per share on a post-split basis and will be paid on June 28 to all shareholders of record on June 11.
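The dividend arithmetic is easy to verify (a reader's check on the figures above):

```python
old_dividend = 0.04      # prior quarterly dividend per share
new_dividend = 0.10      # raised quarterly dividend per share
split_ratio = 10

raise_pct = (new_dividend - old_dividend) / old_dividend   # 1.5, i.e., a 150% increase
post_split_dividend = new_dividend / split_ratio           # $0.01 per post-split share
```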

While Nvidia's dividend yield is modest compared to its tech peers, its considerable cash flow and strong balance sheet provide ample room for growth.

Dominance in AI and Data Center Markets Fuels Unprecedented Growth Opportunities

NVDA is strategically positioned at the forefront of the AI and data center markets, with high demand for its AI chips for data processing, training, and inference from large cloud service providers, GPU-specialized cloud providers, enterprise software firms, and consumer internet companies. In addition, vertical industries, led by automotive, financial services, and healthcare, drive the demand.

Statista projects the generative AI (GenAI) market to reach $36.06 billion in 2024, with the U.S. accounting for the largest market size of $11.66 billion. Further, the GenAI market is expected to total $356.10 billion by 2030, expanding at a CAGR of 46.5% from 2024 to 2030.

Over the past year, Nvidia has experienced a significant surge in sales due to robust demand from tech giants like Google, Microsoft Corporation (MSFT), Meta Platforms, Inc. (META), Amazon, and OpenAI, who invested billions of dollars in Nvidia’s advanced GPUs essential for developing and deploying AI applications. In January, META announced a sizable order of 350,000 high-end H100 graphics cards from Nvidia.

As a result, NVDA holds a market share of about 92% in the data center GPU market for generative AI applications.

Bottom Line

NVDA’s recent 10-for-1 stock split has significantly impacted its valuation and market appeal. This strategic move not only made Nvidia's shares more accessible to retail investors but also fueled increased trading activity and demand, driving share prices higher. The share count increased tenfold when the split took effect on Friday, and the heightened investor interest was reflected in brisk trading.

NVIDIA's strong financial performance, as evidenced by the fiscal 2025 first-quarter report, further solidifies its position in the AI and data center market. The company reported more than threefold year-over-year revenue growth, driven by massive demand for its AI processors from major tech companies, including Microsoft, Meta, Amazon, Google, and OpenAI.

The chipmaker’s remarkable growth has propelled it to the third-largest market capitalization globally, surpassing peers such as AMZN and META.

Further, the company’s revenue and EPS for the fiscal year ending January 2025 are expected to grow 97.9% and 108.9% year-over-year to $120.55 billion and $27.07, respectively. For fiscal year 2026, analysts expect its revenue and EPS to increase 32.4% and 32.6% from the prior year to $159.55 billion and $35.90, respectively. With a healthy outlook for the future, NVDA continues to attract investors looking for long-term growth opportunities.

Moreover, the recent decision to raise dividends by 150% showcases NVDA's confidence in its financial strength and growth prospects, making it more attractive to income-oriented investors. This move, coupled with the stock split, appeals to different investor demographics and reflects NVDA's commitment to rewarding shareholders while positioning itself for future growth in the AI and semiconductor sectors.

Investing in AI: Should You Bet on AMD, Broadcom, or NVIDIA?

Is NVDA the Top Player in AI Stocks?

Initially famed for gaming GPUs, NVIDIA Corporation (NVDA) has evolved into a leader in data center hardware, spearheading AI advancement. The company’s Hopper GPUs are in high demand, accelerating AI applications from recommendation engines to natural language processing and generative AI large language models like ChatGPT on NVIDIA platforms. At this point, NVDA’s dominance in AI and data center markets is undeniable.

For the first quarter that ended April 28, 2024, Nvidia’s revenue more than tripled year-over-year to a record $26.04 billion. NVIDIA’s Data Center Group (primarily connected to its AI operations) chalked up $22.60 billion in revenue, representing a 23% sequential gain and a massive 427% rise over the same period last year.

The chip giant’s operating income surged 690% from the year-ago value to $16.91 billion. NVIDIA’s non-GAAP net income amounted to $15.24 billion, or $6.12 per share, compared to $2.71 billion, or $1.09 per share, in the previous year’s quarter.

Buoyed by a robust financial position, NVDA increased its quarterly dividend by 150% from $0.04 per share to $0.10 per share of common stock. The increased dividend is equivalent to $0.01 per share on a post-split basis and will be paid to its shareholders on June 28, 2024.

Moving forward, the company guided for approximately $28 billion in revenue for the second quarter of fiscal year 2025, representing a projected 7.5% sequential gain. Its non-GAAP gross margin is expected to be 75.5%, plus or minus 50 basis points.

Analysts expect NVDA’s revenue for the fiscal 2025 second quarter (ending July 2024) to increase 109.7% year-over-year to $28.32 billion. The consensus EPS estimate of $6.35 for the current quarter indicates a 135.1% improvement year-over-year. Moreover, the company has an excellent earnings surprise history, surpassing the consensus EPS estimates in each of the trailing four quarters.

Nvidia’s comprehensive offerings, from chips to boards, systems, software, services, and supercomputing time, cater to expanding markets and diversify its revenue streams. The chipmaker’s shares have surged more than 130% over the past six months and nearly 190% over the past year, and its trajectory suggests continued momentum as AI adoption follows a similar upward curve, promising a bright future.

Amid this, do AI stocks Broadcom Inc. (AVGO) and Advanced Micro Devices, Inc. (AMD) stand a chance to be as big as the industry leader, NVIDIA? Let’s fundamentally analyze them to find the answer.

Broadcom Inc. (AVGO)

Broadcom Inc. (AVGO) is emerging as one of Nvidia's toughest rivals in the race for networking revenue, especially as data centers undergo rapid transformation for the AI era. As a global tech leader, AVGO designs, develops, and supplies semiconductor and infrastructure software solutions. The company produces custom AI accelerators for major clients and recently projected $7 billion in sales from its two largest customers in 2024, widely believed to be Alphabet Inc. (GOOGL) and Meta Platforms, Inc. (META).

AVGO will announce its fiscal 2024 second-quarter earnings on June 12. Forecasts indicate a 37.4% year-over-year revenue surge to $12 billion, reflecting steady growth and financial resilience. Moreover, analysts expect a 5% uptick in the company’s EPS from the preceding year’s period to $10.84.

Broadcom has consistently exceeded consensus revenue and EPS estimates in each of the trailing four quarters. In the fiscal 2024 first quarter, its net revenue increased 34% year-over-year to $11.96 billion, with triple-digit revenue growth in the Infrastructure Software segment to $4.57 billion. AVGO’s gross profit grew 22.8% from the year-ago value to $7.37 billion.

On top of that, the company’s non-GAAP net income for the quarter came in at $5.25 billion, or $10.99 per share, up 17.2% and 6.4% year-over-year, respectively. Its adjusted EBITDA also increased from the prior-year quarter to $7.16 billion.

Looking ahead, the company forecasts nearly $50 billion in revenues for fiscal year 2024, with adjusted EBITDA projected to be approximately 60% of its revenue. The company anticipates a 30% year-over-year surge in networking sales, driven by accelerated deployments of networking connectivity and the expansion of AI accelerators among hyperscalers. It also expects generative AI to account for 25% of semiconductor revenue.

The artificial intelligence megatrend is poised to significantly drive Broadcom's revenue and earnings growth in the upcoming decade. During a recent earnings call, Broadcom CEO Hock Tan emphasized, “Strong demand for our networking products in AI data centers, as well as custom AI accelerators from hyperscalers, are driving growth in our semiconductor segment.”

On May 20, 2024, AVGO announced its latest portfolio of highly scalable, high-performing, low-power 400G PCIe Gen 5.0 Ethernet adapters to revolutionize the data center ecosystem. These products offer an enhanced, open, standards-based Ethernet NIC and switching solution to resolve connectivity bottlenecks as XPU bandwidth and cluster sizes grow rapidly in AI data centers.

Patrick Moorhead, CEO & chief analyst at Moor Insights and Strategy, noted, “As the industry races to deliver generative AI at scale, the immense volumes of data that must be processed to train LLMs require even larger server clusters. Scalable high bandwidth, low latency connectivity is critical for maximizing the performance of these AI clusters.”

He added, “Ethernet presents a compelling case as the networking technology of choice for next-generation AI workloads. The 400G NICs offered by Broadcom, built on its success in delivering Ethernet at scale, offers open connectivity at an attractive TCO for power-hungry AI applications.”

With the company's expanding presence in the AI space, Broadcom stands out as a compelling alternative to major chip companies such as NVDA and AMD. Over the past six months, shares of AVGO have gained more than 42%, and nearly 63% over the past year, making it an attractive addition to your investment portfolio.

Advanced Micro Devices, Inc. (AMD)

Advanced Micro Devices, Inc. (AMD) has been at the forefront of innovation in high-performance computing, graphics, and visualization technologies for decades. While NVDA may be the first name that comes to mind in AI processor sales, AMD has established itself as a formidable competitor in the GPU space, particularly excelling in chips tailored for AI workloads.

However, AMD's influence doesn't stop at hardware; it has been actively expanding its AI software ecosystem. The company recently unveiled the groundbreaking AMD Ryzen™ AI 300 Series processors, featuring the world’s most powerful Neural Processing Unit (NPU). These processors are designed to bring AI capabilities directly to next-gen PCs, promising a future where AI-infused computing is seamlessly integrated into everyday tasks.

Additionally, the next-gen AMD Ryzen™ 9000 Series processors for desktops solidify AMD’s position as a leader in performance and efficiency for gamers, content creators, and prosumers alike.

Moreover, the company’s comprehensive roadmap for the Instinct accelerator series promises an annual cadence of cutting-edge AI performance and memory capabilities across each generation. Beginning with the imminent release of the AMD Instinct MI325X accelerator in Q4 2024, followed by the anticipated launch of the AMD Instinct MI350 series powered by the new AMD CDNA™ 4 architecture in 2025, AMD is poised to deliver up to a 35x increase in AI inference performance compared to its previous iterations.

In the first quarter that ended March 30, 2024, AMD’s non-GAAP revenue increased 2.2% year-over-year to $5.47 billion. Both its Data Center and Client segments experienced substantial growth, each exceeding 80% year-over-year, fueled by the uptake of MI300 AI accelerators and the popularity of Ryzen and EPYC processors.

Moreover, the company’s non-GAAP operating income grew 3.2% from the year-ago value to $1.13 billion. Its non-GAAP net income and earnings per share rose 4.4% and 3.3% from the prior-year quarter to $1.01 billion and $0.62, respectively.

AMD expects its revenue in the second quarter of 2024 to be around $5.7 billion, with a projected growth of 6% year-over-year and 4% sequentially. Meanwhile, its non-GAAP gross margin is expected to be around 53%.

The Street expects AMD’s revenue for the second quarter (ending June 2024) to increase 6.7% year-over-year to $5.72 billion. Its EPS for the ongoing quarter is projected to reach $0.68, registering 17% year-over-year growth. Moreover, the company surpassed the consensus revenue estimates in each of the trailing four quarters.

While Nvidia’s Data Center segment alone posted an annualized sales run rate of roughly $90 billion based on last quarter’s results, experts predict that the company could surpass the $100 billion mark in annual Data Center sales at this momentum. In contrast, AMD's recent guidance forecasts sales of $3.5 billion for its MI300 AI chips in 2024, leaving a sizable gap between NVIDIA and AMD in AI revenue. To put things into perspective, NVDA's networking revenue alone is approximately four times larger than AMD's total AI chip sales.
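The "run rate" here is simply the latest quarter annualized; a back-of-the-envelope sketch using the $22.60 billion Data Center figure reported above shows where the roughly $90 billion comes from:

```python
# Annualize NVIDIA's quarterly Data Center revenue to get the run rate
quarterly_data_center_rev = 22.60  # $ billions, fiscal Q1 2025 (reported above)

annual_run_rate = quarterly_data_center_rev * 4
print(f"Annualized Data Center run rate: ${annual_run_rate:.1f}B")  # $90.4B
```

At $90.4 billion annualized, even modest sequential growth would push the segment past the $100 billion mark that analysts anticipate.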

Nonetheless, AMD is poised to drive AI innovation across various domains with a diverse portfolio spanning cloud, edge, client, and beyond. The stock has gained more than 55% and 39% over the past nine months and a year, respectively.

Bottom Line

With the global artificial intelligence (AI) market projected to soar from $214.6 billion in 2024 to $1.34 trillion by 2030 (exhibiting a CAGR of 35.7%), leading chip companies, including NVIDIA, Broadcom, and Advanced Micro Devices, are rapidly expanding their market presence, vying for a piece of the pie.

Given their solid fundamentals and promising long-term outlooks, NVDA, AVGO, and AMD appear in good shape to thrive in the foreseeable future. Thus, investors can place their bets on these stocks to garner profitable returns and capitalize on the upward curve of AI.