Is CrowdStrike's (CRWD) Stock Drop an Opportunity for Investors?

CrowdStrike Holdings, Inc. (CRWD) experienced a steep decline in its stock price last Friday following a software update that triggered widespread technical outages. This mishap couldn't have come at a worse time for the company, as it is about to close its fiscal quarter at the end of this month, a period when software companies typically finalize major deals. With its shares having surged nearly 34% this year before the drop, CrowdStrike faces intense pressure as it approaches its July-quarter results.

For investors, results that merely match expectations won't suffice; they will be looking for a strong performance that beats estimates and prompts an upward revision in forecasts.

On July 19, a faulty security update from CrowdStrike caused a global tech outage, affecting millions of Windows devices. Social media was abuzz with images of the infamous "blue screen of death," indicating system crashes. CrowdStrike clarified that the issue wasn’t a security breach or cyberattack but a flaw in the update affecting Microsoft Windows systems.

Despite a quick fix, the damage was done, with the stock plummeting over 20% since the incident. Microsoft Corporation (MSFT) reported that about 8.5 million devices were affected, representing less than 1% of all Windows devices. However, the outage had significant repercussions, particularly for providers of essential services such as hospitals, banks, and airports. The sharp decline in CRWD’s stock suggests investors are concerned about the long-term impact on CrowdStrike's position in the fiercely competitive cybersecurity market, where even minor missteps can be costly.

There's also concern about the potential financial fallout from the incident, including compensation for affected clients. While CRWD suffered, shares of its competitor Palo Alto Networks Inc. (PANW) rose by 2.8%. It's a significant black eye for CrowdStrike, which is now a household name for the wrong reasons.

"This situation could have serious implications for CrowdStrike's business, particularly as it seeks to expand adoption among large enterprises to drive the next phase of growth," said Redburn analyst Nina Marques. “Furthermore, it could have impact on market-share shifts as customers seek alternative security solutions,” she added.

CrowdStrike Gets a Downgrade

Analysts are concerned that the disruptions caused by CrowdStrike's software update will delay new deals. Guggenheim's John DiFucci downgraded the stock from Buy to Neutral and stated that the global chaos caused by CrowdStrike, even if temporary, will likely negatively affect its business. He believes rebuilding its reputation might take time and could affect new business signings in the short term.

Similarly, BTIG's Gray Powell downgraded the stock to Neutral, stating that “the outage may have an impact on new customer wins and create deal delays.” He added that while the issue wasn't a security breach, the disruption violated a key principle for security vendors and might lead to demands for larger discounts or credits from existing customers.

What to Expect From CrowdStrike’s Next Earnings Report?

With the recent software mishap causing quite a stir, investors and analysts will closely monitor CrowdStrike’s upcoming earnings report for signs of recovery. Last month, the company released its first-quarter results for fiscal year 2025, which ended April 30, 2024, beating analyst estimates for revenue and EPS.

The company reported revenue of $921 million, a 33% year-over-year increase, against guidance of $906 million. CRWD’s first-quarter earnings per share came in at $0.93, well above the consensus estimate of $0.89, and its revenue topped analysts’ estimates by $16.21 million.

For the second quarter, the company forecasts revenue between $958.30 million and $961.20 million, with an EPS estimated at $0.98 to $0.99. Additionally, its non-GAAP income from operations is anticipated to be between $208.3 million and $210.5 million.

Meanwhile, the Street expects CRWD’s EPS for the second quarter (ending July 2024) to increase 33.3% year-over-year to $0.99. Its revenue for the ongoing quarter is expected to reach $961.08 million, reflecting 31.4% year-over-year growth. Moreover, the company has surpassed consensus EPS estimates in each of the trailing four quarters, an impressive track record.

For fiscal year 2025 (ending January 2025), the company’s revenue and EPS are expected to grow 30.9% and 29.7% from the prior year to $4 billion and $4.01, respectively.

Bottom Line

Investors reacted strongly to the recent software mishap, causing CRWD’s stock to drop more than 11% on Friday. However, this reaction might be exaggerated. The incident, while disruptive, wasn’t a breach of CrowdStrike’s security and seems to be a one-off issue rather than a sign of ongoing vulnerabilities. As the company’s CEO, George Kurtz, pointed out, the problem stemmed from a botched update, causing significant but manageable disruptions. While resolving the issue could take several days, it’s unlikely to escalate further.

Analysts’ concerns are valid, but the company’s strong revenue growth and expanding opportunities “across Mexico, Brazil, and the broader Latin America market,” where CrowdStrike has recently formed partnerships to extend its Falcon platform, offer compelling reasons for optimism. The rise in e-crime in Latin America presents a significant revenue opportunity for CrowdStrike. Plus, S&P 500 inclusion means that plenty of index-fund holders will indirectly invest in CRWD.

The stock’s high forward non-GAAP P/E ratio of 65.84x, compared to the industry average of 24.89x, might deter some investors, but short-selling CrowdStrike could be risky. The stock will likely rebound and continue rising as long as the growth narrative holds.
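For readers who want to see how a forward multiple like that is derived, here is a rough Python sketch. The share price used is an assumption chosen only so the arithmetic reproduces the 65.84x multiple alongside the roughly $4.01 consensus EPS cited above; it is not a quoted price.

```python
# Rough illustration of how a forward non-GAAP P/E multiple is computed.
# The share price below is an assumption chosen so the arithmetic reproduces
# the 65.84x multiple alongside the ~$4.01 consensus EPS cited above.

assumed_price = 264.02        # assumed share price (USD), hypothetical
forward_eps_estimate = 4.01   # consensus non-GAAP EPS estimate (USD)
industry_avg_pe = 24.89       # industry average forward P/E cited above

forward_pe = assumed_price / forward_eps_estimate
premium_vs_industry = forward_pe / industry_avg_pe - 1

print(f"Forward P/E: {forward_pe:.2f}x")                           # ~65.84x
print(f"Premium to industry average: {premium_vs_industry:.0%}")   # ~165%
```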

Mizuho analyst Jordan Klein even views the current dip as a chance to buy, likening it to a “one-time discount sale.” So, while the recent drop in CRWD might be unsettling, it presents a potential buying opportunity for long-term investors seeking quality exposure to the cybersecurity sector.

Broadcom (AVGO) and Micron (MU): Top Picks for Data Center Investment Surge

The expected record spending on infrastructure by cloud computing leaders such as Microsoft Corporation (MSFT) and Amazon.com, Inc. (AMZN) this year highlights the escalating investments in artificial intelligence (AI) data centers, a trend likely to benefit chipmakers significantly.

Bank of America (BofA) analysts forecast that cloud service provider capital expenditures will reach $121 billion in the second half of 2024, bringing the total to a record $227 billion in 2024. This figure marks a 39% increase compared to the previous year.
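As a quick sanity check on those figures, the short Python sketch below simply rearranges the numbers quoted above to back out the implied 2023 spending base and the implied first-half 2024 spending; no outside data is assumed.

```python
# Back-of-the-envelope check on the BofA capex figures cited above.

total_2024 = 227.0    # forecast cloud capex for full-year 2024 ($B)
second_half = 121.0   # forecast for H2 2024 ($B)
yoy_growth = 0.39     # stated year-over-year increase

implied_2023 = total_2024 / (1 + yoy_growth)
implied_h1_2024 = total_2024 - second_half

print(f"Implied 2023 capex: ~${implied_2023:.0f}B")        # ~$163B
print(f"Implied H1 2024 capex: ~${implied_h1_2024:.0f}B")   # ~$106B
```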

Amazon, Microsoft, and Meta Platforms, Inc. (META) are predicted to more than double their spending compared to 2020 levels, while Oracle Corporation (ORCL) is expected to increase its capital expenditure nearly sixfold. The proportion of this spending allocated to data centers is already around 55% and is anticipated to rise further, reflecting the critical role of data centers in supporting advanced AI applications.

While NVIDIA Corporation (NVDA) stands out as the dominant player in the AI GPU market, BofA analysts have highlighted Broadcom Inc. (AVGO) and Micron Technology, Inc. (MU) as compelling alternatives for investors seeking to benefit from this trend.

In this article, we will delve into why Broadcom and Micron are well-positioned to capitalize on growing investments by cloud service providers in AI data centers, evaluate their financial health and recent performance, and explore the potential headwinds and tailwinds they may encounter in the near future.

Broadcom Inc. (AVGO)

With a market cap of $732.45 billion, Broadcom Inc. (AVGO) is a global tech leader that designs, develops, and supplies semiconductor and infrastructure software solutions. Broadcom’s extensive portfolio of semiconductor solutions, including networking chips, storage adapters, and advanced optical components, makes it a critical supplier for data centers.

Moreover, Broadcom’s leadership in networking solutions, exemplified by its Tomahawk and Trident series of Ethernet switches, positions it as a critical beneficiary of increased AI data center spending.

In May, AVGO introduced its latest portfolio of highly scalable, high-performance, low-power 400G PCIe Gen 5.0 Ethernet adapters for the data center ecosystem. The new products provide an improved, open, standards-based Ethernet NIC and switching solution to address connectivity bottlenecks caused by the rapid growth in XPU bandwidth and cluster sizes in AI data centers.

Further, Broadcom’s strategic acquisitions, such as the recent purchase of VMware, Inc., enhance its data center and cloud computing capabilities. With this acquisition, AVGO will bring together its engineering-first, innovation-centric teams as it takes another significant step forward in building the world’s leading infrastructure technology company. 

Broadcom’s solid second-quarter performance was primarily driven by AI demand and VMware. AVGO’s net revenue increased 43% year-over-year to $12.49 billion in the quarter that ended May 5, 2024. That exceeded the consensus revenue estimate of $12.01 billion. Revenue from its AI products hit a record of $3.10 billion for the quarter.

AVGO reported triple-digit revenue growth in the Infrastructure Software segment to $5.29 billion as enterprises increasingly adopted the VMware software stack to build their private clouds. Its gross margin rose 27.2% year-over-year to $7.78 billion. Its non-GAAP operating income grew 32% from the year-ago value to $7.15 billion. Its adjusted EBITDA was $7.43 billion, up 30.6% year-over-year.

Further, the company’s non-GAAP net income was $5.39 billion or $10.96 per share, up 20.2% and 6.2% from the prior year’s quarter, respectively. Cash from operations of $4.58 billion for the quarter, less capital expenditures of $132 million, resulted in free cash flow of $4.45 billion, or 36% of revenue.

Alongside its solid second-quarter earnings, Broadcom announced a ten-for-one stock split, which took effect on July 12, making share ownership more affordable and accessible to investors.

Moreover, AVGO raised its fiscal year 2024 guidance. The tech company expects full-year revenue of nearly $51 billion. Broadcom anticipates $10 billion in revenue from chips related to AI this year. Its adjusted EBITDA is expected to be approximately 61% of projected revenue.

Analysts expect AVGO’s revenue for the third quarter (ending July 2024) to grow 45.9% year-over-year to $12.95 billion. The consensus EPS estimate of $1.20 for the ongoing quarter indicates a 14% year-over-year increase. Also, the company has surpassed the consensus revenue and EPS estimates in each of the trailing four quarters.

In addition, the company’s revenue and EPS for the fiscal year ending October 2024 are expected to increase 43.6% and 12.4% from the previous year to $51.44 billion and $4.75, respectively.

AVGO’s shares have gained more than 29% over the past six months and around 74% over the past year. Moreover, the stock is up nearly 40% year-to-date.

Micron Technology, Inc. (MU)

Another chipmaker that is well-poised to benefit from significant data center spending among enterprises is Micron Technology, Inc. (MU). With a $126.70 billion market cap, MU provides cutting-edge memory and storage products globally. The company operates through four segments: Compute and Networking Business Unit; Mobile Business Unit; Embedded Business Unit; and Storage Business Unit.

Micron’s role as a leading provider of DRAM and NAND flash memory positions it to capitalize on the surging demand for high-performance memory solutions. The need for advanced memory products grows as data centers expand to support AI and machine learning workloads. The company’s innovation in memory technologies, such as HBM3E, aligns well with the performance requirements of modern data centers.

Also, MU recently announced it is sampling its next-generation GDDR7 graphics memory with the industry’s highest bit density. The best-in-class capabilities of Micron GDDR7 will optimize AI, gaming, and high-performance computing workloads. Notably, Micron reached an industry milestone as the first to validate and ship 128GB DDR5 32Gb server DRAM to address the increasing speed and capacity demands of memory-intensive Gen AI applications.

Further, MU’s strategic partnerships with leading tech companies like Nvidia and Intel Corporation (INTC) position the chipmaker at the forefront of technology advancements. In February, Micron started mass production of its HBM3E solution for use in Nvidia’s latest AI chip. Micron’s 24GB 8H HBM3E will be part of NVIDIA H200 Tensor Core GPUs, expected to begin shipping in the second quarter.

For the third quarter, which ended May 30, 2024, MU posted revenue of $6.81 billion, surpassing analysts’ expectations of $6.67 billion. That compared to $5.82 billion in the prior quarter and $3.75 billion for the same period last year. Moreover, AI demand drove 50% sequential data center revenue growth and record-high data center revenue mix.

MU’s non-GAAP gross margin was $1.92 billion, versus $1.16 billion in the prior quarter and negative $603 million for the previous year’s quarter. Its non-GAAP operating income came in at $941 million, compared to $204 million in the prior quarter and negative $1.47 billion for the same period in 2023.

Additionally, the chip company reported non-GAAP net income and earnings per share of $702 million and $0.62 for the third quarter, compared to non-GAAP net loss and loss per share of $1.57 billion and $1.43 a year ago, respectively. Its EPS beat the consensus estimate of $0.53. Its adjusted free cash flow was $425 million during the quarter, compared to a negative $1.36 billion in the prior year’s quarter.

For the fourth quarter of fiscal 2024, Micron expects non-GAAP revenue of $7.60 billion ± $200 million, and its gross margin is anticipated to be 34.5% ± 1%. Also, the company expects its non-GAAP earnings per share to be $1.08 ± 0.08.

Analysts expect MU’s revenue for the fourth quarter (ending August 2024) to increase 91.4% year-over-year to $7.68 billion. The company is expected to report an EPS of $1.14 for the current quarter, compared to a loss per share of $1.07 in the prior year’s quarter. Further, the company has surpassed the consensus revenue and EPS estimates in each of the trailing four quarters.
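To put the guidance and the consensus side by side, the brief Python sketch below converts the "midpoint ± range" guidance quoted above into low and high bounds and checks whether the Street figures fall inside them; all inputs are taken from this article.

```python
# Convert Micron's "midpoint ± range" Q4 guidance into explicit bounds and
# check whether the consensus figures quoted above fall inside them.

def guidance_band(midpoint, plus_minus):
    """Return the (low, high) bounds of a 'midpoint ± plus_minus' guide."""
    return midpoint - plus_minus, midpoint + plus_minus

rev_low, rev_high = guidance_band(7.60, 0.20)   # revenue guidance, $B
eps_low, eps_high = guidance_band(1.08, 0.08)   # non-GAAP EPS guidance, $

consensus_revenue = 7.68   # Street revenue estimate, $B
consensus_eps = 1.14       # Street EPS estimate, $

print(f"Revenue guide ${rev_low:.2f}B-${rev_high:.2f}B; "
      f"consensus ${consensus_revenue:.2f}B within range: "
      f"{rev_low <= consensus_revenue <= rev_high}")
print(f"EPS guide ${eps_low:.2f}-${eps_high:.2f}; "
      f"consensus ${consensus_eps:.2f} within range: "
      f"{eps_low <= consensus_eps <= eps_high}")
```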

MU’s shares have surged over 30% over the past six months and approximately 75% over the past year.

Bottom Line

The substantial surge in capital expenditures by cloud computing giants like Microsoft, Amazon, and Alphabet highlights the importance of AI and data centers in the tech industry’s landscape. Broadcom and Micron emerge as two of the most promising chip stocks for investors seeking to benefit from this trend. Both companies offer solid financial health, significant market positions, and exposure to the expanding data center and AI markets.

While Broadcom’s diverse semiconductor solutions and Micron’s leadership in memory technology make them attractive investment opportunities, investors must remain mindful of potential headwinds, including market competition and geopolitical risks. By evaluating these factors and understanding the growth potential of these companies, investors can make informed decisions in the rapidly evolving technology sector.

Are Nvidia’s GPUs a Game-Changer for Investors?

NVIDIA Corporation (NVDA), a tech giant advancing AI through its cutting-edge graphics processing units (GPUs), became the third U.S. company to exceed a staggering market capitalization of $3 trillion in June, after Microsoft Corporation (MSFT) and Apple Inc. (AAPL). This significant milestone marks nearly a doubling of its value since the start of the year. Nvidia’s stock has surged more than 159% year-to-date and around 176% over the past year.

What drives the company’s exceptional growth, and how do Nvidia GPUs translate into significant financial benefits for cloud providers and investors? This piece will explore the financial implications of investing in NVIDIA GPUs, the impressive ROI metrics for cloud providers, and the company’s growth prospects in the AI GPU market.

Financial Benefits of NVDA’s GPUs for Cloud Providers

During the Bank of America Securities 2024 Global Technology Conference, Ian Buck, Vice President and General Manager of NVDA’s hyperscale and HPC business, highlighted the substantial financial benefits cloud providers can reap by investing in NVIDIA GPUs.

Buck illustrated that for every dollar spent on NVIDIA GPUs, cloud providers can generate five dollars over four years. This return on investment (ROI) becomes even more impressive for inferencing tasks, where the profitability rises to seven dollars per dollar invested over the same period, with this figure continuing to increase.
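A minimal sketch of that arithmetic is below. The $1 million outlay is purely illustrative; only the five-to-one and seven-to-one revenue multiples over four years come from Buck's remarks.

```python
# Minimal sketch of the ROI math described above: revenue generated per dollar
# of GPU spend over four years. The $1 million outlay is hypothetical.

gpu_spend = 1_000_000   # assumed GPU investment ($), illustrative only
years = 4

multiples = {
    "general GPU workloads": 5.0,   # $5 returned per $1 spent over four years
    "inference workloads": 7.0,     # $7 returned per $1 spent over four years
}

for workload, multiple in multiples.items():
    revenue = gpu_spend * multiple
    per_year = revenue / years
    print(f"{workload}: ${revenue:,.0f} over {years} years "
          f"(~${per_year:,.0f} per year)")
```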

This compelling ROI is driven by the superior performance and efficiency of Nvidia’s GPUs, which enable cloud providers to offer enhanced services and handle more complex workloads, particularly in the realm of AI. As AI applications expand across various industries, the demand for high-performance inference solutions escalates, further boosting cloud providers’ financial benefits utilizing NVIDIA’s technology.

NVDA’s Progress in AI and GPU Innovations

NVIDIA’s commitment to addressing the surging demand for AI inference is evident in its continuous innovation and product development. The company introduced cutting-edge products like NVIDIA Inference Microservices (NIMs), designed to support popular AI models such as Llama, Mistral, and Gemma.

These optimized inference microservices for deploying AI models at scale facilitate seamless integration of AI capabilities into cloud infrastructures, enhancing efficiency and scalability for cloud providers.

In addition to NIMs, NVDA is also focusing on its new Blackwell GPU, engineered particularly for inference tasks and energy efficiency. The upcoming Blackwell model is expected to ship to customers later this year. While there may be initial shortages, Nvidia remains optimistic. Buck noted that each new technology phase brings supply and demand challenges, as they experienced with the Hopper GPU.

Furthermore, the early collaboration with cloud providers on the forthcoming Rubin GPU, slated for a 2026 release, underscores the company’s strategic foresight in aligning its innovations with industry requirements.

Nvidia’s GPUs Boost its Stock Value and Earnings

The financial returns of investing in Nvidia GPUs benefit cloud providers considerably and have significant implications for NVDA’s stock value and earnings. With a $4 trillion market cap within sight, the chip giant’s trajectory suggests continued growth and potential for substantial returns for investors.

NVDA’s first-quarter 2025 earnings topped analysts’ expectations and exceeded the high bar set by investors, as Data Center sales rose to a record high amid booming AI demand. For the quarter that ended April 28, 2024, the company posted a record revenue of $26 billion, up 262% year-over-year. That compared to the consensus revenue estimate of $24.56 billion.

The chip giant’s quarterly Data Center revenue was $22.60 billion, an increase of 427% from the prior year’s quarter. Its non-GAAP operating income rose 492% year-over-year to $18.06 billion. NVIDIA’s non-GAAP net income grew 462% from the prior year’s quarter to $15.24 billion. In addition, its non-GAAP EPS came in at $6.12, up 461% year-over-year.

“Our data center growth was fueled by strong and accelerating demand for generative AI training and inference on the Hopper platform. Beyond cloud service providers, generative AI has expanded to consumer internet companies, and enterprise, sovereign AI, automotive and healthcare customers, creating multiple multibillion-dollar vertical markets,” said Jensen Huang, CEO of NVDA.

“We are poised for our next wave of growth. The Blackwell platform is in full production and forms the foundation for trillion-parameter-scale generative AI. Spectrum-X opens a brand-new market for us to bring large-scale AI to Ethernet-only data centers. And NVIDIA NIM is our new software offering that delivers enterprise-grade, optimized generative AI to run on CUDA everywhere — from the cloud to on-prem data centers and RTX AI PCs — through our expansive network of ecosystem partners,” Huang added.

According to its outlook for the second quarter of fiscal 2025, Nvidia’s revenue is anticipated to be $28 billion, plus or minus 2%. The company expects its non-GAAP gross margins to be 75.5%. For the full year, gross margins are projected to be in the mid-70% range.
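Translating that guidance into dollar figures is straightforward; the sketch below computes the implied revenue band and the gross profit implied at the guided margin, using only the numbers quoted above as a rough illustration.

```python
# Translate Nvidia's Q2 FY2025 guidance (quoted above) into dollar bounds and
# an implied non-GAAP gross profit at the guided margin.

guided_revenue = 28.0   # $B, guidance midpoint
tolerance = 0.02        # plus or minus 2%
gross_margin = 0.755    # guided non-GAAP gross margin

low = guided_revenue * (1 - tolerance)
high = guided_revenue * (1 + tolerance)
implied_gross_profit = guided_revenue * gross_margin

print(f"Guided revenue range: ${low:.2f}B - ${high:.2f}B")                 # $27.44B - $28.56B
print(f"Implied gross profit at midpoint: ~${implied_gross_profit:.1f}B")  # ~$21.1B
```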

Analysts also appear highly bullish about the company’s upcoming earnings. NVDA’s revenue and EPS for the second quarter (ending July 2024) are expected to grow 110.5% and 135.5% year-over-year to $28.43 billion and $0.64, respectively. For the fiscal year ending January 2025, Street expects the chip company’s revenue and EPS to increase 97.3% and 111.1% year-over-year to $120.18 billion and $2.74, respectively.

Robust Future Growth in the AI Data Center Market

The exponential growth of AI use cases and applications across various sectors, ranging from healthcare and automotive to retail and manufacturing, highlights the critical role of GPUs in enabling these advancements. NVIDIA’s strategic investments in AI and GPU technology and its emphasis on collaboration with cloud providers position the company at the forefront of this burgeoning AI market.

As Nvidia’s high-end server GPUs are essential for training and deploying large AI models, tech giants like Microsoft and Meta Platforms, Inc. (META) have spent billions of dollars buying these chips. Meta CEO Mark Zuckerberg stated his company is “building an absolutely massive amount of infrastructure” that will include 350,000 H100 GPU graphics cards to be delivered by NVDA by the end of 2024.

NVIDIA’s GPUs are sought after by several other tech companies for superior performance, including Amazon, Microsoft Corporation (MSFT), Alphabet Inc. (GOOGL), and Tesla, Inc. (TSLA).

Notably, NVDA owns a 92% market share in data center GPUs. Led by Nvidia, U.S. tech companies dominate the burgeoning market for generative AI, with market shares of 70% to over 90% in chips and cloud services.

According to a Markets and Markets report, the data center GPU market is projected to be worth more than $63 billion by 2028, growing at an impressive CAGR of 34.6% during the forecast period (2024-2028). The rapidly rising adoption of data center GPUs across cloud providers should bode well for Nvidia.

Bottom Line

NVDA’s GPUs represent a game-changer for both cloud providers and investors, driven by superior performance and a compelling return on investment (ROI). The attractive financial benefits of investing in NVIDIA GPUs underscore their value, with cloud providers generating substantial profits from enhanced AI capabilities. This high ROI, particularly in AI inferencing tasks, positions Nvidia as a pivotal player in the burgeoning AI data center market, reinforcing its dominant market share and driving continued growth.

Moreover, Wall Street analysts remain bullish about this AI chipmaker’s prospects. TD Cowen analyst Matthew Ramsay increased his price target on NVDA stock from $140 to $165, while maintaining the Buy rating. “One thing remains the same: fundamental strength at Nvidia,” Ramsay said in a client note. “In fact, our checks continue to point to upside in data center (sales) as demand for Hopper/Blackwell-based AI systems continues to exceed supply.”

“Overall we see a product roadmap indicating a relentless pace of innovation across all aspects of the AI compute stack,” Ramsay added.

Meanwhile, KeyBanc Capital Markets analyst John Vinh reiterated his Overweight rating on NVIDIA stock with a price target of $180. “We expect Nvidia to deliver higher results and higher guidance” with its second-quarter 2025 report, Vinh said in a client note. He added solid demand for generative AI will drive the upside.

As AI applications expand across various key industries, NVIDIA’s continuous strategic innovations and product developments, such as the Blackwell GPU and NVIDIA Inference Microservices, ensure the company remains at the forefront of technological advancement. With a market cap nearing $4 trillion and a solid financial outlook, NVIDIA is well-poised to deliver substantial returns for investors, solidifying its standing as a leader in the AI and GPU technology sectors.

Intel's $8.5 Billion Gamble: Can It Rival Nvidia?

Intel Corporation (INTC), a leading player in the semiconductor industry, is making headlines with its ambitious plans to transform its operations, spurred by a substantial $8.5 billion boost from the CHIPS and Science Act. The roughly $280 billion legislative package, signed into law by President Joe Biden in 2022, aims to bolster U.S. semiconductor manufacturing and research and development (R&D) capabilities.

CHIPS Act funding will help advance Intel’s commercial semiconductor projects at key sites in Arizona, New Mexico, Ohio, and Oregon. Also, the company expects to benefit from a U.S. Treasury Department Investment Tax Credit (ITC) of up to 25% on over $100 billion in qualified investments and eligibility for federal loans up to $11 billion.

Alongside the CHIPS Act funding, INTC announced plans to invest more than $100 billion in the U.S. over five years to expand chipmaking capacity critical to national security and the advancement of cutting-edge technologies, including artificial intelligence (AI).

Notably, Intel is the sole American company that both designs and manufactures leading-edge logic chips. Its strategy focuses on three pillars: achieving process technology leadership, constructing a more resilient and sustainable global semiconductor supply chain, and developing a world-class foundry business. These goals align with the CHIPS Act’s objectives to restore manufacturing and technological leadership to the U.S.

The federal funding represents a pivotal opportunity for INTC to reclaim its position as a chip manufacturing powerhouse, potentially rivaling giants like NVIDIA Corporation (NVDA) and Advanced Micro Devices, Inc. (AMD).

Intel’s Strategic Initiatives to Capitalize on AI Boom

At Computex 2024, INTC introduced cutting-edge technologies and architectures that are well-poised to significantly accelerate the AI ecosystem, from the data center, cloud, and network to the edge and PC.

The company launched Intel® Xeon® 6 processors with E-core (Efficient-core) and P-core (Performance-core) SKUs, delivering enhanced performance and power efficiency for high-density, scale-out workloads in the data center. The first Xeon 6 processor to debut is the Intel Xeon 6 with E-cores (code-named Sierra Forest), available beginning June 4. Xeon 6 processors with P-cores (code-named Granite Rapids) are expected to launch next quarter.

Beyond the data center, Intel is expanding its AI footprint in edge computing and PCs. With over 90,000 edge deployments and 200 million CPUs distributed across the ecosystem, the company has consistently enabled enterprise choice for many years. INTC revealed the architectural details of Lunar Lake, the flagship processor for the next generation of AI PCs.

Lunar Lake is set to make a significant leap in graphics and AI processing capabilities, emphasizing power-efficient compute performance tailored for the thin-and-light segment. It promises up to a 40% reduction in System-on-Chip (SoC) power and over three times the AI compute. It is scheduled for release in the third quarter of 2024, in time for the holiday shopping season.

Also, Intel unveiled pricing for Intel® Gaudi® 2 and Intel® Gaudi® 3 AI accelerator kits, providing high performance at up to one-third lower cost compared to competitive platforms. A standard AI kit, including Intel Gaudi 2 accelerators with a universal baseboard (UBB), is offered to system providers at $65,000. Integrating Xeon processors with Gaudi AI accelerators in a system presents a robust solution to make AI faster, cheaper, and more accessible.

Intel CEO Pat Gelsinger said, “Intel is one of the only companies in the world innovating across the full spectrum of the AI market opportunity – from semiconductor manufacturing to PC, network, edge and data center systems. Our latest Xeon, Gaudi and Core Ultra platforms, combined with the power of our hardware and software ecosystem, are delivering the flexible, secure, sustainable and cost-effective solutions our customers need to maximize the immense opportunities ahead.”

On May 1, INTC achieved a significant milestone of surpassing 500 AI models running optimized on new Intel® Core™ Ultra processors due to the company’s investment in client AI, the AI PC transformation, framework optimizations, and AI tools like OpenVINO™ toolkit. These processors are the industry’s leading AI PC processors, offering enhanced AI experiences, immersive graphics, and optimized battery life.

Solid First-Quarter Performance and Second-Quarter Guidance

During the first quarter that ended March 30, 2024, INTC’s net revenue increased 8.6% year-over-year to $12.72 billion, primarily driven by growth in its personal computing, data center, and AI business. Revenue from the Client Computing Group (CCG), through which Intel continues to advance its mission to bring AI everywhere, rose 31% year-over-year to $7.50 billion.

Furthermore, the company’s non-GAAP operating income was $723 million, compared to an operating loss of $294 million in the previous year’s quarter. Its non-GAAP net income and non-GAAP earnings per share came in at $759 million and $0.18, compared to a net loss and loss per share of $169 million and $0.04, respectively, in the same quarter of 2023.

For the second quarter of fiscal 2024, Intel expects its revenue to come between $12.5 billion and $13.5 billion, and its non-GAAP earnings per share is expected to be $0.10.

Despite its outstanding financial performance and ambitious plans, INTC’s stock has plunged more than 38% over the past six months and nearly 40% year-to-date.

Competing with Nvidia: A Daunting Task

Despite INTC’s solid financial health and strategic moves, the competition with NVDA is fierce. Nvidia’s market performance has been stellar lately, driven by its global leadership in graphics processing units (GPUs) and its foray into AI and machine learning markets. The chip giant has built strong brand loyalty among developers and enterprise customers, which could be challenging for Intel to overcome.

Over the past year, NVIDIA has experienced a significant surge in sales due to high demand from tech giants such as Amazon.com, Inc. (AMZN), Alphabet Inc. (GOOGL), Microsoft Corporation (MSFT), Meta Platforms, Inc. (META), and OpenAI, which have invested billions of dollars in its advanced GPUs essential for developing and deploying AI applications.

Shares of the prominent chipmaker surged approximately 150% over the past six months and more than 196% over the past year. Moreover, NVDA’s stock is up around 2,938% over the past five years. Notably, Nvidia recently became the third U.S. company, after Apple and Microsoft, with a market value surpassing $3 trillion.

As a result, NVDA commands a dominant market share of about 92% in the data center GPU market. Nvidia’s success stems from its cutting-edge semiconductor performance and software prowess. The CUDA development platform, launched in 2006, has emerged as a pivotal tool for AI development, with a user base exceeding 4 million developers.

Bottom Line

The proposed $8.5 billion in funding, along with an investment tax credit and eligibility for CHIPS Act loans, is pivotal to Intel’s bid to regain semiconductor leadership in the face of intense competition, particularly from Nvidia. This substantial federal funding will enhance Intel’s manufacturing and R&D capabilities across its key sites in Arizona, New Mexico, Ohio, and Oregon.

While INTC possesses the resources, technological expertise, and strategic vision to challenge NVDA, the path forward is fraught with challenges. Despite Intel’s recent strides in the AI ecosystem, from the data center to edge and PC with products like Xeon 6 processors and Gaudi AI accelerators, Nvidia’s dominance in data center GPUs remains pronounced, commanding a significant market share.

Future success will depend on Intel’s ability to leverage its manufacturing strengths, introduce innovative product lines, and cultivate a compelling ecosystem of software and developer support. As Intel advances its ambitious plans, industry experts and stakeholders will keenly watch how these developments unfold and redefine the competitive landscape in the AI and data center markets.

How Micron Technology Is Poised to Benefit from AI Investments

Artificial Intelligence (AI) continues revolutionizing industries worldwide, including healthcare, retail, finance, automotive, manufacturing, and logistics, driving demand for advanced technology and infrastructure. Among the companies set to benefit significantly from this AI boom is Micron Technology, Inc. (MU), a prominent manufacturer of memory and storage solutions.

MU’s shares have surged more than 70% over the past six months and nearly 104% over the past year. Moreover, the stock is up approximately 12% over the past month.

This piece delves into the broader market dynamics of AI investments and how MU is strategically positioned to capitalize on these trends, offering insights into how investors might act now.

Broader Market Dynamics of AI Investments

According to Grand View Research, the AI market is expected to exceed $1.81 trillion by 2030, growing at a CAGR of 36.6% from 2024 to 2030. This robust market growth is propelled by the rapid adoption of advanced technologies in numerous industry verticals, increased generation of data, developments in machine learning and deep learning, the introduction of big data, and substantial investments from government and private enterprises.
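As a rough check on that projection, the short Python sketch below backs out the implied 2024 starting market size from the 2030 figure and the stated CAGR; the six-year horizon (2024 to 2030) is the only added assumption.

```python
# Rough check of the Grand View Research projection cited above: back out the
# implied 2024 market size from the 2030 figure and the stated CAGR.

value_2030 = 1.81e12   # projected AI market size in 2030 ($)
cagr = 0.366           # stated compound annual growth rate, 2024-2030
years = 6              # assumed horizon: 2024 -> 2030

implied_2024 = value_2030 / (1 + cagr) ** years
print(f"Implied 2024 AI market size: ~${implied_2024 / 1e9:.0f}B")  # roughly $280B
```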

AI has emerged as a pivotal force in the modern digital era. Tech giants such as Amazon.com, Inc. (AMZN), Alphabet Inc. (GOOGL), Apple Inc. (AAPL), Meta Platforms, Inc. (META), and Microsoft Corporation (MSFT) are heavily investing in research and development (R&D), thereby making AI more accessible for enterprise use cases.

Moreover, several companies have adopted AI technology to enhance customer experience and strengthen their footing in the Industry 4.0 era.

Big Tech has spent billions of dollars in the AI revolution. So far, in 2024, Microsoft and Amazon have collectively allocated over $40 billion for investments in AI-related initiatives and data center projects worldwide.

DA Davidson analyst Gil Luria anticipates these companies will spend over $100 billion this year on AI infrastructure. According to Luria, spending will continue to rise in response to growing demand. Meanwhile, Wedbush analyst Daniel Ives projects continued investment in AI infrastructure by leading tech firms, “This is a $1 trillion spending jump ball over the next decade.”

Micron Technology’s Strategic Position

With a $156.54 billion market cap, MU is a crucial player in the AI ecosystem because it focuses on providing cutting-edge memory and storage products globally. The company operates through four segments: Compute and Networking Business Unit; Mobile Business Unit; Embedded Business Unit; and Storage Business Unit.

Micron’s dynamic random-access memory (DRAM) and NAND flash memory are critical components in AI applications, offering the speed and efficiency required for high-performance computing. The company has consistently introduced innovative products, such as HBM3E, the industry’s fastest, highest-capacity high-bandwidth memory (HBM), designed to advance generative AI innovation.

This month, MU announced it is sampling its next-generation GDDR7 graphics memory with the industry’s highest bit density. With more than 1.5 TB/s of system bandwidth and four independent channels to optimize workloads, Micron GDDR7 memory allows faster response times, smoother gameplay, and reduced processing times. The best-in-class capabilities of Micron GDDR7 will optimize AI, gaming, and high-performance computing workloads.

Notably, Micron recently reached an industry milestone as the first to validate and ship 128GB DDR5 32Gb server DRAM to address the increasing demands for rigorous speed and capacity of memory-intensive Gen AI applications.

Furthermore, MU has forged strategic partnerships with prominent tech companies like NVIDIA Corporation (NVDA) and Intel Corporation (INTC), positioning the company at the forefront of AI technology advancements. In February this year, Micron started mass production of its HBM3E solution for use in Nvidia’s latest AI chip. Micron’s 24GB 8H HBM3E will be part of NVIDIA H200 Tensor Core GPUs, expected to begin shipping in the second quarter.

Also, Micron's 128GB RDIMMs are ready for deployment on the 4th and 5th Gen Intel® Xeon® platforms. In addition to Intel, Micron’s 128GB DDR5 RDIMM memory will be supported by a robust ecosystem, including Advanced Micro Devices, Inc. (AMD), Hewlett Packard Enterprise Company (HPE), and Supermicro, among many others.

Further, in April, MU qualified a full suite of its automotive-grade memory and storage solutions for Qualcomm Technologies Inc.’s Snapdragon Digital Chassis, a comprehensive set of cloud-connected platforms designed to power data-rich, intelligent automotive services. This partnership is aimed at helping the ecosystem build next-generation intelligent vehicles powered by sophisticated AI.

Robust Second-Quarter Financials and Upbeat Outlook

Solid AI demand and constrained supply accelerated Micron’s return to profitability in the second quarter of fiscal 2024, which ended February 29, 2024. MU reported revenue of $5.82 billion, beating analysts’ estimate of $5.35 billion. That compares to $4.74 billion in the prior quarter and $3.69 billion for the same period in 2023.

The company’s non-GAAP gross margin was $1.16 billion, versus $37 million in the prior quarter and negative $1.16 billion for the previous year’s quarter. Micron’s non-GAAP operating income came in at $204 million, compared to operating losses of $955 million and $2.08 billion for the prior quarter and the same period last year, respectively.

MU posted non-GAAP net income and earnings per share of $476 million and $0.42 for the second quarter, compared to non-GAAP net loss and loss per share of $2.08 billion and $1.91 a year ago, respectively. The company’s EPS also surpassed the consensus loss per share estimate of $0.24. During the quarter, its operating cash flow was $1.22 billion versus $343 million for the same quarter of 2023.

“Micron delivered fiscal Q2 results with revenue, gross margin and EPS well above the high-end of our guidance range — a testament to our team’s excellent execution on pricing, products and operations,” said Sanjay Mehrotra, MU’s President and CEO. “Our preeminent product portfolio positions us well to deliver a strong fiscal second half of 2024. We believe Micron is one of the biggest beneficiaries in the semiconductor industry of the multi-year opportunity enabled by AI.”

For the third quarter of fiscal 2024, the company expects revenue of $6.60 billion ± $200 million, and its gross margin is projected to be 26.5% ± 1.5%. Also, Micron expects its non-GAAP earnings per share to be $0.45 ± 0.07.

Bottom Line

MU is strategically positioned to benefit from the burgeoning AI market, driven by its diversified portfolio of advanced memory and storage solutions, strategic partnerships and investments, robust financial health characterized by solid revenue growth and profitability, and expanding market presence.

The company’s recent innovations, including HBM3E and DDR5 RDIMM memory, underscore the commitment to advancing its capabilities across AI and high-performance computing applications.

Moreover, the company’s second-quarter 2024 earnings beat analysts' expectations, supported by the AI boom. Also, Micron offered a rosy guidance for the third quarter of fiscal 2024. Investors eagerly await insights into MU’s financial performance, strategic updates, and outlook during the third-quarter earnings conference call scheduled for June 26, 2024.

Baird Senior Research Analyst Tristan Gerra upgraded MU stock from “Neutral” to “Outperform” and increased his price target from $115 to $150, citing meaningful upside opportunities for the company. Gerra stated that DRAM chip pricing has been rising while supply growth is expected to slow. Also, Morgan Stanley raised its rating on Micron from “Underweight” to “Equal-Weight.”

As AI investments from numerous sectors continue to grow, Micron stands to capture significant market share, making it an attractive option for investors seeking long-term growth in the semiconductor sector.