How Micron Technology Is Poised to Benefit from AI Investments

Artificial Intelligence (AI) continues revolutionizing industries worldwide, including healthcare, retail, finance, automotive, manufacturing, and logistics, driving demand for advanced technology and infrastructure. Among the companies set to benefit significantly from this AI boom is Micron Technology, Inc. (MU), a prominent manufacturer of memory and storage solutions.

MU’s shares have surged more than 70% over the past six months and nearly 104% over the past year. Moreover, the stock is up approximately 12% over the past month.

This piece delves into the broader market dynamics of AI investments and how MU is strategically positioned to capitalize on these trends, offering insights into how investors might act now.

Broader Market Dynamics of AI Investments

According to Grand View Research, the AI market is expected to exceed $1.81 trillion by 2030, growing at a CAGR of 36.6% from 2024 to 2030. This robust market growth is propelled by the rapid adoption of advanced technologies in numerous industry verticals, increased generation of data, developments in machine learning and deep learning, the introduction of big data, and substantial investments from government and private enterprises.
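As a quick sanity check on that projection, the implied 2024 starting point can be backed out of the 2030 figure. This is a back-of-the-envelope sketch; the `implied_base` helper and the six-year compounding assumption are mine, not Grand View Research's:

```python
# If the AI market tops $1.81 trillion in 2030 after growing at a 36.6% CAGR
# from 2024, the implied 2024 base is future_value / (1 + rate)^years.
def implied_base(future_value: float, cagr: float, years: int) -> float:
    """Discount a future market size back by a compound annual growth rate."""
    return future_value / (1 + cagr) ** years

# 2024 -> 2030 treated as six compounding years (an assumption).
base_2024 = implied_base(1.81e12, 0.366, 6)
print(f"Implied 2024 market size: ${base_2024 / 1e9:.0f}B")  # roughly $279B
```

The result, a market of roughly $280 billion today, is what makes the 2030 figure internally consistent with the stated growth rate.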

AI has emerged as a pivotal force in the modern digital era. Tech giants such as Amazon.com, Inc. (AMZN), Alphabet Inc. (GOOGL), Apple Inc. (AAPL), Meta Platforms, Inc. (META), and Microsoft Corporation (MSFT) are heavily investing in research and development (R&D), thereby making AI more accessible for enterprise use cases.

Moreover, several companies have adopted AI technology to enhance customer experience and strengthen their presence in Industry 4.0.

Big Tech has spent billions of dollars on the AI revolution. So far in 2024, Microsoft and Amazon have collectively allocated over $40 billion for investments in AI-related initiatives and data center projects worldwide.

DA Davidson analyst Gil Luria anticipates these companies will spend over $100 billion this year on AI infrastructure. According to Luria, spending will continue to rise in response to growing demand. Meanwhile, Wedbush analyst Daniel Ives projects continued investment in AI infrastructure by leading tech firms, saying, “This is a $1 trillion spending jump ball over the next decade.”

Micron Technology’s Strategic Position

With a $156.54 billion market cap, MU is a crucial player in the AI ecosystem because it focuses on providing cutting-edge memory and storage products globally. The company operates through four segments: Compute and Networking Business Unit; Mobile Business Unit; Embedded Business Unit; and Storage Business Unit.

Micron’s dynamic random-access memory (DRAM) and NAND flash memory are critical components in AI applications, offering the speed and efficiency required for high-performance computing. The company has consistently introduced innovative products, such as HBM3E, the industry’s fastest, highest-capacity high-bandwidth memory (HBM), designed to advance generative AI innovation.

This month, MU announced it is sampling its next-generation GDDR7 graphics memory with the industry’s highest bit density. With more than 1.5 TB/s of system bandwidth and four independent channels to optimize workloads, Micron GDDR7 memory allows faster response times, smoother gameplay, and reduced processing times. These best-in-class capabilities will optimize AI, gaming, and high-performance computing workloads.

Notably, Micron recently reached an industry milestone as the first to validate and ship 128GB DDR5 32Gb server DRAM, addressing the increasing speed and capacity demands of memory-intensive generative AI applications.

Furthermore, MU has forged strategic partnerships with prominent tech companies like NVIDIA Corporation (NVDA) and Intel Corporation (INTC), positioning the company at the forefront of AI technology advancements. In February this year, Micron started mass production of its HBM3E solution for use in Nvidia’s latest AI chip. Micron’s 24GB 8H HBM3E will be part of NVIDIA H200 Tensor Core GPUs, expected to begin shipping in the second quarter.

Also, Micron's 128GB RDIMMs are ready for deployment on the 4th and 5th Gen Intel® Xeon® platforms. In addition to Intel, Micron’s 128GB DDR5 RDIMM memory will be supported by a robust ecosystem, including Advanced Micro Devices, Inc. (AMD), Hewlett Packard Enterprise Company (HPE), and Supermicro, among many others.

Further, in April, MU qualified a full suite of its automotive-grade memory and storage solutions for Qualcomm Technologies Inc.’s Snapdragon Digital Chassis, a comprehensive set of cloud-connected platforms designed to power data-rich, intelligent automotive services. This partnership is aimed at helping the ecosystem build next-generation intelligent vehicles powered by sophisticated AI.

Robust Second-Quarter Financials and Upbeat Outlook

Solid AI demand and constrained supply accelerated Micron’s return to profitability in the second quarter of fiscal 2024, which ended February 29, 2024. MU reported revenue of $5.82 billion, beating analysts’ estimate of $5.35 billion and up from $4.74 billion in the previous quarter and $3.69 billion in the same period of 2023.

The company’s non-GAAP gross margin was $1.16 billion, versus $37 million in the prior quarter and negative $1.16 billion in the previous year’s quarter. Micron’s non-GAAP operating income came in at $204 million, compared to operating losses of $955 million in the prior quarter and $2.08 billion in the same period last year.

MU posted non-GAAP net income and earnings per share of $476 million and $0.42 for the second quarter, compared to non-GAAP net loss and loss per share of $2.08 billion and $1.91 a year ago, respectively. The company’s EPS also surpassed the consensus loss per share estimate of $0.24. During the quarter, its operating cash flow was $1.22 billion versus $343 million for the same quarter of 2023.

“Micron delivered fiscal Q2 results with revenue, gross margin and EPS well above the high-end of our guidance range — a testament to our team’s excellent execution on pricing, products and operations,” said Sanjay Mehrotra, MU’s President and CEO. “Our preeminent product portfolio positions us well to deliver a strong fiscal second half of 2024. We believe Micron is one of the biggest beneficiaries in the semiconductor industry of the multi-year opportunity enabled by AI.”

For the third quarter of fiscal 2024, the company expects revenue of $6.60 billion ± $200 million, and its gross margin is projected to be 26.5% ± 1.5%. Also, Micron expects its non-GAAP earnings per share to be $0.45 ± 0.07.
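The midpoint-plus-or-minus format above expands into simple low/high ranges. A minimal sketch of that arithmetic (the `guidance_range` helper is a hypothetical name, not Micron's):

```python
# Expand "midpoint ± spread" guidance into an explicit (low, high) range.
def guidance_range(midpoint: float, spread: float) -> tuple[float, float]:
    return (midpoint - spread, midpoint + spread)

revenue_low, revenue_high = guidance_range(6.6e9, 0.2e9)  # $6.4B to $6.8B
margin_low, margin_high = guidance_range(26.5, 1.5)       # 25.0% to 28.0%
eps_low, eps_high = guidance_range(0.45, 0.07)            # $0.38 to $0.52
```

So even the low end of the revenue range, $6.4 billion, would top the $5.82 billion just reported for the second quarter.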

Bottom Line

MU is strategically positioned to benefit from the burgeoning AI market, driven by its diversified portfolio of advanced memory and storage solutions, strategic partnerships and investments, robust financial health characterized by solid revenue growth and profitability, and expanding market presence.

The company’s recent innovations, including HBM3E and DDR5 RDIMM memory, underscore its commitment to advancing capabilities across AI and high-performance computing applications.

Moreover, the company’s second-quarter fiscal 2024 earnings beat analysts’ expectations, supported by the AI boom, and Micron offered rosy guidance for the third quarter of fiscal 2024. Investors eagerly await insights into MU’s financial performance, strategic updates, and outlook during the third-quarter earnings conference call scheduled for June 26, 2024.

Baird Senior Research Analyst Tristan Gerra upgraded MU stock from “Neutral” to “Outperform” and raised the price target from $115 to $150, citing meaningful upside opportunities for the company. Gerra noted that DRAM chip pricing has been rising while supply growth is anticipated to slow. Also, Morgan Stanley upgraded Micron from “Underweight” to “Equal-Weight.”

As AI investments from numerous sectors continue to grow, Micron stands to capture significant market share, making it an attractive option for investors seeking long-term growth in the semiconductor sector.

Intel's AI Ambitions: A Strategic Shift Toward Private Data Storage Solutions

Intel Corporation (INTC), a titan in the world of semiconductors, is navigating a period of transformative change that is revolutionizing its corporate culture and product development. Traditionally, Intel’s core offerings have been microprocessors that serve as the brains of desktop PCs, laptops, tablets, and servers. These processors are silicon chips embedded with millions or billions of transistors, each acting as a binary switch that forms the fundamental ‘ones and zeros’ of computer operations.

Today, the thirst for enhanced processing power is insatiable. The proliferation of Artificial Intelligence (AI), which has become integral to essential business operations across almost every sector, exponentially increases the need for robust computing capabilities. AI, particularly neural networks, necessitates enormous computing power and thrives on the collaborative efforts of multiple computing systems. The scope of these AI applications extends far beyond the PCs and servers that initially cemented INTC’s status as an industry leader.

The rapid advancement of AI has prompted Intel to rethink and innovate its chip designs and functionalities. As a result, the company is developing new software and designing interoperable chips while exploring external partnerships to accelerate its adaptation to the evolving computing environment.

Strategic Pivot Toward AI Ecosystem

At Computex 2024, INTC unveiled a series of groundbreaking AI-related announcements, showcasing the latest technologies that merge cutting-edge performance with power efficiency (especially in data centers and for AI on personal computers). The company aims to make AI cheaper and more accessible for everyone.

Intel CEO Pat Gelsinger emphasized how AI is changing the game, stating, “The magic of silicon is once again enabling exponential advancements in computing that will push the boundaries of human potential and power the global economy for years to come.”

In just six months, Intel has moved from launching 5th Gen Intel® Xeon® processors to introducing the pioneering Xeon 6 series. The company also previewed Gaudi AI accelerators, offering enterprise clients a cost-effective GenAI training and inference system. Furthermore, Intel has spearheaded the AI PC revolution by integrating Intel® Core™ Ultra processors in over 8 million devices while teasing the upcoming client architecture slated for release later this year.

These strides underscore Intel's commitment to accelerating execution and driving innovation at an unprecedented pace to democratize AI and catalyze industries.

Strategic Pricing and Availability of Its Gaudi AI Accelerators

Intel is gearing up to launch the third generation of its Gaudi AI accelerators later this year, aiming to address a backlog of around $2 billion related to AI chips. However, the company anticipates generating only about $500 million in Gaudi 3 sales in 2024, possibly due to supply constraints.

To broaden the availability of Gaudi 3 systems, Intel is expanding its network of system providers. The company is now collaborating with Asus, Foxconn, Gigabyte, Inventec, Quanta, and Wistron alongside existing partners like Dell Technologies Inc. (DELL), Hewlett Packard Enterprise Co (HPE), Lenovo Group (LNVGY), and Super Micro Computer, Inc. (SMCI), to ensure Gaudi 3 systems are available far and wide once they hit the market.

But what caught attention at Intel's announcement was the company's attractive pricing strategy. Kits featuring eight Gaudi 2 AI chips and a universal baseboard will cost $65,000, while the version with eight Gaudi 3 AI chips will be priced at $125,000. These prices are estimated to be one-third and two-thirds of the cost of comparable competitive platforms, respectively.
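Those quoted fractions can be inverted to estimate what a comparable competitive platform would cost. This is a rough illustrative sketch only; the `implied_competitor_price` helper is hypothetical, not an Intel figure:

```python
# If a kit costs some fraction of a comparable competitive platform,
# the implied competitor price is kit_price / fraction.
def implied_competitor_price(kit_price: float, fraction_of_competitor: float) -> float:
    return kit_price / fraction_of_competitor

gaudi2_competitor = implied_competitor_price(65_000, 1 / 3)   # ~$195,000
gaudi3_competitor = implied_competitor_price(125_000, 2 / 3)  # ~$187,500
```

In other words, Intel's own framing implies comparable platforms priced near $190,000 per eight-accelerator kit.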

While undercutting Nvidia Corporation (NVDA) on price, INTC expects its chips to deliver impressive performance. According to their estimates, a cluster of 8,192 Gaudi 3 chips can train AI models up to 40% faster than NVDA's H100 chips. Additionally, Gaudi 3 offers up to double the AI inferencing performance of the H100 when running popular large language models (LLMs).

Intel Continues to Ride with 500+ Optimized Models on Core Ultra Processors

In May, INTC announced that over 500 AI models now run optimized on new Intel® Core™ Ultra processors. These processors, known for their advanced AI capabilities, immersive graphics, and optimal battery life, mark a significant milestone in Intel's AI PC transformation efforts.

This achievement stems from Intel's investments in client AI, framework optimizations, and tools like the OpenVINO™ toolkit. The 500+ AI models cover various applications, including large language models, super-resolution, object detection, and computer vision, and are available across popular industry platforms.

The Intel Core Ultra processor is the fastest-growing AI PC processor and the most robust platform for AI PC development. It supports a wide range of AI models, frameworks, and runtimes, making it ideal for AI-enhanced software features like object removal and image super-resolution. This milestone underscores Intel's commitment to advancing AI PC technology, offering users a broad range of AI-driven functionalities for enhanced computing experiences.

Robust Financial Performance and Outlook

Buoyed by solid innovation across its client, edge, and data center portfolios, the company delivered a solid financial performance, driving double-digit revenue growth in its products. Total Intel Products chalked up $11.90 billion in revenue for the first quarter of 2024 (ended March 30), a 17% increase over the prior-year period. Revenue from the Client Computing Group (CCG) rose 31% year-over-year.

INTC’s net revenue increased 8.6% year-over-year to $12.72 billion, primarily driven by growth in its personal computing, data center, and AI business. Intel’s Data Center and AI (DCAI) division, which offers server chips, saw sales uptick 5% to $3.04 billion.

Also, the company reported a non-GAAP operating income of $723 million, compared to an operating loss of $294 million in the prior year’s quarter. Further, its non-GAAP net income and non-GAAP earnings per share came in at $759 million and $0.18 versus a net loss and loss per share of $169 million and $0.04, respectively, in the same quarter last year.

For the second quarter, Intel expects its revenue to come between $12.5 billion and $13.5 billion, while its non-GAAP earnings per share is expected to be $0.10.

Bottom Line

Despite vital innovations and solid financial performance, INTC’s shares have lost nearly 40% year-to-date and more than 3% over the past 12 months. However, with over 5 million AI PCs shipped since the December 2023 launch of Intel Core Ultra processors, supported by over 100 software vendors, the company expects to exceed its forecast of 40 million AI PCs by the end of 2024.

With the growing demand for AI chips, INTC could see a significant increase in Gaudi chip sales next year as customers look for cost-effective alternatives to NVDA's market-leading products. Moreover, if Intel's reasonable pricing resonates with prospective customers, the company could capture significant market share from its competitors.

Why Super Micro Computer (SMCI) Could Be a Hidden Gem for Growth Investors

In March 2024, Super Micro Computer, Inc. (SMCI) became the latest artificial intelligence (AI) company to join the S&P 500 index, just a little more than a year after joining the S&P MidCap 400 in December 2022. Shares of SMCI jumped by more than 2,000% in the past two years, driven by robust demand for its AI computing products, which led to rapid sales growth.

Moreover, SMCI’s stock has surged nearly 205% over the past six months and more than 520% over the past year. A historic rally in the stock has pushed the company’s market cap past $48 billion.

SMCI is a leading manufacturer of IT solutions and computing products, including storage and servers tailored for enterprise and cloud data centers, purpose-built for use cases such as AI, cloud computing, big data, and 5G applications. The company has significantly benefited from the ongoing AI boom in the technology sector.

According to ResearchAndMarkets.com’s report, the global AI server market is expected to reach $50.65 billion by 2029, growing at a CAGR of 26.5% during the forecast period (2024-2029).

Specializing in servers and computer infrastructure, SMCI maintains long-term alliances with major tech companies, including Nvidia Corporation (NVDA), Intel Corporation (INTC), and Advanced Micro Devices, Inc. (AMD), which have fueled the company’s profitability and growth.

Let’s discuss Super Micro Computer’s fundamentals and growth prospects in detail:

Recent Strategic Developments

On April 9, SMCI announced its X14 server portfolio with future support for the Intel® Xeon® 6 processor via early access programs. Supermicro’s Building Block Architecture, rack plug-and-play, and liquid cooling solutions, along with the breadth of the new Intel Xeon 6 processor family, enable the delivery of optimized solutions for any workload at any scale, offering superior performance and efficiency.

The upcoming processor family will be available with Efficient-core (E-core) SKUs, which improve performance-per-watt for cloud, networking, analytics, and scale-out workloads, and Performance-core (P-core) SKUs, which increase performance-per-core for AI, HPC, storage, and edge workloads.

Also, the upcoming processor portfolio will feature built-in Intel Accelerator Engines with new support for FP16 on Intel Advanced Matrix Extensions.

In the same month, SMCI expanded its edge compute portfolio to accelerate IoT and edge AI workloads with a new generation of embedded solutions.

“We continue to expand our system product line, which now includes servers that are optimized for the edge and can handle the demanding workloads where massive amounts of data are generated,” said Charles Liang, president and CEO of SMCI.

“Our building block architecture allows us to design and deliver a wide range of AI servers that give enterprises the solutions they need, from the edge to the cloud. Our new Intel Atom-based edge systems contain up to 16GB of memory, dual 2.5 GbE LAN ports, and a NANO SIM card slot, which enables AI inferencing at the edge where most of the world's data is generated,” Liang added.

Also, on March 19, Supermicro unveiled its newest lineup aimed at accelerating the deployment of generative AI. The Supermicro SuperCluster solutions offer foundational building blocks for the present and the future large language model (LLM) infrastructure.

The full-stack SuperClusters include air- and liquid-cooled training and cloud-scale inference rack configurations with the latest NVIDIA Tensor Core GPUs, Networking, and NVIDIA AI Enterprise software.

Further, SMCI announced new AI systems for large-scale generative AI featuring NVIDIA’s next generation of data center products, including the latest NVIDIA GB200 Grace™ Blackwell Superchip and the NVIDIA B200 and B100 Tensor Core GPUs.

Supermicro is upgrading its existing NVIDIA HGX™ H100/H200 8-GPU systems for seamless integration with the NVIDIA HGX™ B100 8-GPU, thus reducing time to delivery. Also, the company strengthens its broad NVIDIA MGX™ systems range with new offerings featuring the NVIDIA GB200, including the NVIDIA GB200 NVL72, a comprehensive rack-level solution equipped with 72 NVIDIA Blackwell GPUs.

Additionally, Supermicro is introducing new systems to its portfolio, including the 4U NVIDIA HGX B200 8-GPU liquid-cooled system.

Solid Third-Quarter 2024 Results

For the third quarter that ended March 31, 2024, SMCI’s revenue increased 200.8% year-over-year to $3.85 billion. Its non-GAAP gross profit grew 163.9% from the year-ago value to $600.59 million. Its non-GAAP income from operations was $434.42 million, up 290.7% year-over-year.

The server assembler’s non-GAAP net income rose 340% from the prior year’s quarter to $411.54 million. Its non-GAAP net income per common share came in at $6.65, an increase of 308% year-over-year.

As of March 31, 2024, Super Micro Computer’s cash and cash equivalents stood at $2.12 billion, compared to $440.46 million as of June 30, 2023. The company’s total current assets were $8.06 billion versus $3.18 billion as of June 30, 2023.

Charles Liang, President and CEO of Supermicro, said, “Strong demand for AI rack scale PnP solutions, along with our team’s ability to develop innovative DLC designs, enabled us to expand our market leadership in AI infrastructure. As new solutions ramp, including fully production ready DLC, we expect to continue gaining market share.”

Raised Full-Year Revenue Outlook

SMCI expects net sales of $5.10 billion to $5.50 billion for the fourth quarter of fiscal year 2024 ending June 30, 2024. The company’s non-GAAP net income per share is anticipated to be between $7.62 and $8.42.

For the fiscal year 2024, Supermicro raised its guidance for revenues from a range of $14.30 billion to $14.70 billion to a range of $14.70 billion to $15.10 billion. Its non-GAAP net income per share is expected to be from $23.29 to $24.09.
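Comparing the midpoints of the old and new ranges shows the size of that raise. A simple arithmetic check on the figures quoted above:

```python
# Midpoints of SMCI's prior and raised FY2024 revenue guidance ranges.
old_mid = (14.30e9 + 14.70e9) / 2  # $14.50B
new_mid = (14.70e9 + 15.10e9) / 2  # $14.90B

raise_pct = (new_mid / old_mid - 1) * 100
print(f"Guidance raised by ${(new_mid - old_mid) / 1e9:.1f}B ({raise_pct:.1f}%)")
```

A $400 million (roughly 2.8%) bump to the midpoint, announced alongside a quarter of 200% revenue growth, signals management's confidence in sustained AI demand.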

CEO Charles Liang said he expects AI growth to remain solid for several quarters, if not years, to come. To support this rapid growth, the company had to raise capital through a secondary offering this year, Liang added.

Meanwhile, finance chief David Weigand said that the company’s supply chain continues to improve.

Bottom Line

SMCI’s fiscal 2024 third-quarter results were exceptional, with record revenue of $3.85 billion and non-GAAP EPS of $6.65. Year-over-year revenue growth of 200% and non-GAAP EPS growth of 308% significantly outpaced its industry peers.

After reporting outstanding financial performance, the company raised its full-year revenue forecast as it points to solid AI demand.

Super Micro Computer, which joined the S&P 500 in March, has a unique edge among server manufacturers aiming to capitalize on the generative AI boom. Notably, the server maker’s close ties with Nvidia allow it to launch products superior to competitors, including Dell Technologies Inc. (DELL) and Hewlett Packard Enterprise Company (HPE).

The company has a history of being among the first to receive AI chips from NVDA and AMD as it assists them in checking server prototypes, giving it a head start over rivals. This has positioned SMCI as a key supplier of servers crucial for generative AI applications, leading to a remarkable 192% surge in shares so far this year.

According to an analyst at Rosenblatt Securities, Hans Mosesmann, “Super Micro has developed a model that is very, very quick to market. They usually have the widest portfolio of products when a new product comes out from Nvidia or AMD or Intel.”

Moreover, analysts at Bank of America project that SMCI’s share of the AI server market will expand to around 17% in 2026 from 10% in 2023. Argus analyst Jim Kelleher also seems bullish about SMCI. Kelleher maintained a Buy rating on SMCI’s stock.

According to the analyst, Super Micro Computer is a leading server provider for the era of generative AI. Alongside a comprehensive range of rack and blade servers for cloud, enterprise, data center, and other applications, SMCI offers GPU-based systems for deep learning, high-performance computing, and various other applications.

Given solid financials, accelerating profitability, and robust near-term growth outlook, investors could consider buying this stock for substantial gains.

Is Intel (INTC) a Buy, Sell, or Hold Amidst Tough Competition?

Intel Corporation (INTC), a prominent semiconductor company, is currently navigating a challenging phase characterized by a dwindling financial outlook and difficulties sustaining competitiveness within the semiconductor industry. Intel lags behind many tech stocks in the S&P 500 this year, while rival chipmaker NVIDIA Corporation (NVDA) has emerged as the third-best performer in the index.

Now, we will evaluate the risks and opportunities associated with investing in Intel amidst competitive pressures.

Strategic Initiatives to Keep up With the Fierce Competition

Amid escalating competition in the tech arena, INTC, the foremost producer of processors driving PCs and laptops, has aggressively expanded its presence in the AI domain to keep pace with its peers.

Last month, the company announced the creation of the world's largest neuromorphic system, dubbed Hala Point, which is powered by Intel's Loihi 2 processor. Initially deployed at Sandia National Laboratories, this system supports research for future brain-inspired AI and addresses challenges concerning AI efficiency and sustainability.

On April 9, Intel also unveiled a new AI chip called Gaudi 3, intended to challenge NVDA’s dominance in graphics processing units. The new chip boasts over twice the power efficiency of NVDA’s H100 GPU and can run AI models one-and-a-half times faster. The company expects more than $500 million in sales from its Gaudi 3 chips in the second half of the year.

In March, Reuters reported that INTC plans to spend $100 billion across four U.S. states to build and expand factories, bolstered by $19.5 billion in federal grants and loans (with an additional $25 billion in tax incentives in sight). CEO Pat Gelsinger envisions transforming vacant land near Columbus, Ohio, into "the largest AI chip manufacturing site globally" by 2027, forming the cornerstone of Intel's ambitious five-year spending plan.

Such advancements enable the company to stay competitive and meet the growing demand for AI-driven solutions across various industries.

Solid First-Quarter Performance but Shaky Outlook

For the first quarter that ended March 30, 2024, INTC’s net revenue surged 8.6% year-over-year to $12.72 billion, primarily driven by growth in its personal computing, data center, and AI business. However, its revenue from the Foundry unit amounted to $4.40 billion, down about 10% year-over-year.

Intel’s gross margin grew 30.2% from the prior year’s quarter to $5.22 billion. Also, it reported a non-GAAP operating income of $723 million, compared to an operating loss of $294 million in the prior year’s quarter. Further, its non-GAAP net income and non-GAAP earnings per share came in at $759 million and $0.18 versus a net loss and loss per share of $169 million and $0.04, respectively, in the same quarter last year.

The solid financial performance underscores the vital innovation across its client, edge, and data center portfolios, driving double-digit product revenue growth. Total Intel Products chalked up $11.90 billion in revenue for the first quarter of 2024, a 17% increase over the prior-year period. Revenue from its Client Computing Group (CCG) rose 31% year-over-year.

However, the company lowered its outlook for the second quarter of 2024, guiding for revenue between $12.5 billion and $13.5 billion and non-GAAP earnings per share of $0.10.

Following the company's weak guidance for the ongoing quarter, Intel shares nosedived as much as 13% on Friday morning, overshadowing its first-quarter earnings beat. Also, the stock has plunged nearly 15% over the past six months and more than 39% year-to-date.

Bottom Line

INTC surpassed analyst estimates on the top and bottom lines in the first quarter of 2024, but achieving full recovery appears challenging. The chipmaker provided a weak outlook for the second quarter, validating concerns about its ongoing struggle to capitalize on the AI boom amid competition pressures.

Looking ahead, analysts expect INTC’s revenue to increase marginally year-over-year to $13.09 billion for the quarter ending June 2024. However, the company’s EPS for the current quarter is expected to fall 16.2% from the prior year’s period to $0.11.

For the fiscal year 2024, the consensus revenue and EPS estimates of $56.06 billion and $1.10 indicate increases of 3.4% and 5.2% year-over-year, respectively.

Recently, Goldman Sachs analysts slashed their price target for Intel stock by $5 to $34 per share and reaffirmed a ‘Sell’ rating in light of heightened competition in the artificial intelligence landscape.

Toshiya Hari noted that the company’s weak guidance was due to delayed recovery in traditional server demand, driven by cloud and enterprise customers' focus on AI infrastructure spending. As a result, it could lead INTC to lose market share to competitors like NVDA and Arm Holdings plc (ARM) in the data center computing market.

Moreover, analysts at Bank of America decreased their price target on the stock from $44 to $40, citing rising costs, slower growth prospects, and intensified competition.

Additionally, INTC’s elevated valuation exacerbates market sensitivity. In terms of forward non-GAAP P/E, the stock trades at 27.58x, 18.9% above the industry average of 23.19x. Furthermore, its forward EV/Sales of 2.93x is 5.7% higher than the industry average of 2.77x. And the stock’s forward EV/EBIT of 31.80x compares to the industry average of 19.07x.
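Those premiums follow from simple ratio arithmetic. As an illustrative check on the quoted figures (the `premium_pct` helper is my naming, not from the source data):

```python
# A multiple's premium over the industry average, expressed as a percentage.
def premium_pct(stock_multiple: float, industry_multiple: float) -> float:
    return (stock_multiple / industry_multiple - 1) * 100

pe_premium = premium_pct(27.58, 23.19)       # ~18.9%, matching the article
ev_ebit_premium = premium_pct(31.80, 19.07)  # ~66.8% above the average
```

The EV/EBIT gap is the starkest: the stock trades at roughly two-thirds more than the industry average on that measure.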

Also, the stock’s trailing-12-month gross profit and EBIT margins of 41.49% and 1.29% are 14.7% and 73.1% lower than the industry averages of 48.64% and 4.80%, respectively. Likewise, its asset turnover ratio of 0.29x trails the industry average of 0.61x.

Given this backdrop, while we wouldn’t recommend investing in INTC now, keeping a close eye on the stock seems prudent.

How Investors Can Seize Opportunities in NVDA Amid Market Volatility

According to Todd Gordon, the founder of Inside Edge Capital, NVIDIA Corporation (NVDA) is a strong buy despite a recent pullback. The chart analyst also set a target price of $1,150 for the stock.

“I say that NVDA is just resting its legs gearing up for another move, but this time it's bringing more friends along for the run. There are quite a few different names in the semi-industry setup in a similar fashion telling me that once again the chips are ready to rip,” Gordon said.

Moreover, on March 13, Bank of America maintained its buy rating on NVDA and raised its price target from $925 to $1,100. As per BofA analyst Vivek Arya, Nvidia is expected to dominate the $90 billion accelerator market in 2024, unaffected by Google’s new CPU launch.

Last month, CNBC’s Jim Cramer suggested investors welcome an impending pullback. “I think people are right to expect a pullback here,” Cramer said. “But that’s not a reason to head for the hills. Instead, you want to raise a little cash, watch the market broaden — as it is doing — and then buy your favorite tech stocks when they come down.”

In particular, Cramer said there may be an attractive opportunity to invest in one of his favorite stocks, NVDA. He pointed to his continued support for the tech giant over the years, even when the stock witnessed significant losses. While some on Wall Street might be growing weary of AI, Cramer emphasized that the future “runs on Nvidia.”

“If you don’t own Nvidia already, you know what? You’re about to get a sale,” he stated. “And if you do own it already, just stick with it, because it’s way too hard to swap out and then swap back in at the right level.”

Shares of NVDA have surged more than 75% year-to-date and nearly 223% over the past year. However, the stock has plunged around 3% over the past month.

Now, let’s discuss in detail factors that could influence NVDA’s performance in the near term:

Fourth-Quarter Beat on Revenue and Earnings

The chip giant reported fourth-quarter fiscal 2024 earnings that beat analysts’ expectations. For the quarter that ended January 28, 2024, NVDA’s revenue came in at $22.10 billion, surpassing analysts’ estimate of $20.55 billion, compared to $6.05 billion in the year-ago quarter.

The company posted record Data Center segment revenue of $18.4 billion, up 409% year-over-year, and achieved significant progress in this business. In collaboration with Google, NVDA launched optimizations across its data center and PC AI platforms for Gemma, Google’s family of open language models.

Further, the company expanded its partnership with Amazon Web Services (AWS) to host NVIDIA® DGX™ Cloud on AWS.

Regarding technological innovations, NVIDIA introduced several groundbreaking solutions, including NVIDIA NeMo™ Retriever. It is a generative AI microservice that enables enterprises to connect custom large language models with enterprise data, delivering highly accurate responses for various AI applications.

Additionally, NVIDIA launched NVIDIA MONAI™ cloud APIs, facilitating the seamless integration of AI into medical-imaging offerings for developers and platform providers.

The company’s Gaming revenue for the quarter was $2.90 billion, up 56% year-over-year. Among recent developments in the Gaming division, NVIDIA launched GeForce RTX™ 40 SUPER Series GPUs, starting at $599, featuring advanced RTX™ technologies such as DLSS 3.5 Ray Reconstruction and NVIDIA Reflex for enhanced gaming experiences.

The company also introduced microservices for the NVIDIA Avatar Cloud Engine, enabling game and application developers to integrate state-of-the-art generative AI models into non-playable characters, enhancing immersion and interactivity in virtual worlds.

NVIDIA’s non-GAAP operating income increased 563.2% year-over-year to $14.75 billion. Also, the company’s non-GAAP net income grew 490.6% from the previous year’s period to $12.84 billion. It reported non-GAAP earnings per share of $5.16, compared to the consensus estimate of $4.63, and up 486% year-over-year.

Furthermore, the company’s non-GAAP free cash flow was $11.22 billion, an increase of 546.1% from the previous year’s quarter. Its total current assets stood at $44.35 billion as of January 28, 2024, compared to $23.07 billion as of January 29, 2023.

During a call with analysts, Nvidia CEO Jensen Huang addressed investor concerns regarding the company's ability to sustain its current growth or sales levels throughout the year.

“Fundamentally, the conditions are excellent for continued growth” in 2025 and beyond, Huang told analysts. He added that the continued demand for the company’s GPUs would persist, driven by the adoption of generative AI and an industry-wide shift from central processors to Nvidia's accelerators.

For the first quarter of fiscal 2025, NVIDIA expects revenue of $24 billion. The company’s non-GAAP gross margin is expected to be 77%.

Recent Announcement of AI Chips During Nvidia GTC AI Conference

NVDA announced a new generation of AI chips and software tailored for running AI models during its GTC developer conference at the SAP Center in San Jose, California, on March 18. This announcement underscores the chipmaker’s efforts to solidify its position as the go-to supplier for AI companies.

The new generation of AI graphics processors is named Blackwell. The first Blackwell chip is the GB200, anticipated to ship later this year. It will also be available as an entire server, the GB200 NVL72, combining 72 Blackwell GPUs with other Nvidia parts designed to train AI models. By offering markedly more powerful chips, NVIDIA aims to entice customers into placing new orders.

The announcement comes as companies and software makers still scramble to get their hands on the current “Hopper” H100s and similar chips.

“Hopper is fantastic, but we need bigger GPUs,” Nvidia CEO Jensen Huang said at the company’s developer conference.

Further, the tech giant unveiled revenue-generating software called NIM, which stands for Nvidia Inference Microservices, as part of its Nvidia enterprise software subscription. NIM simplifies running AI models (inference) on older Nvidia GPUs, enabling companies to keep leveraging the hundreds of millions of Nvidia GPUs they already own.

According to Nvidia executives, the company is transitioning from primarily being a mercenary chip provider to becoming more of a platform provider, like Microsoft Corporation (MSFT) or Apple Inc. (AAPL), on which other firms can build software.

Analysts at Goldman Sachs retained a buy rating of NVDA stock and raised their price target to $1,000 from $875. They expressed “renewed appreciation” for Nvidia’s innovation, customer and partner relationships, and vital role in the generative AI space after the company’s keynote.

“Based on our recent industry conversations, we expect Blackwell to be the fastest ramping product in Nvidia’s history,” the analysts said. “Nvidia has played (and will continue to play) an instrumental role in democratizing AI across many industry verticals.”

Bottom Line

NVDA surpassed Wall Street’s estimates for earnings and sales in the fourth quarter of fiscal 2024. The chipmaker has significantly benefited from the recent technology industry obsession with large AI models, which are developed on its pricey graphics processors for servers.

Moreover, sales from the company’s Data Center business comprise most of its revenue. NVDA’s Data Center platform is fueled by demand for data processing, training, and inference from large cloud-service providers, GPU-specialized providers, enterprise software firms, and consumer internet companies.

Further, vertical industries, led by automotive, financial services, and healthcare, are now at a multibillion-dollar level.

The data center GPU market is projected to be worth more than $63 billion by 2028, growing at a staggering CAGR of 34.6% during the forecast period (2024-2028). The increasing adoption of data center GPUs in enterprises should bode well for NVDA.
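As a rough illustration of what that growth rate implies (my arithmetic, not a figure from the forecast itself), a 34.6% CAGR through 2028 means the quoted $63 billion market would compound from a 2024 base of roughly $19 billion. The compounding can be sketched as:

```python
# Hedged sketch: back out the implied 2024 base from the quoted
# 2028 market size and CAGR. The base value is an inference for
# illustration, not a published figure.

def implied_base(end_value: float, cagr: float, years: int) -> float:
    """Reverse a compound-growth projection: start = end / (1 + r)^n."""
    return end_value / (1 + cagr) ** years

base_2024 = implied_base(end_value=63e9, cagr=0.346, years=4)
print(f"Implied 2024 base: ${base_2024 / 1e9:.1f}B")  # roughly $19.2B
```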

Analysts expect NVDA’s revenue and EPS for the fiscal 2025 first quarter (ending April 2024) to increase 237.7% and 405.9% year-over-year to $24.29 billion and $5.51, respectively. Moreover, the company has topped consensus revenue and EPS estimates in all four trailing quarters, which is remarkable.

Furthermore, for the fiscal year ending January 2025, the company’s revenue and EPS are expected to grow 83% and 92.1% from the prior year to $111.49 billion and $24.89, respectively.
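As a quick sanity check on those consensus figures (my own arithmetic, assuming the percentages and dollar estimates above), the first-quarter growth rates imply year-ago bases of about $7.19 billion in revenue and $1.09 in EPS:

```python
# Hedged sketch: back out the year-ago base implied by a consensus
# estimate and its year-over-year growth rate:
#   base = estimate / (1 + growth)

def implied_prior(estimate: float, yoy_growth: float) -> float:
    """Invert a year-over-year growth figure to recover the prior-year base."""
    return estimate / (1 + yoy_growth)

# Q1 FY2025 consensus cited above: $24.29B revenue (+237.7%), $5.51 EPS (+405.9%).
prior_revenue = implied_prior(24.29e9, 2.377)  # ~ $7.19B
prior_eps = implied_prior(5.51, 4.059)         # ~ $1.09
print(f"Implied year-ago revenue: ${prior_revenue / 1e9:.2f}B")
print(f"Implied year-ago EPS: ${prior_eps:.2f}")
```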

NVDA has achieved significant progress across its business divisions, and this year it will bring new product cycles with exceptional innovations to help drive the industry forward.

Since the AI boom began in late 2022, catalyzed by OpenAI’s ChatGPT, Nvidia’s stock has risen roughly fivefold, and its total sales have more than tripled. The company’s high-end server GPUs are essential for training and deploying large AI models. Notably, tech companies like MSFT and Meta Platforms, Inc. (META) have spent billions of dollars buying these chips.

Recently, the chipmaker announced a new generation of AI chips and software for running AI models, giving customers another reason to stick with Nvidia chips over a growing field of competitors, including Advanced Micro Devices, Inc. (AMD) and Intel Corporation (INTC).

While NVDA’s stock has declined nearly 3% over the past month, several analysts have affirmed their bullish sentiment toward the stock and see significant upside potential, owing to its booming AI business and innovative new launches that should help it maintain its leading position in the face of rising competition.

Given these factors, investors could consider buying NVDA for potential gains.