Beyond Nvidia: Three AI Infrastructure Plays Positioned to Outperform by 2028

The Motley Fool | 6 min read
Key Takeaway

Broadcom, AMD, and Micron are positioned to outperform Nvidia in AI infrastructure by 2028, driven by custom chips, CPU strength, and memory demand.

Broadcom ($AVGO), Advanced Micro Devices ($AMD), and Micron Technology ($MU) are emerging as compelling alternatives to Nvidia ($NVDA) in the artificial intelligence infrastructure race, with analysts projecting these chipmakers could deliver superior returns through 2028 as the AI market matures and diversifies beyond graphics processing units.

While Nvidia has dominated headlines and market sentiment as the primary beneficiary of the AI boom, a structural shift in how enterprises build artificial intelligence infrastructure is creating tailwinds for a broader set of semiconductor suppliers. The transition from centralized GPU architectures to specialized, custom silicon solutions is opening new growth vectors for companies positioned at different layers of the AI supply chain. This divergence suggests the AI infrastructure market may ultimately reward more players than the conventional narrative has suggested.

The Case for Broadcom, AMD, and Micron

Broadcom stands out as a primary beneficiary of the enterprise shift toward custom AI silicon. The company is well-positioned to capture significant revenue from the growing demand for specialized chips tailored to individual company needs. Management projects over $100 billion in AI-related revenue by fiscal year 2027, a figure that underscores the scale of this opportunity. This projection assumes continued adoption of custom chip designs by major cloud providers and tech companies seeking performance advantages and cost optimization in their AI deployments.

Advanced Micro Devices ($AMD) represents another compelling opportunity, despite holding a distant second position in discrete GPU market share. The company benefits from:

  • Strategic partnerships with leading AI companies including OpenAI and Meta, providing stable demand channels
  • Leadership in data center CPUs specifically architected for agentic AI applications, where processors require different optimization profiles than traditional inference workloads
  • Growing market share in GPU competition as enterprises diversify their AI chip suppliers to avoid single-vendor dependency
  • Cost-competitive alternatives to Nvidia's offerings, creating pricing pressure that encourages adoption

Micron Technology ($MU) operates at a different but equally critical layer of the AI infrastructure stack: memory. The company benefits from several converging trends:

  • Exceptional valuation, currently trading at a discount to peers despite strong AI tailwinds
  • Structural demand growth for high-bandwidth memory (HBM) and advanced DRAM required for AI model training and inference
  • Improved supply dynamics, with longer-term contracts replacing spot market volatility that plagued the sector in prior cycles
  • Market consolidation benefits, as fewer memory suppliers face intense demand from a limited number of large hyperscaler customers

Market Context: The AI Infrastructure Evolution

The conventional perception of the AI boom concentrates benefits on Nvidia, which captured roughly 80% of the discrete GPU market in 2024. However, the infrastructure supporting artificial intelligence is far more complex than GPUs alone. As AI deployments scale and mature, enterprises are discovering that specialized, purpose-built silicon often outperforms general-purpose solutions on key metrics including power efficiency, cost per inference, and application-specific performance.

This trend mirrors historical semiconductor cycles, in which early winners on emerging technology platforms eventually faced competition from specialized providers. The CPU market, for example, began with near-total Intel dominance before ARM-based processors, custom silicon, and AMD gained substantial share as the market matured.

Broadcom's custom chip opportunity reflects this dynamic directly. Major cloud providers—including Amazon Web Services, Google Cloud, and Microsoft Azure—have invested billions in developing proprietary silicon optimized for their specific AI workloads. These companies seek suppliers capable of executing complex chip designs with manufacturing partners like Taiwan Semiconductor Manufacturing Company ($TSM). Broadcom possesses design expertise, customer relationships, and manufacturing coordination capabilities that position it as a natural partner for this trend.

AMD's strength in data center CPUs deserves particular attention as AI applications evolve. The current AI boom emphasizes large language models trained on massive datasets, a workload that favors GPUs. However, the emerging frontier of agentic AI—systems that autonomously complete complex tasks over extended periods—requires substantial CPU resources for decision-making, memory management, and orchestration. AMD's EPYC processor family, which has captured significant data center market share from Intel over the past five years, positions the company to benefit from this architectural shift.

The memory market dynamics supporting Micron reflect a different but equally important trend. AI model training requires enormous memory bandwidth to feed accelerators with data. High-bandwidth memory has transitioned from niche specialty product to critical constraint on AI system performance. Micron's HBM capabilities, combined with its strength in DRAM for data center applications, make it a beneficiary of this bottleneck.

Investor Implications: Diversification and Risk Mitigation

For investors, this analysis suggests several important implications about the structure of AI-driven returns:

Concentration risk in Nvidia is elevated: While the company remains a formidable competitor with substantial market share and brand power, the market's extreme concentration in a single stock has reached levels inconsistent with typical technology cycles. Nvidia's valuation reflects optimistic assumptions about perpetual market share maintenance, leaving limited room for disappointment.

Supply chain breadth creates opportunity: Unlike prior technology transitions where winners emerged early and maintained dominance, the AI infrastructure market appears large enough to support multiple substantial winners across different layers. This increases the probability that disciplined, well-positioned competitors can generate strong returns without dethroning the market leader.

Valuation dispersion suggests mispricing: Broadcom, AMD, and Micron currently trade at more modest valuations than Nvidia relative to their AI exposure and growth rates. This disparity suggests market participants have not fully priced in the contribution of AI revenue to these companies' results.

Macro factors favor diversification: A potential economic slowdown could create pressure on GPU demand if enterprise AI spending moderates. However, companies selling into the infrastructure supply chain at lower system price points might prove more resilient than companies depending on large upfront accelerator purchases.

The semiconductor industry has historically rewarded investors who identified second-order beneficiaries of major technological transitions. The personal computer boom benefited Intel and Microsoft, but also foundries, memory makers, and component suppliers. The smartphone revolution enriched not only Apple and Google but also companies providing display technology, sensors, and manufacturing services. The emerging structure of AI infrastructure suggests this pattern may repeat.

As 2028 approaches, investors should monitor whether these three companies successfully capture the opportunities outlined above. Broadcom's ability to win custom AI chip designs, AMD's progress in CPU-centric agentic AI applications, and Micron's position in memory supply chains will determine whether they can deliver outperformance versus a Nvidia that may face increasing competition and margin pressure as the market matures.

Source: The Motley Fool
