AI’s appetite for compute is insatiable. Nvidia’s CEO admits the industry vastly underestimated how much computation AI would need — by his estimate, easily 100 times more than was expected a year ago. The AI revolution isn’t just accelerating; it’s consuming everything in its path.
Nvidia’s dominance is undeniable. The company shipped 1.3 million Hopper GPUs to its top four customers in 2024, while Blackwell shipments have already reached 3.6 million in 2025. The demand is relentless, and the stakes are higher than ever.
This is an arms race in silicon. AI isn’t just advancing—it’s devouring resources at an unprecedented pace. The future belongs to those who can keep up.
$NVDA DC REVENUE VS DATA CENTER CAPEX PROJECTION
— Wall St Engine (@wallstengine) March 18, 2025
$NVDA, CUDA-X FOR EVERY INDUSTRY
— Wall St Engine (@wallstengine) March 18, 2025
Nvidia's Hopper shipments to its top 4 customers in 2024 reached 1.3 million, while Blackwell GPU shipments so far in 2025 have already hit 3.6 million. $NVDA CEO: "AI is going through an inflection point."
— Wall St Engine (@wallstengine) March 18, 2025
$NVDA CEO: 'The computation requirement of the scaling law of AI is more resilient and, in fact, hyper-accelerated. The amount of computation we need at this point, as a result of agentic AI and reasoning, is easily a hundred times more than we thought we needed this time last…
— Wall St Engine (@wallstengine) March 18, 2025
$NVDA CEO on scaling laws: They're more resilient than we ever imagined, and we need 100x more compute for reasoning AI inference than we originally thought. 'Each one of these waves opens up new market opportunities for us,' says Huang.
— Wall St Engine (@wallstengine) March 18, 2025