Big Moves as Nvidia & OpenAI Just Committed the Power of 10 Nuclear Plants to AI
Is this the moment AI shifts from experimentation to an all-out industrial arms race?

Good morning from The Midas Report.
Like caffeine, but for understanding AI.
Nvidia has just injected its own shot of caffeine into OpenAI.
Because when Nvidia and OpenAI say they’re building ten gigawatts of AI infrastructure, they’re not just flexing.
That’s roughly the energy output of ten nuclear power plants, all pointed at one thing: accelerating AI.
This is the single biggest hardware commitment to AI we’ve seen. Not just more chips, but entire data center ecosystems, all purpose-built to train and run ever larger frontier models.
It’s the clearest sign yet that AI is entering its industrial age, and the costs to compete are about to get stratospheric.
For Nvidia, this marks a shift from arms dealer to co-architect. It’s no longer content with just selling GPUs. Now it’s embedding itself upstream in deployment and downstream in how models reach developers.
That influence will ripple across everything from funding dynamics to pricing power. If you’re a startup that’s just wrapping APIs around other models, this should make you sweat.
OpenAI, for its part, may be anchoring the next phase of its expansion in a bespoke, Nvidia-powered hardware stack, which could give tighter integration and better throughput than relying on generic cloud infrastructure.
Translation: performance will matter more, scale differences will widen, and latency becomes a moat.
This also changes who gets to innovate, and how. If access to massive-scale compute is only available through a few partnership channels, say, Azure, AWS, or the OpenAI API, then distribution may matter more than originality, at least in the short term.
Smart founders will optimize for what runs well on these rails, not just what sounds good in a demo.
And you’ll want to pick your battles carefully. Inference cost is the quiet killer right now, and it’s about to get louder.
With supply still tight and GPU demand not slowing down, the top performers won't just be fast.
They’ll be cheap to serve, too.

🧠 The Download
Ex-Palantir founders raise $175M to scale AI infrastructure startup Distyl, which sells smarter operations by embedding AI deep into legacy enterprise workflows where generative tools usually struggle to scale.
Citi builds in-house AI agents into its enterprise OS, Stylus. By designing agentic tools around its own workflows across compliance, risk, and ops, Citi is ushering in the era of institutional AI systems built from the ground up.
Google targets enterprise integration with DORA’s new AI Capabilities Model. Google’s play is about mapping business logic and context straight into AI-native systems.

Nvidia lit up Wall Street with a 7.2% stock surge after locking in a $100 billion multi-year compute deal with OpenAI and an additional $5 billion chip supply pact with Intel.
Distyl AI made noise with a $175 million Series B that catapulted its valuation to $1.8 billion, up 9x from its last round, staking a bold claim in enterprise AI automation.
Fetch.ai and SingularityNET bounced back 4% and 6.8%, respectively, helping push the AI token index up 5.6% as OpenAI’s spending spree reignited interest in decentralized compute.
That’s your daily AI edge for September 23rd, 2025.
See you tomorrow!
Midas AI