
The Hidden Costs of AI: Power, Money, and the Planet

Beyond the magic: understanding the infrastructure and impact behind artificial intelligence

AI often feels like magic. You open a chat window, type a question, and within seconds, a neatly packaged answer appears. It feels effortless, invisible—just words on a screen. But behind the curtain, there’s nothing simple about it.

The truth is, artificial intelligence runs on massive infrastructure, energy, and investment. It isn’t conjured from thin air. Every response from ChatGPT or other LLMs is powered by vast data centers, specialized chips, and an immense supply of electricity. And with each leap in capability—GPT-5, Claude 3, Gemini Ultra—the costs rise, not just financially but environmentally.

In conversations I’ve had with peers, friends, and clients, a few questions keep coming up:

  • How is AI actually powered?

  • What does it cost to run something like ChatGPT?

  • And what impact does this have on the planet?

“AI is neither artificial nor intelligent… There is an enormous environmental footprint … It’s profound materiality.”

Kate Crawford – Author of Atlas of AI

These are good questions. They’re also complicated ones. No single person or company has all the answers. But what we can do is peel back the layers to see AI for what it is—not just a convenient tool, but a technology with a real footprint.

How AI Is Powered

At their core, AI systems like ChatGPT are built from four main ingredients: data, algorithms, compute power, and energy.

1. Data
AI models are trained on massive datasets—text from books, articles, websites, conversations, code, and more. The diversity and quality of this data shape what the model can and cannot do. It’s why AI can generate fluent paragraphs or translate languages, but often struggles with nuance, context, or original insight.

2. Algorithms
The brains of modern AI are large-scale neural networks, particularly transformer architectures. These algorithms recognize patterns across billions of data points and use those patterns to generate new content. When you type a prompt, the model predicts, word by word, the most likely response based on what it has learned.
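
That word-by-word prediction can be sketched with a toy example. The probability table below is invented purely for illustration; a real transformer computes these probabilities from billions of learned parameters and full context, not the last word alone:

```python
# Toy sketch of next-token prediction with greedy decoding.
# The probability table is hypothetical; real models learn these
# distributions during training.
next_token_probs = {
    "The": {"cat": 0.5, "dog": 0.3, "sky": 0.2},
    "cat": {"sat": 0.6, "ran": 0.4},
    "sat": {"down.": 0.7, "still.": 0.3},
}

def generate(prompt: str, max_tokens: int = 10) -> str:
    tokens = prompt.split()
    for _ in range(max_tokens):
        probs = next_token_probs.get(tokens[-1])
        if probs is None:
            break  # no continuation known for this token
        # Greedy decoding: always pick the single most likely next token.
        tokens.append(max(probs, key=probs.get))
    return " ".join(tokens)

print(generate("The"))  # → "The cat sat down."
```

Production systems sample from the distribution rather than always taking the top choice, which is why the same prompt can yield different answers.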

3. Compute Power
This is where things get intense. Training an AI model requires thousands of GPUs (graphics processing units) or TPUs (tensor processing units), running in parallel for weeks or even months. Each chip performs trillions of calculations every second. It’s like building a factory of computation, running nonstop until the model is trained.

4. Energy
All of this requires electricity—not just to power the chips, but also to cool the data centers so they don’t overheat. Data centers are like digital power plants, humming with servers, fans, and cooling systems. The energy footprint is massive, and it continues long after training, because every single user query also requires computation.

So when you ask ChatGPT to draft an email or summarize a report, you’re not just pulling words out of thin air. You’re tapping into a global infrastructure of data, chips, and power.

What AI Costs

The financial cost of AI varies depending on whether you’re training a model or using one.

Training Costs

Training a frontier model is astonishingly expensive. Estimates suggest that GPT-4 cost well over $100 million to train. GPT-5, by some accounts, is even more resource-intensive.

Why so expensive? Because training involves:

  • Renting time on vast fleets of cloud GPUs.

  • Storing and processing massive datasets.

  • Paying world-class researchers and engineers to build and optimize the system.

This is why only a handful of players—OpenAI, Google, xAI, Anthropic, Meta—are building frontier models. The barriers to entry are enormous.

Operating Costs (Inference)

Even after training, running AI isn’t cheap. Every time you ask a question, the system consumes computational resources. Each query might cost fractions of a cent to a few cents. That doesn’t sound like much until you multiply it by hundreds of millions of daily queries.

At scale, inference costs billions per year. Microsoft, Google, and OpenAI are pouring massive sums into maintaining the infrastructure that keeps these models running 24/7.
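
The arithmetic behind that claim is simple to check. The figures below are illustrative assumptions consistent with the ranges above (a cent per query, hundreds of millions of daily queries), not reported numbers:

```python
# Back-of-envelope inference cost estimate.
# Both inputs are assumptions for illustration, within the ranges
# described above ("fractions of a cent to a few cents" per query,
# "hundreds of millions of daily queries").
cost_per_query = 0.01           # dollars per query (assumed)
queries_per_day = 300_000_000   # daily queries (assumed)

daily_cost = cost_per_query * queries_per_day
annual_cost = daily_cost * 365

print(f"Daily:  ${daily_cost:,.0f}")   # Daily:  $3,000,000
print(f"Annual: ${annual_cost:,.0f}")  # Annual: $1,095,000,000
```

Even at a penny per query, the annual bill crosses a billion dollars; nudge either assumption upward and the total climbs fast.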

Business and Individual Costs

For businesses and individuals, the picture looks different.

  • Small scale: Using APIs from OpenAI, Anthropic, or Google typically costs between $0.001 and $0.12 per 1,000 tokens (roughly 750 words). That means a small team experimenting with AI could spend anywhere from a few dollars to a few hundred dollars per month.

  • Larger enterprises: Costs scale quickly. A company deeply integrating AI into customer service, product development, or analytics could easily spend tens of thousands per month.

  • Building your own model: For most companies, this is out of reach. The compute, data, and talent costs are simply too high. Instead, most choose to build on top of existing APIs.
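
The per-token pricing above translates into monthly spend with a quick estimate. The helper below uses the price range quoted in this section; actual rates vary by provider and model, so treat these as illustrative numbers rather than a rate card:

```python
# Rough monthly cost estimator for API usage, based on the
# per-1,000-token price range quoted above ($0.001 to $0.12).
# All inputs are illustrative assumptions.
def monthly_cost(tokens_per_day: int, price_per_1k: float) -> float:
    """Estimated monthly spend in dollars, assuming a 30-day month."""
    return tokens_per_day / 1000 * price_per_1k * 30

# A small team processing ~200,000 tokens/day (~150,000 words):
print(f"${monthly_cost(200_000, 0.001):.2f}")  # $6.00 at the low end
print(f"${monthly_cost(200_000, 0.12):.2f}")   # $720.00 at the high end
```

The same usage volume spans a few dollars to a few hundred dollars per month depending on the model chosen, which is exactly the spread described above.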

The takeaway? AI isn’t “free” or even “cheap” at scale. The business model works because millions of small payments add up, subsidizing the enormous infrastructure behind the scenes.

The Environmental Equation

Beyond money, AI also comes with an environmental price tag.

Energy Use

Training a single large model consumes as much electricity as hundreds of U.S. households might use in a year. Running these models—answering millions of queries every day—adds even more.
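
To make that comparison concrete, here is one back-of-envelope version. The figures are assumptions: roughly 1,300 MWh is a widely cited published estimate for training a GPT-3-scale model, and about 10,600 kWh/year is a ballpark for an average U.S. household; frontier models are estimated to use substantially more:

```python
# Back-of-envelope: training energy vs. household consumption.
# Both figures are assumed ballparks, not measured values.
training_mwh = 1_300             # assumed training energy, GPT-3 scale
household_kwh_per_year = 10_600  # assumed average U.S. household usage

households = training_mwh * 1_000 / household_kwh_per_year
print(f"~{households:.0f} household-years of electricity")  # ~123
```

Scale the training figure up for a frontier model and the comparison quickly reaches hundreds of household-years, before counting inference at all.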

Cooling systems are another hidden cost. Servers generate heat, and data centers often require water cooling or advanced HVAC systems to stay operational. In some cases, data centers draw on local water supplies, raising concerns about sustainability in drought-prone regions.

Comparisons to Other Industries

AI isn’t the only digital technology with a footprint. Cryptocurrency mining, cloud storage, and video streaming also consume massive amounts of energy. But AI’s growth trajectory is sharp—it’s scaling faster than most infrastructure was designed to handle.

A recent study estimated that by 2030, AI data centers could consume as much electricity as some medium-sized countries. That doesn’t make AI inherently bad, but it does highlight the need for innovation in energy efficiency and renewable integration.

Sustainability Efforts

To their credit, many tech giants are working on solutions. Google, Microsoft, and Amazon are investing in renewable energy and experimenting with more efficient chips. But the reality is that we’re still early in the AI era, and the sustainability equation remains unresolved.

For builders and creators, this raises important questions: How much AI is enough? And how do we balance convenience with responsibility?

Roadblocks to Adoption

Even as AI adoption accelerates, several major challenges stand in the way.

1. Compute and Energy Demand
The chips required to train and run models are expensive and in short supply. Demand is outpacing production, driving up costs and limiting access.

2. Cost Barriers
Because frontier models are so expensive, only a handful of companies can build them. This centralizes power and limits diversity in how AI evolves.

3. Data Access and Quality
Training requires massive datasets, and sourcing them raises issues of copyright, bias, and misinformation. The “garbage in, garbage out” problem is real.

4. Ethics and Regulation
Governments are scrambling to regulate AI, with rules emerging around bias, misinformation, and job displacement. Businesses adopting AI must navigate a shifting landscape of requirements and expectations.

5. Integration and Skills Gap
AI is powerful, but it’s not a magic wand. Many organizations lack the expertise to implement it effectively. Without clear workflows and human oversight, results are inconsistent.

6. Trust and Transparency
AI often feels like a “black box.” Users don’t always know how outputs are generated, making it hard to trust in high-stakes areas like healthcare, finance, or law.

Clarity in the Complexity

So, where does this leave us?

AI isn’t free. It isn’t invisible. And it certainly isn’t without consequences. It requires immense infrastructure, costs billions to run, and has a real environmental footprint. At the same time, it offers extraordinary leverage—helping businesses, creators, and entrepreneurs amplify their work in ways that were impossible a few years ago.

As builders, the challenge isn’t to reject AI outright, nor to blindly embrace it. The challenge is to adopt it with awareness. To ask hard questions:

  • How am I using AI?

  • What does it actually cost—financially, environmentally, socially?

  • How can I use AI to build something that provides true value?

Clarity—not hype, not fear—is what we need most.

We don’t yet know the full costs of AI, just as we didn’t fully grasp the long-term costs of the internet or smartphones in their early days. But we can make intentional choices now. We can balance innovation with responsibility. We can remember that behind every prompt and output is an infrastructure that affects not just businesses, but the world we live in.

The hidden costs of AI don’t make it bad. They make it real. And as modern builders, our job is not just to use the tools available—it’s to use them wisely, with clarity, creativity, and intention.