Byte-Sized Intelligence | December 4, 2025

What happens when AI runs out of power?

This week, we examine the grid and cooling constraints behind the AI boom and the three-sided model that shows why growth now feels uneven.

AI in Action

AI’s Hard Limit: The Energy Wall [AI Infrastructure/Scaling]

AI has spent years racing ahead on bigger models and faster chips, but now it is running into something more stubborn than compute: the world’s available power. In several regions, new data center projects are being slowed or paused because local grids cannot safely carry more load. Water use for cooling is climbing in places that rely on evaporative systems, enough to prompt fresh sustainability reviews. If you want a deeper look at why AI systems demand so much power and water, we explored the physical footprint of data centers in an earlier issue, and those pressures have only intensified as generative AI has scaled. Companies are also learning that the crunch sits not in training but in inference, the cost of running billions of daily queries. As AI becomes a quiet layer inside everyday software, the energy required to keep it running begins to look like the real ceiling.

That ceiling is already reshaping the economics of AI. Cloud providers are no longer racing only for chips; they are racing for long-term power agreements, cooling capacity, and locations with predictable environmental conditions. Inference costs may rise with energy prices, pushing companies to rethink usage limits and pricing models. And nations with abundant clean power, such as hydro-rich Canada, geothermal Iceland, or nuclear-heavy France, suddenly gain an advantage that is less about clever models and more about stable electrons. Access to power becomes a strategic moat, tightening the industry around the few firms that can secure long-term energy at scale.

The energy wall raises environmental and legitimacy questions about whether every new model justifies its water and carbon footprint. It also shifts where the next breakthroughs may come from. Progress will not be measured only in parameter counts, but in efficiency, from better cooling systems and lower power chips to advances in storage or clean generation. AI will keep advancing, but its trajectory will be shaped as much by physics and infrastructure as by algorithms. The future of AI may depend less on how smart models become and more on the world’s ability to power them.

Bits of Brilliance

The Power Triangle [AI Infrastructure/Concept]

When people talk about scaling AI, the conversation usually goes straight to GPUs and bigger models. But there is a simpler way to understand why the industry keeps hitting physical limits. Think of the system through a power triangle — a lens on how compute, cooling, and electricity work together inside data centers. It is not an official framework, but because AI workloads are far denser and hotter than traditional cloud computing, the triangle helps explain why scaling AI now stresses infrastructure in ways the industry did not expect. AI can only grow as fast as its tightest bottleneck, and lately, all three sides are feeling it.

The first leg is compute, the GPUs doing the math. AI clusters pack so much hardware into each rack that power density, the energy load per square foot, surges. That heat brings us to the second leg, cooling. Modern AI chips run close to their thermal limits, and data centers can only push them as hard as their cooling systems can pull heat away, whether through heavy water use or more advanced liquid and immersion designs. Thermal ceilings often arrive before electricity does, which means cooling can become the first hard stop. The third leg is power. Even with the right chips and cooling, some utilities cannot deliver the extra megawatts AI-scale workloads now demand, slowing expansion regardless of how many GPUs companies would like to install.
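The bottleneck logic above can be sketched in a few lines of Python. This is a toy illustration of the triangle, not a real capacity model; all figures are invented placeholders.

```python
# Toy sketch of the power-triangle bottleneck: a data center can only
# serve as much AI workload as its tightest constraint allows.
# All capacity numbers below are made up for illustration.

def effective_capacity(compute_mw: float, cooling_mw: float, grid_mw: float):
    """Return the usable IT load (in MW) and which leg limits it."""
    legs = {"compute": compute_mw, "cooling": cooling_mw, "grid power": grid_mw}
    bottleneck = min(legs, key=legs.get)
    return legs[bottleneck], bottleneck

# A site with plenty of GPUs and grid headroom, but thermal limits arrive first.
capacity, limit = effective_capacity(compute_mw=120, cooling_mw=80, grid_mw=100)
print(f"Usable: {capacity} MW, limited by {limit}")
```

The point of the sketch is simply that adding more of the two abundant legs changes nothing; only raising the tightest one moves the ceiling.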

Seeing AI through this triangle clarifies why growth feels uneven around the world. Some regions have cheap, clean power but limited cooling capacity. Others have strong cooling systems but constrained grids. And it points to where the next breakthroughs are likely to come from: lower power chips, better thermal management, and new ways of generating and storing energy. The power triangle is not a rulebook, but it is a sharp lens for understanding why the future of AI will be shaped as much by physics and infrastructure as by clever algorithms.

Curiosity in Clicks

Invent a new word for your feeling [Fun/Experiment]

Think of a feeling you’ve had recently that doesn’t have a clean name. The moment between relief and regret, the quiet dread before opening your inbox, or the odd calm that follows a tough conversation. Consider what word you would come up with yourself, then ask your AI chatbot to create a brand-new word for it. Tell it to make the word sound real, give it a definition, and use it in a sentence. Let me know what new word you’ve invented!

Here is an example: “Create three believable new words that capture this feeling: when you complete something, a mixed feeling of accomplishment and uncertainty about whether you did your best. Give each a definition, explain why you chose it, and use it in a sentence.” You can also ask it to make the words more “poetic” or more “Gen-Z.”

My favourite is Afterglint: the soft shimmer of pride that appears the moment you finish something, flickering with the quiet question of whether it could have shone even brighter.

Byte-Sized Intelligence is a personal newsletter created for educational and informational purposes only. The content reflects the personal views of the author and does not represent the opinions of any employer or affiliated organization. This publication does not offer financial, investment, legal, or professional advice. Any references to tools, technologies, or companies are for illustrative purposes only and do not constitute endorsements. Readers should independently verify any information before acting on it. All AI-generated content or tool usage should be approached critically. Always apply human judgment and discretion when using or interpreting AI outputs.