Scott J. Hunter

Exploring the intersection of mysticism, technology, consciousness, and art

Can AI Get Smart Enough to Save Itself Before the Lights Go Out?

AI representation of a half-built data center in Fort Meade.

Florida is becoming a battleground for two very different visions of computing's future. Across the state, developers are proposing massive AI data centers on wetlands, farmland, and former industrial land, with utilities such as Duke Energy and Florida Power & Light caught in the middle of the power question. One proposed facility in Fort Meade would require 1.2 gigawatts of electricity, roughly twice what it takes to keep Tallahassee running. Communities are pushing back, some proposals are being withdrawn, and Florida Power & Light is moving to rezone more than 5,700 acres in Indiantown for utility and technology-related development that could include data processing centers. Meanwhile, a few hundred miles away, engineers are stringing together a nearly 100-mile quantum-safe network corridor along existing Florida LambdaRail fiber lines. Same state, same moment, two futures arriving simultaneously.

The Fort Meade project is not an outlier. It is a symptom. Globally, data centers consumed about 415 terawatt hours of electricity in 2024, according to the International Energy Agency, and that number is projected to more than double to about 945 terawatt hours by 2030, slightly more than Japan's total electricity consumption today. AI workloads are growing even faster than the broader data center sector. The environmental costs go beyond electricity. Researchers at UC Riverside and Caltech estimated that training a model at the scale of Llama 3.1 produced air pollution comparable to more than 10,000 round trips by car between Los Angeles and New York. Water is becoming a constraint too: another UC Riverside-led study found that by 2030, data center cooling systems could require 697 million to 1.45 billion gallons of additional peak water capacity per day in the United States if efficiency does not improve. Back in Florida, lawmakers are already trying to prevent data centers from shifting utility costs onto residents, and PolitiFact found experts warning that, without protections, large industrial loads can raise power bills across a utility's service territory. And surveys of the Indiantown site alone documented wood storks, gopher tortoise activity, sandhill cranes, and a bald eagle's nest on land developers were prepared to build on.

Meanwhile, the companies building this infrastructure are discovering they may have gotten ahead of themselves financially. On April 27, Reuters reported on a Wall Street Journal story that OpenAI, the most prominent name in artificial intelligence, had fallen short of internal user and revenue goals. More troubling, the report said CFO Sarah Friar had privately warned colleagues that OpenAI might not be able to pay for future data center contracts if revenue does not grow fast enough. This is a company reportedly committed to spending roughly $600 billion on compute infrastructure by 2030. The math is stark. OpenAI was projected to spend roughly $22 billion against $13 billion in sales, burning about $1.70 for every dollar it made, while cumulative losses before profitability could reach $143 billion. The AI industry built its business model on the assumption that explosive user growth would eventually justify explosive infrastructure spending. That assumption is now being tested in public, in real time, ahead of a possible IPO that could value the company near a trillion dollars. When the market leader has to worry about the electric bill, the whole industry has a problem.
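For readers who like to see the arithmetic laid out, here is a minimal back-of-envelope sketch using only the figures cited above. The inputs are the reported numbers from that paragraph; the tidy framing and rounding are mine, not anything from the company's own disclosures.

```python
# Back-of-envelope check on the reported figures cited above.
# All inputs come from the paragraph; this is illustrative arithmetic, not a financial model.

projected_spend_usd = 22e9        # projected annual spend (~$22 billion)
projected_sales_usd = 13e9        # projected annual sales (~$13 billion)
cumulative_losses_usd = 143e9     # reported cumulative losses before profitability
compute_commitment_usd = 600e9    # reported compute commitment through 2030

# Spend per dollar of revenue: 22 / 13 is roughly 1.69, i.e. "about $1.70"
burn_per_dollar = projected_spend_usd / projected_sales_usd
print(f"Spend per dollar of revenue: ${burn_per_dollar:.2f}")

# Cumulative losses expressed in years of current sales (roughly 11 years)
years_of_sales = cumulative_losses_usd / projected_sales_usd
print(f"Cumulative losses equal about {years_of_sales:.0f} years of current sales")

# The compute commitment measured against one year of current sales (roughly 46x)
multiple_of_sales = compute_commitment_usd / projected_sales_usd
print(f"Compute commitment is about {multiple_of_sales:.0f}x current annual sales")
```

None of these ratios prove anything on their own, but they show why the gap between revenue growth and infrastructure commitments is the number everyone is watching.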

There is a potential way out, but it is being built on a very different timeline. Quantum computing operates on principles that make today's GPU-driven data centers look like gas-guzzling engines in an electric world. Where conventional AI hardware consumes power almost linearly as it scales, quantum systems can perform certain complex computations exponentially faster using a fraction of the energy. D-Wave announced in 2025 that its annealing quantum computer had solved a magnetic materials simulation in minutes, a task the company said would have taken a classical GPU supercomputer nearly one million years and more electricity than the world uses annually. D-Wave also says its annealing quantum systems have held steady at about 12.5 kilowatts of system power across multiple generations. That claim is debated, as quantum claims usually are, but even the terms of the argument point to a different category of technology entirely. IonQ, a publicly traded quantum computing company, recently received a DARPA contract to work on high-speed interconnects capable of linking multiple types of quantum computers together, a critical step toward making the technology scalable. Simultaneously, IonQ and Florida LambdaRail announced the nearly 100-mile Florida corridor, the first phase of a statewide quantum-safe network initiative.

Here is where it gets interesting. The same artificial intelligence that is straining the power grid may eventually be the thing that solves the problem. AI is already accelerating materials science, drug discovery, and chip design in ways that would have taken human researchers decades. The quantum computing problem, at its core, is an engineering challenge involving noise reduction, error correction, and scalability. Those are precisely the kinds of complex optimization problems that increasingly capable AI systems are getting good at. The scenario is not far-fetched: a sufficiently advanced AI, given access to enough data and computing resources, might identify the shortcuts and architectural breakthroughs that human researchers are missing, potentially collapsing the timeline to viable quantum computing. But there is a catch. The AI has to survive long enough, and consume enough resources, to get smart enough to solve the problem of its own resource consumption. It is a little like a rocket that has to burn most of its fuel just to escape gravity, hoping there is enough left to reach orbit. The window may be narrower than anyone in the industry wants to admit.

Nobody knows how this ends. The AI industry is spending money it does not yet have, on infrastructure that is consuming resources the planet can barely spare, racing toward a capability threshold that may or may not arrive before the economics collapse. Quantum computing is real, it is advancing, and it has serious government backing. But it is not arriving this decade in any form that will rescue the current generation of data centers. The most honest thing anyone can say right now is that we are in the middle of the most expensive bet in the history of technology, and the outcome is genuinely uncertain. What is certain is that Florida's wetlands are already under pressure, the electric bill question is already political, and the bald eagle's nest in Indiantown came close to being a casualty of someone's quarterly earnings target. Whether the rocket has enough fuel to reach orbit is a question that will answer itself. We just have to watch.

(...)