Summary
Artificial intelligence has a massive hunger for electricity, and we are beginning to run out of places to plug it in. This simple fact—ignored by most people focused on AI's capabilities—is reshaping how and where artificial intelligence will be built in the coming years.
To understand the scale of the problem, imagine this: A single large artificial intelligence data center uses as much electricity as roughly one hundred thousand homes. The biggest data centers being built right now will use twenty times that amount. These are not small numbers.
By the year 2028, artificial intelligence systems alone are projected to consume somewhere between 165 billion and 326 billion kilowatt-hours of electricity annually. That is more electricity than most individual countries use in a year.
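A rough back-of-the-envelope calculation makes these figures easier to picture. The per-home consumption number below is an assumption (roughly the US average), not a figure from the text.

```python
# Sanity-check the scale claims above.
# Assumption (not from the article): an average US home uses
# about 10,700 kWh of electricity per year.
home_kwh_per_year = 10_700

# "As much electricity as roughly one hundred thousand homes":
data_center_kwh = 100_000 * home_kwh_per_year
print(f"Large AI data center: ~{data_center_kwh / 1e9:.1f} billion kWh/year")

# The projected 2028 range for AI systems overall:
low, high = 165e9, 326e9
print(f"2028 projection: {low / 1e9:.0f}-{high / 1e9:.0f} billion kWh/year")
print(f"Equivalent homes: {low / home_kwh_per_year / 1e6:.0f}-"
      f"{high / home_kwh_per_year / 1e6:.0f} million")
```

Even the low end of the projected range works out to the consumption of well over ten million homes under this assumption.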
Today, in Ireland, data centers are already using about one-fifth of all the electricity the country produces.
By 2026, this number is expected to rise to one-third. Ireland is not the only place facing this problem. Virginia in the United States, certain areas of northern Europe, and parts of Asia are all beginning to feel the strain as artificial intelligence facilities demand more and more power.
This energy crisis is forcing engineers and companies to look for solutions in three very different places.
The first solution is looking up.
Since 2024, companies have been seriously planning to put data centers in space. Starcloud, a company backed by Nvidia, actually trained an artificial intelligence model on a satellite in orbit around Earth in December 2025.
This might sound like science fiction, but the basic idea is practical. Solar panels in space get constant sunlight. They do not have clouds blocking them, and they do not have the atmosphere reducing the sunlight's strength. A solar panel in space can create about five times more electricity than the same panel on Earth.
Even better, the computers do not need expensive air-conditioning systems. Waste heat simply radiates into the vacuum of space, where the background temperature is about minus 270 degrees Celsius.
Think of it like opening a window that leads directly into the coldest place possible. Google has announced a similar project called Suncatcher, which plans to launch satellites with solar panels and computing chips networked together. The resulting electricity could cost about one-half cent per kilowatt-hour, roughly one-fifteenth of what electricity costs today in most places.
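The "five times more electricity" claim can be illustrated by comparing annual output of identical panels with different capacity factors. The specific factors below (0.18 on the ground, 0.90 in orbit) are illustrative assumptions, not figures from the article.

```python
# Rough comparison of space vs. ground solar yield for the same panel.
# Capacity factors here are assumed, illustrative values.
panel_kw = 1.0                  # a 1 kW panel
ground_capacity_factor = 0.18   # clouds, night, atmosphere (assumed)
space_capacity_factor = 0.90    # near-constant unfiltered sunlight (assumed)

hours_per_year = 365 * 24
ground_kwh = panel_kw * ground_capacity_factor * hours_per_year
space_kwh = panel_kw * space_capacity_factor * hours_per_year

print(f"Ground: {ground_kwh:.0f} kWh/year")
print(f"Space:  {space_kwh:.0f} kWh/year")
print(f"Ratio:  {space_kwh / ground_kwh:.1f}x")
```

With these assumed factors the ratio comes out to the article's roughly five-to-one figure.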
The second solution is looking down.
Since 2024, China has been operating underwater data centers off the coast near Shanghai.
The facility is powered entirely by offshore wind turbines and cooled using seawater. Instead of relying on energy-intensive air-conditioning systems, which consume 40-60% of a conventional data center's electricity, undersea data centers pass seawater through radiators on the sides of the computer racks.
This cuts cooling's share of electricity use to less than ten percent. The entire facility uses about 30% less electricity than a land-based data center of the same size. There is also plenty of room underwater, and no need to fight with local communities over valuable land or fresh water for cooling.
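Here is a sketch of how those cooling percentages translate into total facility power, modeling cooling as a share of total electricity. The 10 MW IT load and the specific 40% and 8% shares are illustrative assumptions chosen from the ranges above.

```python
# Illustrative cooling-energy arithmetic.
it_load_mw = 10.0   # power for the computers themselves (assumed)

# Conventional facility: cooling takes ~40-60% of total electricity;
# use the low end, 40%, for illustration.
land_total = it_load_mw / (1 - 0.40)

# Seawater-cooled facility: cooling under 10% of total; assume 8%.
sea_total = it_load_mw / (1 - 0.08)

savings = 1 - sea_total / land_total
print(f"Land-based total:  {land_total:.1f} MW")
print(f"Undersea total:    {sea_total:.1f} MW")
print(f"Savings: {savings:.0%}")  # same ballpark as the article's ~30%
```

The exact savings depend on where in the 40-60% range a conventional facility sits, which is why the article's roughly 30% figure is an approximation.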
The third solution is rethinking what artificial intelligence actually needs.
Instead of building bigger and bigger artificial intelligence models, companies are starting to build smaller, specialized models.
A large language model like GPT-4 has hundreds of billions of parameters, meaning hundreds of billions of separate numerical values it adjusts during training. But most real-world tasks do not need all that capacity. If you are sorting customer complaints into categories, answering routine questions, or checking whether information is correct, you do not need a model that understands everything about everything.
Nvidia research argues that smaller models are actually better suited to these narrow tasks. A small model can complete tasks ten to thirty times faster, use ten to thirty times less electricity, and be easier to update and fix.
Imagine having a doctor who is fantastic at diagnosing any disease in the world versus having twenty doctors, each an expert in a specific area. For most problems, the specialized doctors are actually better and faster.
There is also something called neuromorphic computing, which is inspired by how the human brain works. Your brain uses about twenty watts of power. It can learn, remember, and recognize patterns better than any computer.
Neuromorphic chips try to copy how the brain does this, placing memory and computation close together instead of keeping them separated as traditional computers do. One neuromorphic chip, called AI Pro, uses only twenty-four microjoules for certain tasks, roughly one-tenth the energy of competing systems.
Another approach is called model quantization, which means storing a model's numbers with less precision. A typical model stores each number with thirty-two binary digits; a quantized model might use only four.
You lose a small amount of accuracy, but you save enormous amounts of electricity and memory. Nvidia has created a new four-bit format called NVFP4 that does this well enough to cut energy per task by a factor of about fifty while keeping the results almost exactly the same.
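To make quantization concrete, here is a minimal pure-Python sketch of uniform 4-bit integer quantization. This is not NVFP4 itself (which is a 4-bit floating-point format); it only illustrates the precision-versus-memory trade-off described above.

```python
import random

# Pretend model weights: 1024 random values (stand-ins for real parameters).
random.seed(0)
weights = [random.gauss(0.0, 1.0) for _ in range(1024)]

# Map each weight onto one of 15 signed levels (-7..7) via a shared scale,
# so each value fits in 4 bits instead of 32.
scale = max(abs(w) for w in weights) / 7
quantized = [max(-7, min(7, round(w / scale))) for w in weights]

# Dequantize and measure the worst-case error introduced by rounding.
dequantized = [q * scale for q in quantized]
max_err = max(abs(w - d) for w, d in zip(weights, dequantized))

print(f"Max absolute error: {max_err:.4f}")  # bounded by scale / 2
print(f"Memory: {32 // 4}x smaller")         # 32 bits down to 4 bits
```

The rounding error is bounded by half the scale step, which is the "tiny bit of accuracy" the article mentions trading for an eightfold memory reduction.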
These solutions matter because they address a real problem that is approaching quickly.
Companies and governments are realizing that powering artificial intelligence has become a strategic national priority, just like building roads or supplying electricity used to be.
NextEra Energy, a major American energy company, just partnered with Google to build giant data center complexes with their own electricity generation directly attached. Brookfield and Bloom Energy announced a five-billion-dollar partnership to build "artificial intelligence factories" where power generation and computation are designed together from the start.
What makes this shift interesting is that it changes where artificial intelligence development happens.
Putting data centers in space or underwater or distributing them across many small edge locations means that artificial intelligence is becoming less centralized.
Instead of a few giant companies controlling computation in a few giant facilities, computation is spreading out. This creates both opportunities and risks.
Governments can control artificial intelligence within their borders more easily when the computers are physically located there and powered by local electricity. But it also means that every country might want to build its own artificial intelligence systems, creating duplication and making it harder for different systems to work together.
The really important point is that electricity is not an afterthought in artificial intelligence development. It is becoming the main limiting factor. Who builds the most intelligent, capable artificial intelligence systems in the future will not be determined by who has the smartest programmers or the best algorithms.
It will be determined by who can access the most reliable, sustainable, and cheap electricity. This shifts the competition from primarily a software and algorithm race to a race over energy infrastructure.
The path forward likely involves all three approaches. Space-based data centers will probably not be operating at large scale until 2028 or 2030. Underwater data centers are already operating but face environmental concerns.
Smaller, distributed models and edge computing are happening right now. But together, these approaches suggest that artificial intelligence in the future will look very different from artificial intelligence today. It will be distributed rather than concentrated, specialized rather than monolithic, and deeply connected to energy infrastructure in ways that were not true before.
This is not a small change. It is one of the most significant shifts in how computation will be organized in the coming decade.

