The physics mostly work. The timelines, economics, and infrastructure don’t—at least not yet.
Thursday, February 19, 2026
AI is running out of power. Space won’t be an escape hatch for decades


Welcome to Eye on AI, with AI reporter Sharon Goldman. In this edition: Data centers in space are feasible, but not ready for launch…Accenture links promotions to AI logins…AI pioneer Fei-Fei Li’s startup World Labs raises $1 billion…Nvidia’s deal with Meta signals a new era in computing power.

The AI industry is on a power trip—literally—and it’s getting desperate. Data centers already account for roughly 4% of U.S. electricity use, a share expected to more than double by 2030 as running and training AI models increasingly require gigawatts of power. Analysts project global data-center power demand could rise as much as 165% by the end of the decade, even as new generation and transmission infrastructure lags years behind need. In response, hyperscalers are scrambling—cutting deals to build their own gas plants, exploring small nuclear reactors, and searching for power wherever they can find it.

Against that backdrop, it’s not surprising that some of the industry’s biggest players are starting to look to outer space for a solution. 

In a feature story published this morning, I dig into how—even as tech companies are on track to spend more than $5 trillion globally on Earth-based AI data centers by the end of the decade—Elon Musk is arguing the future of AI computing power lies in space, powered by solar energy. Musk has suggested that the economics and engineering could align within just a few years, even predicting that more AI computing capacity could be in orbit than on Earth within five years.

The idea of orbital data centers itself isn’t new. As far back as 2015, Fortune was already asking the question: What if we put servers in space?

What’s changed is the urgency. Today’s power crunch has pushed the concept back into serious conversation, with startups like Starcloud getting attention and Big Tech leaders like former Google CEO Eric Schmidt, Alphabet CEO Sundar Pichai, and Amazon’s Jeff Bezos all turning their attention to the possibilities of launching data centers into orbit. 

However, while Musk and other bulls argue that space-based AI computing could become cost-effective relatively quickly, many experts say anything approaching meaningful scale remains decades away. Constraints around power generation, heat dissipation, launch logistics, and cost still make it impractical—and for now, the overwhelming share of AI investment continues to flow into terrestrial infrastructure. Small-scale pilots of orbital computing may be feasible in the next few years, they argue, but space remains a poor substitute for Earth-based data centers for the foreseeable future.

It’s not hard to understand the appeal, though: talking with sources for this story, I found that the idea of data centers in space is no longer science fiction—the physics mostly check out. “We know how to launch rockets; we know how to put spacecraft into orbit; and we know how to build solar arrays to generate power,” Jeff Thornburg, a SpaceX veteran who led development of the company’s Raptor engine, told me. “And companies like SpaceX are showing we can mass-produce space vehicles at lower cost.”

The problem is that everything else, from building massive solar arrays to lowering launch costs, moves far more slowly than today’s AI hype cycle. Still, Thornburg said, in the long run the energy pressures driving interest in orbital data centers are unlikely to disappear. “Engineers will find ways to make this work,” he said. “Long term, it’s just a matter of how long is it going to take us.”

With that, here’s more AI news.

Sharon Goldman
sharon.goldman@fortune.com
@sharongoldman

AI IN THE NEWS

Accenture links promotions to AI logins. Accenture is beginning to track senior employees’ use of its internal AI tools—and factoring that data into leadership promotion decisions—highlighting how even AI-heavy consultancies are struggling to get top staff to change how they work. According to internal communications seen by the Financial Times, promotion to leadership roles will now require “regular adoption” of AI tools, with Accenture monitoring individual log-ins for some senior managers as part of this summer’s talent reviews. The move reflects a broader challenge across consulting and accounting firms, where executives say senior partners are far more resistant to AI adoption than junior staff, prompting a “carrot and stick” approach. While Accenture says it has trained more than 550,000 employees in generative AI and is reorganizing around an AI-centric “Reinvention Services” unit, the policy has drawn internal criticism—including claims that some tools are unreliable—and underscores the widening gap between AI ambition and day-to-day enterprise use.

AI pioneer Fei-Fei Li’s startup World Labs raises $1 billion. Bloomberg reported that World Labs, a startup founded by AI pioneer Fei-Fei Li, has raised $1 billion in new funding to pursue “world models,” an approach aimed at helping AI systems reason about and operate within the three-dimensional physical world. The round included a $200 million investment from Autodesk, alongside backing from Andreessen Horowitz, Nvidia, and Advanced Micro Devices, according to the company. World Labs joins a growing cohort of startups focused on world models, including a venture led by Yann LeCun, as investors look beyond large language models toward AI systems better suited for robotics and scientific discovery. The company launched its first product, Marble, which generates 3D environments from text or image prompts, late last year, and says the new capital will accelerate work in those areas. Li is best known for her role in creating ImageNet, a foundational dataset that helped drive modern breakthroughs in computer vision; the startup did not disclose its valuation, though Bloomberg News previously reported it had been in talks around a roughly $5 billion figure.