Ignite: Last Week — March 29, 2026
Somewhere in Louisiana this week, a utility company and a tech giant signed paperwork to build new gas plants, expand transmission lines, and uprate nuclear capacity. Not because anyone was running out of electricity. Because Meta needs the grid to keep up with its data centers, and it agreed to pay the full cost of making that happen.
That is the state of AI infrastructure in 2026. The constraint is not the model. It is the wire in the ground.
This matters beyond Meta. Every startup whose business depends on cheap, abundant inference sits downstream of this problem. When the marginal cost of running AI depends on whether a utility commission approves a new substation, “AI unit economics” are no longer purely a software question. They are partly a zoning question. The companies that reduce compute requirements, shift workloads on-device, or architect around power availability will have advantages that do not yet show up on a pitch deck.