Peak XV backs Indian startup C2i to solve power bottlenecks for AI data centers

The grid is screaming. We spent the last two years obsessed with the "intelligence" part of artificial intelligence, marveling at bots that can hallucinate a legal brief or generate a cat in a tuxedo. We forgot about the copper. We forgot about the heat. Now, the bill is coming due, and it’s written in megawatts.

Silicon Valley’s current strategy for the AI power crunch is essentially "buy a nuclear reactor and hope for the best." Microsoft is resurrecting Three Mile Island. Amazon is squatting next to a 900-megawatt plant in Pennsylvania. It’s a desperate, high-stakes land grab for electrons. But while the giants are busy LARPing as utility moguls, Peak XV—the venture firm that used to wear the Sequoia India badge—is looking for a different exit ramp. They just led a significant round for C2i, an Indian startup promising to fix the bottleneck before the whole system melts down.

Let’s be clear: this isn’t about "saving the planet." It’s about the fact that an Nvidia H100 chip pulls about as much power as a small family home. When you stack 10,000 of them in a room, you aren't running a data center anymore. You’re running a giant, expensive space heater that occasionally does math.
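For scale, here's the back-of-envelope math, using the H100's published 700-watt rating and a rough 1.2-kilowatt average household draw (illustrative figures, not C2i's or Nvidia's accounting):

```python
# Back-of-envelope: what 10,000 H100-class GPUs pull, in household terms.
GPU_WATTS = 700        # published TDP of an H100 SXM module
HOME_WATTS = 1_200     # rough average continuous draw of a household (assumed)
GPU_COUNT = 10_000

cluster_mw = GPU_COUNT * GPU_WATTS / 1_000_000
homes = GPU_COUNT * GPU_WATTS / HOME_WATTS

print(f"Silicon alone: ~{cluster_mw:.0f} MW")       # ~7 MW before cooling or networking
print(f"Equivalent to ~{homes:,.0f} homes' draw")   # roughly 5,800 homes
```

And that's just the chips. It ignores the networking gear, storage, and cooling plant wrapped around them.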

The industry has hit what engineers call the "power wall." We can’t just keep throwing more electricity at the problem because the physical infrastructure—the literal wires in the ground—can't handle the load. This is where C2i comes in. Based out of the engineering hubs of India, they aren't trying to build a better LLM. They’re working on the plumbing. Specifically, they’re focusing on "Compute-to-Interconnect" efficiency.

The friction here is simple and ugly. Right now, about 40 percent of a data center's energy doesn't go toward thinking. It goes toward moving data between chips and keeping those chips from liquefying. It’s a massive, hidden tax on every prompt you type. C2i claims their architecture can slash that overhead by rethinking how silicon talks to silicon. It’s the kind of deep-tech grunt work that doesn't make for a sexy demo, but it’s the only thing that keeps the $40,000 chips from becoming very shiny paperweights.
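To see what that tax looks like in megawatts, here's an illustrative split for a hypothetical 10 MW facility (the 40 percent figure is the one cited above; everything else is assumed for the sake of the arithmetic):

```python
# Illustrative split of a hypothetical AI facility's power budget.
FACILITY_MW = 10.0          # assumed total facility draw, for illustration only
OVERHEAD_SHARE = 0.40       # share spent moving data and cooling, per the figure above

overhead_mw = FACILITY_MW * OVERHEAD_SHARE
compute_mw = FACILITY_MW - overhead_mw

print(f"Doing math:                 ~{compute_mw:.1f} MW")   # ~6.0 MW
print(f"Moving data, shedding heat: ~{overhead_mw:.1f} MW")  # ~4.0 MW
```

Shave even a quarter off that overhead and you've freed up a full megawatt without asking the utility for anything.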

Peak XV’s bet is a calculated one. They know the geography of this crisis is shifting. In the US, building a new data center takes years of permit battles and grid studies. In India, the grid is already a chaotic, overtaxed mess. If you can make AI work in an environment where power is expensive and reliability is a suggestion rather than a rule, you can make it work anywhere.

But there’s a catch. There’s always a catch. C2i is fighting against the laws of physics and the momentum of an industry that moves fast and breaks things—mostly transformers and circuit breakers. To make their tech a standard, they have to convince the hyperscalers to ditch their current setups for unproven Indian silicon architecture. That’s a hard sell when Google and Meta are already billions of dollars deep into their own proprietary designs.

The price tag for this power crisis isn't just the utility bill. It’s the trade-off. Every megawatt diverted to a cluster training a "smarter" version of GPT-5 is a megawatt not going to the actual, physical economy. We’re reaching a point where the digital world is cannibalizing the physical one. We’re building cathedrals of compute while the air conditioners in the real world start to flicker.

C2i says they can bridge the gap. They talk about "energy-proportional computing" and "silicon photonics" like they’re the holy grail. Maybe they are. Or maybe they’re just another startup trying to put a bandage on a geyser. Peak XV is putting tens of millions of dollars behind the idea that the bottleneck isn’t the code, but the cord.

It’s a nice thought. We’ve spent forty years assuming that compute was an infinite resource, a digital magic trick that would only get cheaper and faster forever. Now we’re finding out that the "cloud" is actually just a lot of very hot metal sitting in a field, praying the fans don’t stop spinning.

If C2i fails, the AI revolution won't end because of a lack of data or a lack of genius. It will end because we couldn't find a big enough plug.

The real question is whether we’re actually trying to solve the power problem, or if we’re just looking for a more efficient way to burn the house down.
