C2i Semiconductor Startup Claims It Can Cut AI Data Center Power Waste by 10%

by Suraj Malik

AI data centers are growing fast. But behind the scenes, a much bigger problem is emerging. Power, not compute, is becoming the real bottleneck.

A new India-based startup, C2i Semiconductors, believes it has a solution. The Bengaluru company is building a “grid-to-GPU” power platform designed to reduce massive energy losses inside AI data centers and improve overall infrastructure economics.

Here is what is happening and why the industry is paying attention.

Why AI Data Centers Are Hitting Power Limits

Artificial intelligence workloads are pushing data centers to consume unprecedented amounts of electricity. Industry projections suggest global data center power demand could nearly triple by 2035.

According to Goldman Sachs, data center electricity usage may rise about 175 percent by 2030 compared with 2023 levels. That increase is roughly equal to adding the demand of another top-10 power-consuming country to the grid.

But the real issue is not just generating electricity. It is delivering power efficiently to GPUs.

Inside modern data centers, electricity passes through multiple conversion stages before reaching AI chips. Each step wastes energy.

Typical losses today:

  • About 15 to 20 percent of power is lost in conversion
  • For every 1 MW a facility draws, 150 to 200 kW can be lost as heat
  • Losses mostly occur before power even reaches GPUs

This inefficiency directly increases operating costs and cooling requirements.
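
As a rough illustration of how those losses compound, here is a minimal sketch in Python. The per-stage efficiencies are illustrative assumptions about a conventional grid-to-GPU chain, not measured or vendor figures:

    # Illustrative model of cascaded power-conversion losses.
    # Stage efficiencies below are assumed values, not real-world data.
    STAGES = {
        "UPS / rectification":        0.97,
        "AC-DC power supply unit":    0.94,
        "Intermediate bus converter": 0.96,
        "On-board VRM (DC-DC)":       0.92,
    }

    feed_kw = 1000.0  # 1 MW drawn from the grid
    power = feed_kw
    for stage, efficiency in STAGES.items():
        power *= efficiency  # each stage passes through only a fraction

    lost = feed_kw - power
    print(f"Delivered to GPUs: {power:.0f} kW")
    print(f"Lost in conversion: {lost:.0f} kW ({lost / feed_kw:.1%})")
    # -> about 195 kW lost per MW, consistent with the 15-20% range above

Multiplying the stages together is the whole story: four individually reasonable efficiencies compound into a roughly 20 percent end-to-end loss.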

What C2i Semiconductors Is Building

C2i, short for Control, Conversion and Intelligence, is trying to redesign how power flows inside AI data centers.

Instead of using separate components for each power conversion stage, the startup is building a unified, plug-and-play system that runs from the data center power bus all the way into the processor package.

In simple terms, C2i wants to treat power delivery as one integrated system rather than many disconnected parts.

Key Idea Behind the Technology

The company is focusing on:

  • Integrating AC-DC and DC-DC conversion
  • Combining control and power management
  • Improving packaging efficiency
  • Supporting higher input voltages (moving toward 800 V and beyond)

By reducing the number of inefficient handoffs between components, C2i believes it can significantly cut energy waste.
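
Two of those levers are easy to see with back-of-the-envelope numbers. The sketch below compares a hypothetical four-stage chain against a consolidated two-stage one, and shows why higher bus voltages cut conduction losses. Every figure in it is an illustrative assumption, not C2i's published design:

    from math import prod

    # Hypothetical stage efficiencies; neither chain reflects C2i's design.
    CONVENTIONAL = [0.97, 0.94, 0.96, 0.92]  # four separate conversion stages
    INTEGRATED = [0.96, 0.95]                # fewer, consolidated stages

    def end_to_end_loss(stages: list[float]) -> float:
        """Fraction of input power lost across a chain of conversion stages."""
        return 1.0 - prod(stages)

    gain = end_to_end_loss(CONVENTIONAL) - end_to_end_loss(INTEGRATED)
    print(f"Conventional loss: {end_to_end_loss(CONVENTIONAL):.1%}")  # ~19.5%
    print(f"Integrated loss:   {end_to_end_loss(INTEGRATED):.1%}")    # ~8.8%
    print(f"Improvement: {100 * gain:.1f} percentage points")

    # Why higher input voltages help: for the same power, current falls as
    # voltage rises, and resistive loss (I^2 * R) falls with its square.
    R_OHMS = 0.001  # illustrative distribution resistance
    for volts in (48, 800):
        amps = 100_000 / volts  # current needed to carry 100 kW
        print(f"{volts} V bus: {amps:.0f} A, conduction loss {amps**2 * R_OHMS:.0f} W")

Fewer conversion handoffs shrink the multiplicative losses, and a higher-voltage bus shrinks the resistive ones. C2i's pitch is to attack both at once.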

How Much Energy Could Be Saved

C2i estimates its architecture could reduce end-to-end power losses by roughly 10 percentage points.

What that means in real numbers:

  • Around 100 kW saved per 1 MW of data center power
  • Lower cooling requirements
  • More usable power for GPUs
  • Better cost per watt of AI compute

At hyperscale levels, even single-digit efficiency gains can translate into massive financial savings.
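
For a sense of the dollar impact, a quick sketch: the $0.08 per kWh electricity price and the 100 MW facility size below are assumptions for illustration, not figures from C2i:

    HOURS_PER_YEAR = 8760
    PRICE_PER_KWH = 0.08  # assumed industrial rate in USD; varies by region

    saved_kw_per_mw = 100  # the claimed saving per MW of facility power
    facility_mw = 100      # hypothetical hyperscale campus

    saved_kwh = saved_kw_per_mw * facility_mw * HOURS_PER_YEAR
    print(f"Annual energy saved: {saved_kwh / 1e6:.0f} GWh")
    print(f"Annual cost saved:   ${saved_kwh * PRICE_PER_KWH / 1e6:.1f}M")
    # -> roughly 88 GWh and about $7M per year for a 100 MW facility,
    #    before counting the reduced cooling load

Across an industry measured in gigawatts, savings of that order compound quickly.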

Peak XV’s Rajan Anandan has noted that reducing energy costs by 10 to 30 percent across the industry could unlock tens of billions of dollars in value.

Funding, Team and Expansion Plans

C2i co-founders Vikram Gakhar, Preetam Tadeparthy, Ram Anant, and Dattatreya Suryanarayana (left to right). Image credits: C2i

Investors are already backing the bet.

  • Series A: $15 million
  • Total funding: $19 million
  • Lead investor: Peak XV Partners
  • Other investors: Yali Deeptech and TDK Ventures

C2i was founded in 2024 by former Texas Instruments power executives:

  • Ram Anant
  • Vikram Gakhar
  • Preetam Tadeparthy
  • Dattatreya Suryanarayana
  • Harsha S. B
  • Muthusubramanian N. V

The company now has about 65 engineers and is based in Bengaluru, with customer-facing operations being set up in the United States and Taiwan.

What Happens Next

The startup’s first two silicon designs are expected back from fabrication between April and June.

After that, early validation with hyperscalers and major data center operators will begin.

This phase will be critical. In the data center world, power infrastructure changes slowly, and new hardware must prove long-term reliability before large-scale adoption.

Why This Matters for the Future of AI

Once AI data centers are built, electricity becomes the dominant ongoing cost. That makes power efficiency one of the most important levers in AI economics.

If C2i’s technology works as claimed, the impact could include:

  • Lower operating costs for AI infrastructure
  • Higher GPU density per megawatt
  • Reduced cooling burden
  • Better overall data center profitability

Perhaps most importantly, improved efficiency could free up additional compute capacity without requiring new grid connections, which are becoming increasingly difficult to secure globally.

The Big Risk

Power delivery is one of the most entrenched parts of the data center stack. Large incumbents dominate the space, and qualification cycles are long and demanding.

Unlike startups that optimize single components, C2i is attempting a full-stack redesign. That increases both the potential upside and the execution risk.

The next six months, especially the first silicon results and customer feedback, will likely determine whether the company’s approach gains real traction.

India’s Growing Role in Semiconductors

C2i’s emergence also highlights a broader shift. India’s chip design ecosystem is maturing, supported by strong engineering talent and government design-linked incentives.

This environment is making it more realistic for startups to build globally competitive semiconductor products directly from India.

Bottom Line

AI’s biggest constraint is quietly shifting from compute to power. C2i Semiconductors is betting that fixing the energy delivery chain inside data centers could unlock massive efficiency gains.

If its grid-to-GPU platform performs as promised, the startup could become an important player in the next phase of AI infrastructure. If not, it will face one of the toughest qualification gauntlets in the hardware industry.

Either way, power is now the battleground for AI’s future.