Reporting From The Future

The AI Gold Rush Is Choking on Power, But Liqid Thinks It Has the Cure

Hammer and Liqid are betting that smarter architecture can outpace AI’s growing appetite for power. But as the world builds ever-denser data centers to feed its algorithms, efficiency may prove to be the illusion that keeps the machine running.

The AI infrastructure boom is creating a strange new contradiction: the smarter our systems become, the more energy and hardware they devour. Into this tension steps Hammer Distribution, which has announced a new partnership with Liqid, the Colorado-based pioneer of “software-defined composable infrastructure.”

On paper, it’s a match made for the AI era. In practice, it exposes the growing divide between the promise of efficient computing and the physical limits of power, cooling, and cost.

The deal, unveiled on October 15, will see Hammer, formerly Exertis Enterprise, distribute Liqid’s Matrix™ software and CXL-based composable memory solutions across the U.K., Ireland, Benelux, and the Nordics.

Liqid’s pitch is seductive: rather than treating servers as rigid boxes of hardware, it allows data centers to “dynamically pool and share GPUs, memory, NVMe storage, and accelerators”, letting enterprises scale on-premises infrastructure as flexibly as a public cloud.
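The concept is easier to grasp with a toy sketch: a shared pool of accelerators that workloads attach to on demand and return afterward, cloud-style, instead of being locked inside one server's chassis. The Python below is purely illustrative; the class and method names are this article's assumptions, not Liqid's actual software or API.

```python
# Toy illustration of composable infrastructure: devices live in a
# shared pool, get attached to workloads on demand, and return to the
# pool when released. Not Liqid Matrix or any real vendor API.

class ComposablePool:
    def __init__(self, gpus):
        self.free = list(gpus)      # devices not attached to any workload
        self.attached = {}          # workload name -> list of devices

    def compose(self, workload, count):
        """Attach `count` free GPUs to a workload."""
        if count > len(self.free):
            raise RuntimeError(f"only {len(self.free)} GPUs free")
        devices = [self.free.pop() for _ in range(count)]
        self.attached[workload] = devices
        return devices

    def release(self, workload):
        """Return a workload's GPUs to the shared pool."""
        self.free.extend(self.attached.pop(workload))

pool = ComposablePool([f"gpu{i}" for i in range(8)])
pool.compose("training-job", 6)   # peak demand grabs most of the pool
pool.release("training-job")      # done: GPUs go back for other tenants
pool.compose("inference", 2)
print(len(pool.free))             # prints 6: the rest remain available
```

The point of the exercise is the `release` step: in a fixed server, those six GPUs would sit idle after the job finished; in a composable pool, they immediately become capacity for someone else.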

“We are thrilled to partner with Liqid and bring their transformative technology to our partners,” said Adam Blackwell, Hammer’s Director of AI, Server, and Advanced Technology. “The demand for solutions that can handle data-heavy workloads like AI is growing exponentially, and Liqid’s composable infrastructure is perfectly positioned to meet this need.”

In other words: as companies drown in the computational demands of generative AI, Liqid offers a lifeboat — or at least, a more efficient engine. The company claims that its composable architecture can reduce total cost of ownership by up to 75%, scale to 30 GPUs per server, and extend DRAM capacity to 100 terabytes — all while supporting 600W GPUs like NVIDIA’s H200 or Intel’s Gaudi 3 at full performance. For CTOs facing spiraling energy bills and GPU shortages, that’s not just innovation; it’s survival.

Still, the hype around “composability” deserves scrutiny. Liqid’s software promises “tokens per watt” and “tokens per dollar” optimization, a clever nod to AI’s economics, but the reality is more complex. AI inference workloads, which now represent the bulk of enterprise demand, are pushing the limits of physical data centers.
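Those metrics are, at bottom, simple ratios, and a back-of-envelope version shows what they measure. Every number in the sketch below is a hypothetical placeholder chosen for illustration, not a vendor figure.

```python
# Back-of-envelope "tokens per watt" and "tokens per dollar" arithmetic.
# All inputs are hypothetical placeholders, not measured or vendor data.

def tokens_per_watt(tokens_per_sec, power_watts):
    """Sustained inference throughput per watt of draw."""
    return tokens_per_sec / power_watts

def tokens_per_dollar(tokens_per_sec, power_watts, price_per_kwh):
    """Tokens generated per dollar of electricity (hardware cost excluded)."""
    dollars_per_sec = (power_watts / 1000) * price_per_kwh / 3600
    return tokens_per_sec / dollars_per_sec

# Hypothetical 600 W accelerator serving 10,000 tokens/s at $0.30/kWh:
print(tokens_per_watt(10_000, 600))            # about 16.7 tokens per watt
print(tokens_per_dollar(10_000, 600, 0.30))    # about 2e8 tokens per dollar
```

Note what the second ratio leaves out: capital cost, cooling, and grid capacity. A metric can be optimized while the system around it still hits a wall, which is exactly the tension the next figures suggest.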

Power grids in the U.K. and Europe are already strained, with data centers expected to account for over 10% of national electricity consumption by 2030, according to the International Energy Agency. Reconfiguring GPUs won’t solve that systemic problem; it merely buys time.

“We are proud to partner with Hammer, whose reach and expertise are unmatched,” said Edgar Masri, Liqid’s CEO. “Together, we will help enterprises across EMEA harness the full potential of GPUs and memory, enabling them to scale inference workloads efficiently, reduce costs, and accelerate innovation.” It’s an optimistic vision, one that assumes enterprises can continue to scale indefinitely, as long as the architecture is smarter.

But that assumption is exactly what deserves challenging. Composable infrastructure, for all its elegance, doesn’t escape the reality of physical bottlenecks. Cooling 30 GPUs per server, even with next-gen efficiency, still consumes staggering amounts of water and power. The AI infrastructure arms race, now a $100 billion global market, risks reproducing the same imbalance that the cloud once created: vast efficiency for a few hyperscalers, and diminishing returns for everyone else.
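The scale of that cooling problem follows from simple arithmetic on the figures already quoted: thirty 600 W GPUs draw 18 kW per server before CPUs, memory, or cooling are counted. The sketch below folds in a power-usage-effectiveness (PUE) multiplier; the PUE value is an assumed illustration, since it varies widely by facility.

```python
# Rough per-server power arithmetic from the figures quoted above.
gpus_per_server = 30
gpu_watts = 600
pue = 1.3   # assumed power-usage-effectiveness; varies by facility

it_load_kw = gpus_per_server * gpu_watts / 1000   # GPU draw alone
facility_kw = it_load_kw * pue                    # add cooling overhead

print(it_load_kw)              # prints 18.0
print(round(facility_kw, 1))   # prints 23.4
```

Roughly 23 kW per server, for the GPUs alone, is an order of magnitude beyond what a conventional air-cooled rack position was designed to dissipate, which is why density of this kind forces liquid cooling and new facility builds rather than retrofits.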

Hammer’s new partnership signals where Europe’s AI infrastructure narrative is heading: inward, toward on-premises sovereignty and efficiency over expansion. It’s a smart counterweight to the dominance of American cloud giants like Amazon Web Services, Google Cloud, and Microsoft Azure. Yet, even as Hammer and Liqid promise a “flexible, scalable, and secure” alternative, the question remains whether such architectures truly democratize access, or simply make high-performance computing a little less unsustainable for those who can already afford it.

Composable infrastructure is, at its heart, a technological compromise, an elegant rearrangement of limits. And while Liqid’s engineering deserves credit, the industry must not mistake optimization for salvation. The AI revolution doesn’t just need faster servers; it needs a reckoning with how much energy intelligence is worth.
