Intel Vision 2022: Liqid, Intel, and IntelliProp Deliver Dynamic Memory with New Compute Express Link™ Demonstration

Using CXL-enabled Fabric Adapters and Fabric-attached Memory, the Companies Showcase DRAM Pooling, Sharing, and Composability

DALLAS--Liqid, the world’s leading software company delivering data center composability, today announced continued success in the area of composable memory with another Liqid Matrix software demonstration of DRAM attached via the Compute Express Link™ (CXL™) protocol. Liqid worked with chip technology leader Intel and intelligent memory and storage solutions provider IntelliProp, using CXL to compose DRAM to hosts via software in tandem with other data center elements such as GPUs, FPGAs, and NVMe storage. By making it possible to disaggregate memory from the CPU for the first time, CXL promises the kind of performance and flexibility necessary to address the crushing data requirements of artificial intelligence and machine learning (AI+ML) and other high-value applications increasingly weaving their way into every element of the connected world.

“Composable memory via CXL allows us to disaggregate the last remaining component from the CPU, enabling new levels of efficiency, flexibility, and agility for memory-intensive applications, just as we have been doing for GPU, NVMe storage, and FPGA for years,” said Ben Bolles, Executive Director, Product Management, Liqid. “We are pleased to work with industry leaders like Intel and the groundbreaking innovators at IntelliProp to pioneer new levels of performance and flexibility with these early CXL proofs of concept, as the need to compose DRAM in tandem with other data center elements becomes critically important to the day-to-day operations of industry, government, academia, and a wide range of other data-centric verticals.”

Based on the coming PCIe Gen-5 standard, CXL delivers high-speed CPU-to-memory connections, allowing DRAM, the final hardware element to be disaggregated, to be decoupled from the CPU. CXL is a high-speed interconnect with sufficient performance to support the pooling and sharing of emerging memory solutions based on the DDR5 standard, which significantly increases the density and bandwidth of DRAM while reducing power consumption, thus improving overall data center efficiency.

As a wave of CXL-supported servers becomes commercially available, composability incorporated into any server refresh will enable existing resources to remain in use while DRAM is deployed as a shared, pooled, bare-metal resource alongside the accelerator technologies already available to Liqid Matrix™ composable disaggregated infrastructure (CDI) software. With native support for CXL, Liqid Matrix software can now pool and compose DDR5 memory in tandem with GPU, NVMe, persistent memory, FPGA, and other accelerator devices.

Intel is a pioneer in DDR5 memory and a founding member of the CXL Consortium. Servers based on 4th Gen Intel Xeon Scalable processors, formerly codenamed Sapphire Rapids, feature integrated support for CXL, DDR5, and PCIe Gen-5 for significant performance improvements across a range of workloads, and include new Intel Advanced Matrix Extensions (AMX) that achieve up to an 8x performance increase for AI inference.

“New industry standards such as CXL enable new memory and server architectures that address the evolving requirements of modern computing,” said Jim Pappas, Director, Technology Initiatives, Intel. “We applaud the investment of companies like Liqid and IntelliProp, which provide new proofs of concept that give IT professionals an early look at the potential of compute ecosystems based on CXL technology.”

The IntelliProp Deep Learning Memory Mesh is created using CXL fabric adapters and fabric-attached memory, which allow machine learning training data to be shared across multiple CPU nodes, reducing cost and power while improving the scale and overall performance of deep learning applications.

“With the advent of CXL, advancements in memory and server technologies from industry stalwarts like Intel, and composable software from Liqid, we finally have the ability to use fabric-attached memory to provide large and multi-server accessible memory pools when, where, and at the capacity that is required,” said Hiren Patel, CEO, IntelliProp. “We’re excited to offer this joint demonstration of the power of CXL with Intel and Liqid, and to work with leaders in data center innovation as we finally approach the kind of memory-intensive compute that will support and accelerate advancements in higher-order applications that are being driven by AI+ML.”

To learn more about joint composable memory solutions from Liqid and IntelliProp, schedule an appointment with an authorized Liqid representative or reach out at sales@liqid.com. Follow Liqid on Twitter and LinkedIn to stay up to date with the latest Liqid news and industry insights.

Intel, the Intel logo, and other Intel marks are trademarks of Intel Corporation or its subsidiaries.

About Liqid

Liqid’s composable infrastructure software platform, Liqid Matrix™, unlocks cloud-like speed and flexibility plus higher efficiency from data center infrastructure. Now IT can configure, deploy, and scale physical, bare-metal servers in seconds, then reallocate valuable accelerator and storage resources via software as needs evolve. Dynamically provision previously impossible systems or scale existing investments, and then redeploy resources where needed in real time. Unlock cloud-like data center agility at any scale and experience new levels of resource and operational efficiency with Liqid.


Contacts

Robert Brumfield
Sr. Director, Communications
Liqid
917 224 7769
brumfield.bob@liqid.com