SANTA CLARA, Calif.--(BUSINESS WIRE)--Today, d-Matrix, a leader in high-efficiency AI compute and inference, announced a collaboration with Microsoft using its low-code reinforcement learning (RL) platform, Project Bonsai, to enable an AI-trained compiler for d-Matrix’s unique digital in-memory compute (DIMC) products. The user-friendly Project Bonsai platform accelerates time to value, delivering a product-ready, AI-based compiler that reduces development effort while leveraging d-Matrix’s ultra-efficient DIMC technology.
As large transformer models drive expanding demand for AI inference while memory and energy requirements hit threshold limits, d-Matrix is bringing one of the first DIMC-based inference compute platforms to market. d-Matrix transforms the economics of complex transformers and generative AI with a scalable platform built to handle the immense data and power requirements of inference AI, making energy-hungry data centers more efficient. This novel AI compute platform from d-Matrix uses an ingenious combination of intelligent ML tools and integrated software architectures built on chiplets arranged in a Lego-block grid, which enables the integration of multiple programming engines in a common package.
Combining d-Matrix technology with Project Bonsai enables the efficient creation of a compiler for the DIMC platform. Project Bonsai enables rapid prototyping, testing and deployment of trained RL agents in the compiler stack, taking full advantage of low-power AI inference technology from d-Matrix that can deliver up to ten times the power efficiency of older architectures.
“d-Matrix has built the world’s most efficient computing platform for AI inference at scale,” said Sudeep Bhoja, Co-Founder and CTO at d-Matrix. “What made us gravitate towards Project Bonsai is its product-first features and ease of use. Microsoft’s unique offering is built around machine teaching and the Inkling language, which makes RL constructs fully explainable.”
The RL-based compiler is expected to become a key differentiator of d-Matrix’s first-generation DIMC product offering, CORSAIR, on track to ship in late 2023.
“We have been working together developing the RL-based compiler,” said Kingsuk Maitra, Principal Applied AI Engineer on the Project Bonsai team at Microsoft. “We made it a point to have a product mindset from the get-go. Embodiments including the instruction set architecture have been vetted and validated on two d-Matrix test chips, NightHawk and JayHawk, and embedded into the RL training environment. Project Bonsai’s low-code attributes made early development work easy and simplified the integration of statistical control parameters and other real-life chip design constraints, with very promising results so far.”
d-Matrix is building a new way of doing datacenter AI inference at scale using in-memory computing (IMC) techniques with chiplet-level scale-out interconnects. Founded in 2019, d-Matrix has attacked the physics of memory-compute integration using innovative circuit techniques, ML tools, software and algorithms, solving the memory-compute integration problem, the final frontier in AI compute efficiency. Learn more at dmatrix.ai