Kinara Edge AI Processor Tackles the Monstrous Compute Demands of Generative AI and Transformer-Based Models

New second-generation Ara-2 processor, built on the same flexible and efficient architecture as Ara-1, delivers substantially higher performance per watt and performance per dollar

The Kinara Ara-2 edge AI processor is targeted at applications running traditional AI models and state-of-the-art AI models with transformer-based architectures. (Photo: Business Wire)

LOS ALTOS, Calif.--(BUSINESS WIRE)--Kinara™, Inc., today launched the Kinara Ara-2 Edge AI processor, powering edge servers and laptops with high-performance, cost-effective, and energy-efficient inference to run applications such as video analytics, Large Language Models (LLMs), and other Generative AI models. The Ara-2 is also well suited to edge applications running traditional AI models and state-of-the-art AI models with transformer-based architectures. With a significantly enhanced feature set and 5 to 8 times the performance of the first-generation Ara-1 processor, the Ara-2 combines real-time responsiveness with high throughput, pairing Kinara’s proven latency-optimized design with well-balanced on-chip memories and high off-chip bandwidth to execute very large models with extremely low latency.

LLMs and Generative AI in general have become incredibly popular, but most of the associated applications run on GPUs in data centers and are burdened with high latency, high cost, and questionable privacy. To overcome these limitations and put the compute literally in the hands of the user, Kinara’s Ara-2 simplifies the transition to the edge with support for the tens of billions of parameters used by these Generative AI models. Furthermore, to facilitate the migration of a wide variety of AI models off expensive GPUs, the compute engines in Ara-2 and the associated software development kit (SDK) are specifically designed to support high-accuracy quantization, a dynamically moderated host runtime, and direct FP32 support.
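As a rough illustration of the kind of pre-quantization step such a migration typically involves, the sketch below applies stock PyTorch post-training dynamic quantization to a toy transformer. It is a generic example under assumed shapes and layer sizes, not Kinara’s quantizer or runtime API.

```python
import torch
import torch.nn as nn

# A toy transformer block standing in for a much larger model.
model = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=256, nhead=8, batch_first=True),
    num_layers=2,
).eval()

# Post-training dynamic quantization: Linear weights are converted to INT8,
# activations remain floating point and are quantized on the fly.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# Sanity-check the quantized model before handing it to a vendor toolchain
# for compilation to the target accelerator.
x = torch.randn(1, 16, 256)  # (batch, sequence, embedding) -- placeholder input
with torch.no_grad():
    print(quantized(x).shape)  # torch.Size([1, 16, 256])
```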

“With Ara-2 added to our family of processors, we can better provide customers with performance and cost options to meet their requirements. For example, Ara-1 is the right solution for smart cameras as well as edge AI appliances with 2-8 video streams, whereas Ara-2 is strongly suited for handling 16-32+ video streams fed into edge servers, as well as laptops and even high-end cameras,” said Ravi Annavajjhala, Kinara’s CEO. “The Ara-2 enables better object detection, recognition, and tracking by using its advanced compute engines to process higher-resolution images more quickly and with significantly higher accuracy. And as an example of its capabilities for processing Generative AI models, Ara-2 can hit 10 seconds per image for Stable Diffusion and tens of tokens/sec for LLaMA-7B.”

In October, Ampere welcomed Kinara into the AI Platform Alliance, whose primary goals are to reduce system complexity, promote better collaboration and openness in AI solutions, and ultimately deliver better total performance and greater power and cost efficiency than GPUs. Ampere’s Chief Evangelist Sean Varley said, “The performance and feature set of Kinara’s Ara-2 is a step in the right direction to help us bring better AI alternatives to the industry than the GPU-based status quo.”

The Ara-2 also offers secure boot, encrypted memory access, and a secure host interface to enable enterprise AI deployments with even greater security. Kinara also supports Ara-2 with a comprehensive SDK that includes a model compiler and compute-unit scheduler, flexible quantization options that include the integrated Kinara quantizer as well as support for pre-quantized PyTorch and TFLite models, a load balancer for multi-chip systems, and a dynamically moderated host runtime.
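The release does not document the SDK’s programming interface, so the following is a purely hypothetical sketch of how a compile-and-deploy flow along those lines might look. The `kinara_sdk` module, every function and argument in it, and the file names are invented for illustration only and do not reflect Kinara’s actual API.

```python
# Hypothetical flow; "kinara_sdk" and all of its calls are invented
# placeholders, not Kinara's real SDK.
import numpy as np
import kinara_sdk as ksdk  # assumption: hypothetical package name

# Compile a pre-quantized TFLite (or PyTorch) model: conceptually, the model
# compiler maps the graph onto the compute engines and the scheduler assigns
# work to each unit.
artifact = ksdk.compile(
    model_path="detector_int8.tflite",  # placeholder file name
    target="ara-2",
    batch_size=1,
)

# The host runtime loads the compiled artifact; on a multi-chip PCIe card,
# a load balancer would spread concurrent streams across devices.
runtime = ksdk.Runtime(devices="all")
session = runtime.load(artifact)

# Placeholder video frame standing in for a real camera stream.
frame = np.zeros((1, 3, 1080, 1920), dtype=np.uint8)
detections = session.run(frame)
```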

Ara-2 is available as a stand-alone device, a USB module, an M.2 module, and a PCIe card featuring multiple Ara-2 processors. Kinara will show a live demo of Ara-2 at CES. Contact Kinara to set up your appointment in our hospitality suite at the Venetian Hotel on January 9-11, 2024.

About Kinara

Kinara provides the world’s most power- and price-efficient Edge AI inference platform supported by comprehensive AI software development tools. Enabling smart applications across retail, medical, Industry 4.0, automotive, and smart cities, Kinara’s AI processors, modules, and software can be found at the heart of the AI industry’s most exciting and influential innovations. Kinara envisions a world of exceptional customer experiences, better manufacturing efficiency, and greater safety for all. Learn more at https://kinara.ai/

All registered trademarks and other trademarks belong to their respective owners.

Contacts

Napier Partnership:
Nesbert Musuwo, Account Manager, Napier B2B
Email Address: Nesbert@Napierb2b.com
