BreezeML Raises $4M Seed to Automate and Virtualize Cloud AI Deployments

LOS ANGELES -- BreezeML, the holistic management platform for resource-efficient Machine Learning (ML) with cost and performance guarantees, today announced $4M in seed funding, led by BlueRun Venture (BRV) Aster with participation from Embark Ventures, UpHonest Capital, and Hat Trick Capital.

BreezeML helps organizations, large and small, through the entire lifecycle of deploying and managing ML training and inference jobs. Developers simply register jobs (including pre-existing ones) written in any ML framework (e.g., PyTorch, TensorFlow), indicate the cloud they wish to run on (e.g., a private cloud, Azure, AWS EC2, GCP), and specify goals and constraints for cost, inference latency, training throughput, and more. From there, BreezeML’s research-backed platform provisions resources, deploys the jobs, and automatically scales them and manages the underlying clusters to meet those requirements while keeping costs to a minimum. The funding will help the company scale out its products and bolster its contributions to the ML community.
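For illustration, registering an existing PyTorch job with a cost budget and throughput target might look something like the sketch below; the breezeml module, the register_job call, and all of its parameters are hypothetical stand-ins for this description, not BreezeML’s published API.

    # Hypothetical sketch only -- the "breezeml" client, register_job(), and its
    # parameters are assumed names for illustration, not a documented API.
    import breezeml

    job = breezeml.register_job(
        name="recommendation-model-training",
        entrypoint="python train.py --config configs/prod.yaml",  # existing PyTorch job, unchanged
        framework="pytorch",
        cloud="gcp",                       # or "azure", "aws-ec2", a private cluster, ...
        constraints={
            "max_monthly_cost_usd": 5000,            # cost budget
            "min_training_throughput_samples_sec": 2000,
        },
    )
    job.submit()  # from here, the platform provisions resources, deploys, and auto-scales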

On the one hand, BreezeML allows developers to focus on their applications, without worrying about infrastructural minutiae and job management. On the other hand, it allows cloud managers to schedule jobs effectively and improve resource availability, without needing to see user code or job details. Key to this is their zero-code-change strategy, which instills confidence in organizations: “Jobs run as developers wrote them, in the same frameworks they were written in,” said their CTO, Ravi Netravali. “ML jobs go through numerous rounds of fine-tuning to get things just right, and even small tweaks can yield noticeable effects on overall performance. To cope with this, we don’t ask developers to rewrite and re-optimize their jobs, and we certainly don’t make any edits to their core logic. Instead, we really operate under the hood, alongside orchestration frameworks like Kubernetes, and leave the core logic to the application developers.”

The platform centers on the ability to move both inference and training jobs around to adapt to environmental changes. The mix of available resources and jobs to run is always changing, but reacting to those changes requires complex reasoning about the state of each job (to avoid redundant work and ensure correctness). BreezeML’s core technology automates this reasoning and enables rapid reconfiguration of jobs at any time, while ensuring that cost budgets and performance targets are never violated. The effect is powerful: high-priority inference jobs can always be granted the resources they need to keep performance high in the face of fluctuating demands, while training jobs can continually be shuffled to make the best use of the leftover capacity (keeping resource efficiency high).
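The toy sketch below illustrates the idea described above in the simplest possible terms: inference jobs are sized to meet current demand first, and training jobs are then packed into whatever capacity remains. It is an illustration of the concept only, not BreezeML’s actual scheduling algorithm.

    # Toy illustration of the reconfiguration idea -- not BreezeML's algorithm.
    from dataclasses import dataclass

    @dataclass
    class Job:
        name: str
        kind: str          # "inference" or "training"
        demand_gpus: int   # GPUs currently needed (inference) or usable (training)

    def reconfigure(jobs, total_gpus):
        """Return a {job name: allocated GPUs} plan for the current moment."""
        plan = {}
        remaining = total_gpus
        # 1. Inference first: give each job what current demand requires.
        for job in (j for j in jobs if j.kind == "inference"):
            alloc = min(job.demand_gpus, remaining)
            plan[job.name] = alloc
            remaining -= alloc
        # 2. Training next: shuffle jobs onto the leftover capacity.
        for job in (j for j in jobs if j.kind == "training"):
            alloc = min(job.demand_gpus, remaining)
            plan[job.name] = alloc
            remaining -= alloc
        return plan

    # Example: inference demand spikes, training absorbs the leftovers.
    jobs = [
        Job("ranking-inference", "inference", demand_gpus=6),
        Job("llm-finetune", "training", demand_gpus=8),
    ]
    print(reconfigure(jobs, total_gpus=10))  # {'ranking-inference': 6, 'llm-finetune': 4}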

“Building on our years of research conducted at UCLA and Princeton University,” said CEO Harry Xu, “BreezeML started off strongly on its journey towards democratizing AI/ML and making it accessible to the non-FAANG community.”

“As AI is making its way into all sectors of the industry, we are excited to see that BreezeML addresses a huge pain point that every AI user suffers from: infrastructure costs that are unaffordable for many small and medium-sized companies,” commented Jimmy Shi, Venture Partner of BRV Aster, on the firm’s investment in BreezeML. “We also believe that their research-backed technologies are well beyond those available on the market, with great potential to make ML productization truly an easy and breezy experience.”

Contacts

John Thorpe
john@breezeml.ai
