
SambaNova Expands Deployment with SoftBank Corp. to Offer Fast AI Inference Across APAC

PALO ALTO, Calif.--(BUSINESS WIRE)--SambaNova today announced the expansion of its SambaNova Cloud deployment and partnership with SoftBank Corp. in Japan. In collaboration with SoftBank Corp., SambaNova is adding racks equipped with its efficient AI chips to a new AI data center in Japan, providing developers in the region with fast inference services via SambaNova Cloud.

Developers will have immediate access via SambaNova Cloud to Swallow, the leading Japanese open-source model developed by the Institute of Science Tokyo, as well as Meta's Llama and Alibaba's Qwen, expanding SambaNova's footprint in the APAC region. Over the next quarter, as more open-source models are optimized for various Asian languages, they will also be released on SambaNova Cloud for developers in the region.
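For developers, access to these models typically takes the form of a standard chat-completion request. The sketch below assumes SambaNova Cloud exposes an OpenAI-compatible REST endpoint; the base URL, model identifier, and environment-variable name are illustrative assumptions, not confirmed details from this release.

```python
import json
import os

# Assumed OpenAI-compatible endpoint for SambaNova Cloud (illustrative).
API_URL = "https://api.sambanova.ai/v1/chat/completions"

def build_request(model: str, prompt: str) -> dict:
    """Assemble the JSON payload for a chat-completion call."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 256,
    }

# Hypothetical model identifier; check the provider's model list for real names.
payload = build_request("Meta-Llama-3.1-8B-Instruct", "Summarize this press release.")

headers = {
    "Authorization": f"Bearer {os.environ.get('SAMBANOVA_API_KEY', '')}",
    "Content-Type": "application/json",
}

print(json.dumps(payload, indent=2))
# Actually sending the request would use e.g. urllib.request or the
# `requests` library; that step is omitted so the sketch runs offline.
```

Because the interface mirrors the widely used chat-completions format, existing client code can usually be pointed at a new base URL and model name with minimal changes.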

Hironobu Tamba, Vice President and Head of the Data Platform Strategy Division of the Technology Unit at SoftBank Corp., stated: “SoftBank Corp. is actively engaged in various initiatives to enhance our generative AI development capabilities. Hosting SambaNova Cloud, which offers ultra-fast AI inference services, in SoftBank Corp.’s AI data center is part of this commitment. Moving forward, we look forward to strengthening our collaboration with SambaNova to further advance generative AI initiatives.”

Rodrigo Liang, co-founder and CEO of SambaNova Systems, said: “The deal being announced today is an expansion of our current partnership with SoftBank Corp., showcasing SambaNova’s performance advantage for fast inference. This partnership means more developers in APAC can produce discoveries that accelerate and impact AI initiatives in the region. We are pleased to build upon our longstanding partnership with SoftBank Corp. and this new system deployment.”

With its extensive model offerings, ultra-fast inference speed, and an easy-to-use API interface, SambaNova Cloud is designed to be the AI inference service of choice for developers: “We are thrilled to provide developers across APAC with access to low-latency AI inference services like never before, empowering them to build solutions for the era of Agentic AI,” stated Liang.

About SambaNova

Customers turn to SambaNova to quickly deploy state-of-the-art generative AI capabilities within the enterprise. Our purpose-built enterprise-scale AI platform is the technology backbone for the next generation of AI computing.

Headquartered in Palo Alto, California, SambaNova Systems was founded in 2017 by industry luminaries and hardware and software design experts from Sun/Oracle and Stanford University. Investors include SoftBank Vision Fund 2, funds and accounts managed by BlackRock, Intel Capital, GV, Walden International, Temasek, GIC, Redline Capital, Atlantic Bridge Ventures, Celesta, and several others. Visit us at sambanova.ai or contact us at info@sambanova.ai. Follow SambaNova Systems on LinkedIn or X.

Contacts

Press Contacts:
Virginia Jamieson, Head of External Communications, SambaNova
virginia.jamieson@sambanova.ai

Kenichi Hayashi, Field Marketing Director, SambaNova
kenichi.hayashi@sambanova.ai
