
Robbyant Open-Sources LingBot-VLA as a “Universal Brain” for Robots

SHANGHAI--(BUSINESS WIRE)--Robbyant, an embodied AI company within Ant Group, today announced the open-source release of LingBot-VLA, a vision-language-action (VLA) model designed to serve as a “universal brain” for real-world robotics, helping to reduce post-training costs and accelerate the path to scalable deployment.

So far, LingBot-VLA has been successfully adapted to robots from leading manufacturers, including Galaxea Dynamics and AgileX Robotics, demonstrating strong cross-morphology transfer capabilities across diverse robot platforms.

The model’s performance was evaluated on the GM-100 benchmark, a comprehensive evaluation suite open-sourced by Shanghai Jiao Tong University that comprises 100 real-world tasks. In tests conducted across three distinct physical robot platforms, LingBot-VLA achieved higher task success rates than the other evaluated models. Notably, when depth information was included, the model’s spatial perception improved significantly, setting a new record for task success rate.

Additionally, LingBot-VLA was evaluated on the RoboTwin 2.0 simulation benchmark, which features 50 challenging tasks under intense environmental randomization, including varying lighting, clutter, and height perturbations. Leveraging its learnable query alignment mechanism to integrate depth cues effectively, the model achieved a higher task success rate in these complex scenarios, demonstrating robust performance in both simulation and real-world deployment.
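The release does not detail how the learnable query alignment mechanism works internally. One common pattern this phrasing suggests is a set of learnable query vectors that cross-attend over both visual and depth tokens, producing a fixed number of fused tokens for the action head. The sketch below illustrates that pattern only; all names, shapes, and weights are illustrative assumptions, not Robbyant’s implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def query_align(rgb_tokens, depth_tokens, queries, W_q, W_k, W_v):
    """Fuse RGB and depth tokens via learnable-query cross-attention.

    rgb_tokens:   (N_rgb, d) visual features
    depth_tokens: (N_dep, d) depth features
    queries:      (M, d)     learnable query embeddings (hypothetical)
    Returns an (M, d) array: one fused token per query.
    """
    # Pool both modalities into a single token sequence.
    tokens = np.concatenate([rgb_tokens, depth_tokens], axis=0)  # (N, d)
    Q = queries @ W_q                                            # (M, d)
    K = tokens @ W_k                                             # (N, d)
    V = tokens @ W_v                                             # (N, d)
    # Scaled dot-product attention: each query softly selects
    # the most relevant RGB/depth tokens.
    scores = Q @ K.T / np.sqrt(Q.shape[-1])                      # (M, N)
    attn = softmax(scores, axis=-1)
    return attn @ V                                              # (M, d)

rng = np.random.default_rng(0)
d = 16
fused = query_align(
    rng.normal(size=(10, d)),   # 10 RGB tokens
    rng.normal(size=(6, d)),    # 6 depth tokens
    rng.normal(size=(4, d)),    # 4 learnable queries
    rng.normal(size=(d, d)), rng.normal(size=(d, d)), rng.normal(size=(d, d)),
)
print(fused.shape)  # (4, 16)
```

Because the queries are learned jointly with the policy, this style of fusion lets the model decide how much depth information to draw on per task, which is consistent with the reported gains when depth input is available.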

To date, the deployment of embodied AI has been hampered by cross-platform generalization challenges stemming from differences in robot morphology, task definitions and operating environments. Developers are often forced to repeatedly collect data, retrain models, and fine-tune parameters for each new deployment, leading to high costs, low reusability, and limited scalability.

To address these challenges, LingBot-VLA was pre-trained on over 20,000 hours of large-scale real-world interaction data, covering nine mainstream dual-arm robot configurations, including AgileX, Galaxea R1Pro, RILite, and AgiBot G1. This enables a single model, or a universal brain, to be deployed across a wide range of robotic morphologies, including single-arm, dual-arm, and humanoid platforms, while maintaining high success rates and robustness despite variations in tasks, environments, or hardware configurations.

Beyond generalization, LingBot-VLA also demonstrates strong data and computational efficiency. With comprehensive optimizations to its underlying codebase, LingBot-VLA achieves a 1.5x to 2.8x improvement in training speed compared with other frameworks such as StarVLA and OpenPI.

Notably, this open-source release includes not only the model weights but also a complete, production-ready codebase, featuring tools for data processing, efficient fine-tuning, and automated evaluation. This toolchain can help shorten training cycles and reduce both compute requirements and the time to commercial deployment, allowing developers to rapidly adapt LingBot-VLA to their own robots and use cases with minimal overhead.

Zhu Xing, CEO of Robbyant, said: “For embodied intelligence to achieve large-scale adoption, we need highly capable and cost-effective foundation models that work reliably on real hardware. With LingBot-VLA, we aim to push the limits of reusable, verifiable, and scalable embodied AI for real-world deployment. Our goal is to accelerate the integration of AI into the physical world so it can serve everyone sooner.”

“LingBot-VLA is Ant Group’s first open-source embodied AI model and marks another milestone in our efforts toward Artificial General Intelligence (AGI),” Zhu added. “Ant Group is committed to advancing AGI through an open and collaborative approach. To this end, we’ve launched InclusionAI, a comprehensive technological ecosystem spanning foundational models, multimodal intelligence, reasoning, novel architectures, and embodied AI. The open-sourcing of LingBot-VLA is a key step in this initiative. We look forward to working with developers worldwide to accelerate the development and large-scale adoption of embodied intelligence and help advance progress toward AGI.”

The announcement was made as part of Robbyant’s “Evolution of Embodied AI Week” initiative. On January 27, Robbyant unveiled LingBot-Depth, a high-precision spatial perception model. When paired with LingBot-Depth, LingBot-VLA can leverage higher-quality depth representations, effectively upgrading the system’s “vision” and enabling robots to “see more clearly and act more intelligently”.

To learn more about LingBot-VLA, please visit:

About Robbyant

Robbyant is an embodied intelligence company within Ant Group, dedicated to advancing embodied intelligence through cutting-edge software and hardware technologies. Robbyant independently develops foundational large models for embodied AI and actively explores next-generation intelligent devices. Its aim is to create robotic companions and caregivers that truly understand and enhance people’s everyday lives, delivering reliable intelligent services across key use cases such as elderly care, medical assistance, and household tasks.

To learn more about Robbyant, please visit: www.robbyant.com

Contacts

Media Inquiries
Vick Li Wei
Ant Group
vick.lw@antgroup.com
