
GigaOm Sonar for Vector Databases Positions Vespa as a Leader and Forward Mover for the Second Consecutive Year

TRONDHEIM, Norway--(BUSINESS WIRE)--Vespa.ai, the creator of a leading platform for building and deploying large-scale, real-time AI applications powered by big data, today announced its recognition as a Leader and Forward Mover in the 2025 GigaOm Sonar for Vector Databases—for the second consecutive year.

The GigaOm report underscores Vespa’s leadership in enabling fast, scalable AI applications. It highlights Vespa’s innovative methods for processing text and structured data, which empower organizations to efficiently search and index vast amounts of data. Vespa offers advanced support for technologies like real-time vector search and binary data processing, delivering unmatched flexibility and cost-efficiency. Vespa Cloud enhances these capabilities by offering pre-built tools and seamless data integration, enabling businesses to unlock deeper insights and provide smarter, faster user experiences.

Andrew Brust, Analyst, GigaOm: “Vespa’s low-latency engine can handle hundreds of thousands of requests per second and is designed for online use cases that involve AI and data. It’s a comprehensive offering in which users define and index data with fields composed of vectors, tensors, unstructured text, and structured data to query across them seamlessly.”

Jon Bratseth, CEO and Founder, Vespa: “We are pleased to be recognized as a leader in this rapidly growing and highly relevant market for the second consecutive year. The GigaOm Sonar report provides valuable insights into the role of vector databases as part of a broader AI solution rather than as standalone technology. This perspective aligns perfectly with our vision for Vespa as a comprehensive platform for building AI applications, seamlessly integrating vector database capabilities and beyond.”

The GigaOm Sonar for Vector Databases can be downloaded here: https://content.vespa.ai/gigaom-report-2025

About Vespa

Vespa.ai is a powerful platform for developing real-time search-based AI applications. Once built, these applications are deployed through Vespa’s large-scale, distributed architecture, which efficiently manages data, inference, and logic for applications handling large datasets and high concurrent query rates. Vespa delivers all the building blocks of an AI application, including a vector database, hybrid search, retrieval-augmented generation (RAG), natural language processing (NLP), machine learning, and support for large language models (LLMs) and vision language models (VLMs). It is available as a managed service and open source.

Contacts

Media Contact
Tim Young
timyoung@vespa.ai
