
DataStax Launches New Integration with LangChain, Enables Developers to Easily Build Production-ready Generative AI Applications

Support for Astra DB Vector Database and Apache Cassandra Now Available Out-of-the-Box for Any LangChain User

SANTA CLARA, Calif.--(BUSINESS WIRE)--DataStax, the company that powers generative AI applications with real-time, scalable data, today announced a new integration with LangChain, the most popular orchestration framework for developing applications with large language models (LLMs). The integration makes it easy to add Astra DB – the real-time database for developers building production Gen AI applications – or Apache Cassandra® as a new vector source in the LangChain framework.

As many companies implement retrieval-augmented generation (RAG) – the process of providing context from outside data sources to deliver more accurate LLM query responses – in their generative AI applications, they need a vector store that delivers real-time updates with minimal latency on critical production workloads.

Generative AI applications built with RAG stacks require a vector-enabled database and an orchestration framework like LangChain to provide memory or context to LLMs for accurate and relevant answers. Developers use LangChain, the leading AI-first toolkit, to connect their applications to different data sources.
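For readers unfamiliar with the pattern, the following is a minimal, self-contained sketch of the RAG flow described above: the question and the stored documents are embedded, the closest document is retrieved, and it is injected into the prompt as context. The bag-of-words "embedding" and prompt builder are toy stand-ins for illustration only; they are not LangChain or Astra DB APIs, which in a real application would handle the embedding, storage, and LLM call.

```python
# Toy illustration of the RAG flow: embed, retrieve, inject context into a prompt.
from collections import Counter
from math import sqrt


def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding' (stand-in for a real embedding model)."""
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


documents = [
    "Astra DB is a vector database built on Apache Cassandra.",
    "LangChain is an orchestration framework for LLM applications.",
]
question = "What is Astra DB built on?"

# Retrieval step: find the stored document most similar to the question.
q_vec = embed(question)
context = max(documents, key=lambda d: cosine(q_vec, embed(d)))

# Augmentation step: inject the retrieved context into the prompt for the LLM.
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(prompt)  # In a real RAG app, this prompt would now be sent to an LLM.
```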

The new integration lets developers leverage the power of the Astra DB vector database for their LLM, AI assistant, and real-time generative AI projects through the LangChain plugin architecture for vector stores. Together, Astra DB and LangChain help developers take advantage of framework features like vector similarity search, semantic caching, term-based search, LLM-response caching, and data injection from Astra DB (or Cassandra) into prompt templates.
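As an illustration of the integration described above, the sketch below shows how Astra DB might be used as a LangChain vector store. It assumes the langchain-astradb partner package and langchain-openai are installed, that ASTRA_DB_API_ENDPOINT and ASTRA_DB_APPLICATION_TOKEN hold Astra DB credentials, and that an OpenAI API key is available for the embeddings; package and class names have varied across LangChain releases, and the collection name is hypothetical, so treat this as a sketch rather than the definitive API.

```python
# Sketch: using Astra DB as a LangChain vector store (class/package names may
# differ between LangChain releases; credentials are read from the environment).
import os

from langchain_astradb import AstraDBVectorStore  # assumed partner package
from langchain_openai import OpenAIEmbeddings     # requires OPENAI_API_KEY; any embedding model works

vstore = AstraDBVectorStore(
    embedding=OpenAIEmbeddings(),
    collection_name="docs",  # hypothetical collection name
    api_endpoint=os.environ["ASTRA_DB_API_ENDPOINT"],
    token=os.environ["ASTRA_DB_APPLICATION_TOKEN"],
)

# Store a few documents; LangChain embeds them and writes the vectors to Astra DB.
vstore.add_texts([
    "Astra DB is a vector database built on Apache Cassandra.",
    "LangChain connects LLM applications to external data sources.",
])

# Vector similarity search, one of the framework features mentioned above.
for doc in vstore.similarity_search("What powers Astra DB?", k=2):
    print(doc.page_content)

# The same store can feed retrieved context into prompt templates via a retriever.
retriever = vstore.as_retriever(search_kwargs={"k": 2})
```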

“In a RAG application, the model receives supplementary data or context from various sources — most often a database that can store vectors,” said Harrison Chase, CEO, LangChain. “Building a generative AI app requires a robust, powerful database, and we ensure our users have access to the best options on the market via our simple plugin architecture. With integrations like DataStax's LangChain connector, incorporating Astra DB or Apache Cassandra as a vector store becomes a seamless and intuitive process.”

“Developers at startups and enterprises alike are using LangChain to build generative AI apps, so a deep native integration is a must-have,” said Ed Anuff, CPO, DataStax. “The ability for developers to easily use Astra DB as their vector database of choice, directly from LangChain, streamlines the process of building the personalized AI applications that companies need. In fact, we’re already seeing customers benefit from our joint technologies: healthcare AI company SkyPoint is using Astra DB and LangChain to power its generative AI healthcare model.”

To learn more, join the live webinar on October 26 at 9 a.m. PT, where LangChain founder and CEO Harrison Chase and SkyPoint founder and CEO Tisson Mathew will discuss their experience building production RAG applications.


About DataStax

DataStax is the company that powers generative AI applications with real-time, scalable data, the production-ready vector data tools those applications need, and seamless integration with developers’ stacks of choice. The Astra DB vector database provides developers with elegant APIs, powerful real-time data pipelines, and complete ecosystem integrations to quickly build and deploy production-level AI applications. With DataStax, any enterprise can mobilize real-time data to quickly build smart, high-growth AI applications at unlimited scale, on any cloud. Hundreds of the world’s leading enterprises, including Audi, Bud Financial, Capital One, SkyPoint Cloud, Verizon, VerSe Innovation, and many more, rely on DataStax to deliver real-time AI. Learn more at DataStax.com.

Apache, Apache Cassandra, and Cassandra are either registered trademarks or trademarks of the Apache Software Foundation or its subsidiaries in Canada, the United States, and/or other countries.

Contacts

Regan Schiappa
press@datastax.com

DataStax

