(CTN News) DataStax, a well-funded Apache Cassandra-centric database company, is betting heavily on artificial intelligence and on highly scalable vector search that can provide context to generative AI models in real time.
As of today, Astra DB’s hosted vector search capabilities are generally available following a short public preview.
In recent years, vector databases have emerged as a foundational technology for generative artificial intelligence. “I would not be able to understand why a database company would not make this a priority,” DataStax CPO Ed Anuff told me.
“This is the most exciting development in the history of databases. There is nothing better than a database. Beyond being foundational, databases now also serve as the memory for artificial intelligence, and that has completely changed the reason for getting up in the morning.”
DataStax customers can now use Astra DB’s new vector search capabilities on AWS, Microsoft Azure, and Google Cloud Platform, where the service originally launched.
DataStax Enterprise users who run the service in their own data centers will get access to vector search within the next month.
Given the nature of DataStax’s product, customers who use vector search also tend to be highly active users, according to Anuff.
He informed me that just a few days after the public preview was launched, the company received over 1,000 signups, and DataStax CEO Chet Kapoor said that the company had initiated 50 new enterprise POCs in the last week alone.
“I consider myself to be aggressive and forward-looking,” Kapoor said. “Even so, I was blown away by this.
We provide a database-as-a-service for real-time artificial intelligence, and we now come up in almost every conversation alongside Pinecone and Chroma. This is true not only for investors but also for customers and partners.”
It is not surprising that other database services are also trying to leverage this momentum, given the hype surrounding generative AI and the importance of vector search for augmenting these models with more recent or personalized data, for example.
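The augmentation pattern described above, storing embeddings and retrieving the most similar ones at query time, can be sketched in plain Python. The toy vectors and helper names below are illustrative assumptions, not DataStax’s API; in production, an embedding model produces the vectors and a vector database like Astra DB indexes them at scale:

```python
import math

# Toy "embeddings": in practice these come from an embedding model,
# and a vector database stores and indexes them at massive scale.
documents = {
    "Cassandra is a distributed NoSQL database.": [0.9, 0.1, 0.0],
    "LangChain helps build LLM-based applications.": [0.1, 0.8, 0.3],
    "Vector search finds semantically similar items.": [0.2, 0.3, 0.9],
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def vector_search(query_vec, k=2):
    """Return the k documents whose embeddings are closest to the query vector."""
    ranked = sorted(
        documents.items(),
        key=lambda item: cosine_similarity(query_vec, item[1]),
        reverse=True,
    )
    return [text for text, _ in ranked[:k]]

# A query embedding that lies close to the "vector search" document.
print(vector_search([0.1, 0.2, 0.95], k=1))
```

Real systems replace the brute-force `sorted` call with an approximate nearest-neighbor index, which is where a database’s scalability matters.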
DataStax’s core technology is based on Apache Cassandra, which allows it (and its index) to reach the massive scale many of these use cases demand, and its wide range of certifications gives it a competitive edge.
Astra DB now supports the popular LangChain framework for building LLM-based applications, according to Anuff.
“In order for enterprises to adopt generative AI models, the ability to trust their output will be critical,” said Matt Aslett, VP and research director at Ventana Research.
By adding vector embeddings and vector search to existing data platforms, organizations can augment generic models with enterprise information and data, reducing concerns about accuracy and trust.
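Mechanically, the augmentation Aslett describes is a prompt-construction step: passages retrieved via vector search over enterprise data are stitched into the prompt before it reaches the generic model. A minimal sketch, where the function name and prompt wording are hypothetical rather than any vendor’s API:

```python
def build_augmented_prompt(question, retrieved_passages):
    """Stitch retrieved enterprise passages into the prompt sent to a generic LLM.

    Grounding the model in retrieved data is what reduces the accuracy
    and trust concerns around generic generative models.
    """
    context = "\n".join(f"- {p}" for p in retrieved_passages)
    return (
        "Answer the question using only the context below.\n"
        f"Context:\n{context}\n"
        f"Question: {question}\n"
    )

# The passages would normally come from a vector search over company data.
prompt = build_augmented_prompt(
    "What is our refund window?",
    ["Refunds are accepted within 30 days of purchase."],
)
print(prompt)
```

The model then answers from the supplied context rather than from its (possibly stale or generic) training data.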