Timescale Enhances PostgreSQL for AI with New Open-Source Extensions

06 Jul 2024

Innovations Driving AI Application Performance and Developer Productivity

Two new open-source extensions from Timescale are set to revolutionize PostgreSQL for AI applications, unlocking large-scale, high-performance AI use cases previously achievable only with specialized vector databases like Pinecone. The first extension, pgvectorscale, enables developers to build more scalable AI applications with higher-performance embedding search and cost-efficient storage. Complementing the popular open-source extension pgvector, pgvectorscale introduces two key innovations: a StreamingDiskANN index (adapted from Microsoft research) and Statistical Binary Quantization (developed by Timescale researchers as an improvement on standard Binary Quantization techniques).
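
For a sense of what this looks like in practice, the sketch below enables the extensions, builds a StreamingDiskANN index, and runs a nearest-neighbor query from Python. It is illustrative only: the table, column names, and connection string are hypothetical, and the diskann index access method and operator class are assumed to follow pgvectorscale's documented pattern on top of pgvector's vector type.

    # Minimal sketch: a StreamingDiskANN index with pgvectorscale, driven from Python.
    # Table, columns, and DSN are hypothetical examples, not part of the release.
    import psycopg2

    conn = psycopg2.connect("dbname=app user=postgres")  # hypothetical DSN
    cur = conn.cursor()

    # pgvectorscale builds on pgvector, so both extensions are enabled.
    cur.execute("CREATE EXTENSION IF NOT EXISTS vector")
    cur.execute("CREATE EXTENSION IF NOT EXISTS vectorscale")

    # A simple documents table with a 768-dimensional embedding column
    # (matching the Cohere embeddings used in Timescale's benchmark).
    cur.execute("""
        CREATE TABLE IF NOT EXISTS documents (
            id        bigserial PRIMARY KEY,
            body      text,
            embedding vector(768)
        )
    """)

    # Build the StreamingDiskANN index for cosine-distance search
    # (assumed index access method name: diskann).
    cur.execute("""
        CREATE INDEX IF NOT EXISTS documents_embedding_idx
        ON documents USING diskann (embedding vector_cosine_ops)
    """)
    conn.commit()

    # Approximate nearest-neighbor query: the ten closest documents to a query vector.
    query_embedding = "[" + ",".join(["0"] * 768) + "]"  # placeholder; normally from an embedding model
    cur.execute(
        "SELECT id, body FROM documents ORDER BY embedding <=> %s::vector LIMIT 10",
        (query_embedding,),
    )
    print(cur.fetchall())

Because the index is an ordinary PostgreSQL index, the query itself is plain SQL with pgvector's distance operator; no separate vector database or client library is involved.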

Timescale’s benchmarks reveal that with pgvectorscale, PostgreSQL achieves 28x lower p95 latency and 16x higher query throughput compared to Pinecone for approximate nearest neighbor queries at 99% recall. Unlike pgvector, which is written in C, pgvectorscale is developed in the Rust programming language, offering the PostgreSQL community a new avenue for contributing to vector support.

The second extension, pgai, brings more AI workflows to PostgreSQL, making it easier for developers to build search and retrieval-augmented generation (RAG) applications. The initial release supports creating OpenAI embeddings and obtaining OpenAI chat completions from models like GPT-4o directly within PostgreSQL. This integration allows for classification, summarization, and data enrichment tasks on existing relational data, streamlining the development process from proof of concept to production.
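
A rough sketch of that workflow follows. The function names track pgai's ai.openai_embed and ai.openai_chat_complete pattern, but exact signatures may differ between releases; the documents table, connection string, and API-key setup are assumptions rather than details from the announcement.

    # Illustrative sketch of pgai's in-database OpenAI helpers, called from Python.
    import json
    import psycopg2

    conn = psycopg2.connect("dbname=app user=postgres")  # hypothetical DSN
    cur = conn.cursor()

    # Enable pgai (assumes the OpenAI API key is already configured for the session).
    cur.execute("CREATE EXTENSION IF NOT EXISTS ai CASCADE")

    # Create an OpenAI embedding without leaving the database.
    cur.execute(
        "SELECT ai.openai_embed('text-embedding-3-small', 'PostgreSQL, now with more AI')"
    )
    print(cur.fetchone()[0])

    # Summarize existing relational data with a GPT-4o chat completion.
    cur.execute("""
        SELECT ai.openai_chat_complete(
            'gpt-4o',
            jsonb_build_array(
                jsonb_build_object('role', 'system',
                                   'content', 'Summarize the following text in one sentence.'),
                jsonb_build_object('role', 'user', 'content', body)
            )
        )
        FROM documents
        LIMIT 1
    """)
    print(json.dumps(cur.fetchone()[0], indent=2))
    conn.commit()

Keeping both steps in SQL means the embeddings and completions can be generated, stored, and joined against the rest of the relational data in the same place.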

“Pgvectorscale and pgai are incredibly exciting for building AI applications with PostgreSQL. Having embedding functions directly within the database is a huge bonus,” states Web Begole, CTO of Market Reader, a company using Timescale’s cloud PostgreSQL offering to build an AI-enabled financial information platform. “Previously, updating our saved embeddings was a tedious task, but now, with everything integrated, it promises to be much simpler and more efficient. This will save us a significant amount of time and effort.”

“Pgvectorscale and pgai are great additions to the PostgreSQL AI ecosystem. The introduction of Statistical Binary Quantization promises lightning performance for vector search and will be valuable as we scale our vector workload,” said John McBride, Head of Infrastructure at OpenSauced, a company building an AI-enabled analytics platform for open-source projects using Timescale’s cloud PostgreSQL offering. “Pgai removes the need for developers to re-implement common functionality themselves, and I’m excited for the use cases it enables.”

Challenging the Need for Standalone Specialized Vector Databases

The primary advantage of dedicated vector databases like Pinecone has been their performance, which comes from purpose-built architectures and algorithms for storing and searching large volumes of vector data. However, Timescale’s pgvectorscale challenges this notion by bringing such specialized architectures and algorithms to PostgreSQL as an extension, helping the popular general-purpose database deliver performance comparable to, and often better than, that of specialized vector databases.

According to Timescale’s benchmarks, which involved querying a dataset of 50 million Cohere embeddings (768 dimensions), PostgreSQL outperforms Pinecone’s storage-optimized index (s1) with 28x lower p95 latency and 16x higher query throughput for approximate nearest neighbor queries at 99% recall. Furthermore, PostgreSQL with pgvectorscale achieves 1.4x lower p95 latency and 1.5x higher query throughput than Pinecone’s performance-optimized index (p2) at 90% recall on the same dataset.
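
For context, recall measures how many of the true nearest neighbors the approximate index actually returns; the latency and throughput figures above therefore compare the two systems at equivalent result quality. A minimal sketch of the metric, using hypothetical result sets rather than Timescale's benchmark data:

    # Recall at k: the fraction of the true top-k neighbors (from an exact,
    # sequential-scan search) that the approximate index returns.
    def recall_at_k(approx_ids, exact_ids, k=10):
        """Fraction of the true top-k neighbors found by the approximate search."""
        return len(set(approx_ids[:k]) & set(exact_ids[:k])) / k

    # Hypothetical example: the approximate index misses one of the ten true neighbors.
    approx = [3, 7, 9, 12, 15, 21, 22, 30, 41, 55]
    exact  = [3, 7, 9, 12, 15, 21, 22, 30, 41, 60]
    print(recall_at_k(approx, exact))  # 0.9, i.e. 90% recall at k=10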

The cost benefits are equally compelling.
