Supavec MCP Server: Vector Database for AI Applications
Supavec MCP servers enable AI models to interact with vector databases, providing capabilities for storing, searching, and managing vector embeddings for AI applications.
Overview
The Supavec MCP Server bridges AI models with vector databases, enabling efficient storage, retrieval, and management of high-dimensional vector embeddings. This is crucial for AI applications such as Retrieval Augmented Generation (RAG), semantic search, and recommendation systems, where the AI must understand and act on data by its meaning and context. The sections below cover Key Features, Available Tools, Installation, and Common Use Cases.
Official Server: developed and maintained by the Model Context Protocol project.
Key Features
Vector Storage & Indexing
Efficiently store and index high-dimensional vector embeddings for fast retrieval.
Semantic Search
Perform similarity searches using natural language queries or vector inputs to find relevant data.
Metadata Filtering
Refine search results with filters on the metadata associated with your vectors.
Scalability
Designed to scale to large vector datasets and high query loads.
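Similarity search of this kind is usually backed by a distance metric such as cosine similarity. The snippet below is an illustrative sketch of how a query vector can be ranked against stored vectors; it is not Supavec's actual internals:

```javascript
// Illustrative only: rank stored vectors against a query vector
// using cosine similarity (dot product over the product of norms).
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Return stored vectors ordered by similarity to the query, highest first.
function rank(query, stored) {
  return stored
    .map(({ id, values }) => ({ id, score: cosineSimilarity(query, values) }))
    .sort((x, y) => y.score - x.score);
}
```

Production vector databases avoid this linear scan by using approximate nearest-neighbor indexes, but the ranking idea is the same.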
Available Tools
Quick Reference
| Tool | Purpose | Category |
|---|---|---|
| `supavec_upsert` | Insert or update vectors with metadata | Write |
| `supavec_query` | Perform a similarity search | Read |
| `supavec_delete` | Delete vectors by ID or metadata filter | Write |
| `supavec_describe_index` | Get index schema information | Discovery |
Detailed Usage
supavec_upsert
Insert new vectors or update existing ones in the Supavec index, optionally with associated metadata.
```javascript
use_mcp_tool({
  server_name: "supavec",
  tool_name: "supavec_upsert",
  arguments: {
    vectors: [
      {
        id: "doc1",
        values: [0.1, 0.2, 0.3, ...],
        metadata: { "author": "John Doe", "year": 2023 }
      },
      {
        id: "doc2",
        values: [0.4, 0.5, 0.6, ...],
        metadata: { "category": "AI", "status": "published" }
      }
    ]
  }
});
```
Returns the count of upserted vectors.
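Many embedding workflows L2-normalize vector values before upserting, so that dot-product and cosine scores coincide. A minimal sketch of that preprocessing step, independent of any Supavec API (the `toRecord` helper name is illustrative, not part of the server):

```javascript
// Illustrative preprocessing: L2-normalize embedding values before upsert,
// so that dot-product similarity equals cosine similarity.
function normalize(values) {
  const norm = Math.sqrt(values.reduce((sum, v) => sum + v * v, 0));
  return norm === 0 ? values : values.map((v) => v / norm);
}

// Shape a record the way the upsert call above expects it.
function toRecord(id, values, metadata = {}) {
  return { id, values: normalize(values), metadata };
}
```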
supavec_query
Perform a similarity search against the Supavec index using a query vector, with options for top-k results and metadata filtering.
```javascript
use_mcp_tool({
  server_name: "supavec",
  tool_name: "supavec_query",
  arguments: {
    query_vector: [0.1, 0.1, 0.1, ...],
    top_k: 5,
    filter: { "year": { "$eq": 2023 } }
  }
});
```
Returns a list of matching vectors and their similarity scores.
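The filter syntax above uses MongoDB-style operators such as `$eq` and `$ne`. As a rough sketch of how such a filter might be evaluated against a vector's metadata (an assumption about the semantics, not Supavec's documented behavior):

```javascript
// Illustrative matcher for MongoDB-style filters like
// { "year": { "$eq": 2023 } } or { "status": { "$ne": "draft" } }.
function matchesFilter(metadata, filter) {
  return Object.entries(filter).every(([field, condition]) =>
    Object.entries(condition).every(([op, value]) => {
      switch (op) {
        case "$eq": return metadata[field] === value;
        case "$ne": return metadata[field] !== value;
        default: throw new Error(`unsupported operator: ${op}`);
      }
    })
  );
}
```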
supavec_delete
Delete vectors from the Supavec index by their IDs or by specifying a metadata filter.
```javascript
// Delete a single vector by ID
use_mcp_tool({
  server_name: "supavec",
  tool_name: "supavec_delete",
  arguments: {
    ids: ["doc1"]
  }
});

// Delete multiple vectors by metadata filter
use_mcp_tool({
  server_name: "supavec",
  tool_name: "supavec_delete",
  arguments: {
    filter: { "status": { "$eq": "draft" } }
  }
});
```
Returns the number of deleted vectors.
supavec_describe_index
Retrieve information about the Supavec index, including its dimensions, vector count, and configured metadata fields.
```javascript
use_mcp_tool({
  server_name: "supavec",
  tool_name: "supavec_describe_index",
  arguments: {}
});
```
Returns a JSON object with index details.
Installation
```json
{
  "mcpServers": {
    "supavec": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "mcp/supavec",
        "supavec://host.docker.internal:5000?api_key=YOUR_API_KEY"
      ]
    }
  }
}
```
Docker Networking:
Use `host.docker.internal` instead of `localhost` when connecting from Docker to a Supavec instance on your host machine. Replace `YOUR_API_KEY` with your actual Supavec API key.
Common Use Cases
1. Retrieval Augmented Generation (RAG)
Enhance LLM responses by retrieving relevant context from your vector database:
```javascript
// Query Supavec for relevant documents based on a user's question
use_mcp_tool({
  server_name: "supavec",
  tool_name: "supavec_query",
  arguments: {
    query_vector: [0.7, 0.2, 0.9, ...], // Embedding of the user's question
    top_k: 3,
    filter: { "document_type": { "$eq": "knowledge_base" } }
  }
});
```
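After retrieval, the matched documents are typically stitched into the LLM prompt. A minimal sketch, assuming each match carries its source text under a `metadata.text` field (a hypothetical field name, not something Supavec guarantees):

```javascript
// Illustrative RAG prompt assembly from query results. Assumes each match
// exposes its chunk text under metadata.text (hypothetical field name).
function buildPrompt(question, matches) {
  const context = matches
    .map((m, i) => `[${i + 1}] ${m.metadata.text}`)
    .join("\n");
  return `Answer using only the context below.\n\nContext:\n${context}\n\nQuestion: ${question}`;
}
```

The numbered context entries make it easy for the model to cite which retrieved chunk supports its answer.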
2. Semantic Search
Implement powerful search functionalities that understand the meaning behind queries:
// Search for products semantically similar to a user's input
use_mcp_tool({
server_name: "supavec",
tool_name: "supavec_query",
arguments: {
query_vector: [0.3, 0.8, 0.1, ...], // Embedding of product description
top_k: 10,
filter: { "availability": { "$eq": "in_stock" } }
}
});
3. Recommendation Systems
Suggest items or content based on user preferences or item similarities:
// Recommend articles similar to what the user has read
use_mcp_tool({
server_name: "supavec",
tool_name: "supavec_query",
arguments: {
query_vector: [0.6, 0.4, 0.5, ...], // Embedding of user's read history
top_k: 5,
filter: { "category": { "$ne": "read" } }
}
});
4. Anomaly Detection
Identify unusual data points by comparing their vector embeddings to a baseline:
// Detect anomalous user behavior based on activity embeddings
use_mcp_tool({
server_name: "supavec",
tool_name: "supavec_query",
arguments: {
query_vector: [0.9, 0.1, 0.2, ...], // Embedding of current user activity
top_k: 1,
filter: { "user_status": { "$eq": "active" } }
}
});
Connection String Format
The Supavec MCP server accepts connection URLs in the following format:
- Standard: `supavec://host:port?api_key=YOUR_API_KEY`
- With Namespace: `supavec://host:port/namespace?api_key=YOUR_API_KEY`
- TLS/SSL: `supavecs://host:port?api_key=YOUR_API_KEY`
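These URLs follow standard URI syntax, so they can be parsed with the WHATWG `URL` class. A quick sketch of pulling out the host, namespace, API key, and TLS flag (the `parseConnectionString` helper is illustrative, not part of the server):

```javascript
// Parse a Supavec-style connection URL into its components.
function parseConnectionString(raw) {
  const url = new URL(raw);
  return {
    tls: url.protocol === "supavecs:",        // supavecs:// means TLS/SSL
    host: url.hostname,
    port: url.port,
    namespace: url.pathname.replace(/^\//, "") || null, // optional path segment
    apiKey: url.searchParams.get("api_key"),
  };
}
```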