Txtai MCP Server

The Txtai MCP server enables AI models to interact with txtai, an AI-powered search engine that builds vector indexes (also known as embeddings) to perform similarity searches.


Overview

Txtai is widely used for semantic search, retrieval-augmented generation (RAG), and building intelligent applications. This community server exposes txtai-backed memory storage, semantic retrieval, and tag-based organization to AI assistants such as Claude and Cline.

Community Server:

Developed and maintained by rmtech1

Key Features

- Semantic Search: perform semantic search across stored memories
- Persistent Storage: persistent storage with a file-based backend
- Tag-based Memory: tag-based memory organization and retrieval
- Claude and Cline AI Integration: integrates with Claude and Cline AI

Available Tools

Quick Reference

| Tool            | Purpose                                    | Category  |
| --------------- | ------------------------------------------ | --------- |
| store_memory    | Store new memory content                   | Write     |
| retrieve_memory | Retrieve memories based on semantic search | Read      |
| search_by_tag   | Search memories by tags                    | Read      |
| delete_memory   | Delete a specific memory                   | Write     |
| get_stats       | Get database statistics                    | Discovery |
| check_health    | Check database and embedding model health  | Discovery |

Detailed Usage

store_memory

Store new memory content with metadata and tags.

use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "store_memory",
  arguments: {
    content: "Important information to remember",
    tags: ["important"]
  }
});

Returns a confirmation message.
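
The tool description above mentions metadata as well as tags, but the example only shows tags. A hedged sketch of what an additional metadata field might look like follows; the metadata argument name and shape are assumptions, so check the server's tool schema for the exact parameters.

use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "store_memory",
  arguments: {
    content: "Project kickoff is scheduled for Monday",
    tags: ["project", "schedule"],
    // Assumed argument: the exact metadata field name and shape may differ
    metadata: { source: "meeting-notes" }
  }
});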

retrieve_memory

Retrieve memories based on semantic search.

use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "retrieve_memory",
  arguments: {
    query: "what was the important information?",
    n_results": 5
  }
});

Returns a list of matching memories.

search_by_tag

Search memories by tags.

use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "search_by_tag",
  arguments: {
    tags: ["important", "context"]
  }
});

Returns a list of matching memories.

delete_memory

Delete a specific memory by content hash.

use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "delete_memory",
  arguments: {
    content_hash: "hash_value"
  }
});

Returns a confirmation message.
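
The content hash identifies the memory to delete. The sketch below assumes the hash is surfaced in retrieve_memory results (the result field name here is an assumption); a retrieve-then-delete workflow might look like:

// Find the memory to remove (assumes each result exposes a content hash)
const results = await use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "retrieve_memory",
  arguments: { query: "outdated note", n_results: 1 }
});

// Delete it using the hash from the result; the field name may differ
use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "delete_memory",
  arguments: { content_hash: results[0].content_hash }
});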

get_stats

Get database statistics.

use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "get_stats",
  arguments: {}
});

Returns database statistics.

check_health

Check database and embedding model health.

use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "check_health",
  arguments: {}
});

Returns the health status of the server.

Installation

Add the server to your MCP client configuration:

{
  "mcpServers": {
    "txtai-assistant": {
      "command": "path/to/txtai-assistant-mcp/scripts/start.sh",
      "env": {}
    }
  }
}

Custom Connection:

The server can be configured using environment variables in the .env file.
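
As a purely illustrative sketch (the variable names below are not taken from this project; check the repository's .env example or documentation for the actual ones), a .env file uses the usual KEY=value format:

# Hypothetical values for illustration only
PORT=8000
DATA_DIR=./data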

Common Use Cases

1. Store and retrieve memories

Store and retrieve memories based on semantic search:

// Store a memory
use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "store_memory",
  arguments: {
    content: "Important information to remember",
    tags: ["important"]
  }
});

// Retrieve memories
use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "retrieve_memory",
  arguments: {
    query: "what was the important information?",
    n_results": 5
  }
});

2. Search by tags

Search memories by tags:

use_mcp_tool({
  server_name: "txtai-assistant",
  tool_name: "search_by_tag",
  arguments: {
    tags: ["important", "context"]
  }
});
