Getting Started With Helix-MCP

HelixDB has custom MCP endpoints built in, making it easy for AI agents to interface with it. We also provide an MCP server that you can hook right up to any LLM provider you're using. This guide shows how to run the server, enable tools, and connect from LLM providers and Claude Desktop.

Prerequisites

  • A running HelixDB instance (local or remote)
  • API keys as needed: OPENAI_API_KEY, GEMINI_API_KEY, VOYAGEAI_API_KEY, ANTHROPIC_API_KEY

Setting up HelixDB

Before running the MCP server (locally or in Claude Desktop), make sure you have a running Helix instance. See our installation guide for details.

Run the MCP Server

from helix.client import Client
from helix.mcp import MCPServer

client = Client(local=True)
mcp = MCPServer("helix-mcp", client)
mcp.run() # runs on http://localhost:8000/mcp/
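Once the server is running, any MCP client can reach it over streamable HTTP at the URL above. As a rough sanity check, the sketch below builds a standard MCP JSON-RPC tools/list request; note that a real exchange also requires the MCP initialize handshake first, which MCP client libraries handle for you.

```python
import json

# Default endpoint from mcp.run() above.
MCP_URL = "http://localhost:8000/mcp/"

# Standard MCP JSON-RPC request asking the server to list its tools.
# (A real client must first perform the MCP "initialize" handshake;
# MCP client libraries take care of this.)
payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
    "params": {},
}

print(json.dumps(payload, indent=2))
# To send it manually, POST this to MCP_URL with the headers
# Accept: application/json, text/event-stream (e.g. via requests).
```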

ToolConfig is used to enable or disable individual tools:
from helix.mcp import ToolConfig

# Disable search_keyword tool
tool_config = ToolConfig(search_keyword=False)
mcp = MCPServer("helix-mcp", client, tool_config=tool_config)

An embedder is used to embed text into vectors for vector search:
from helix.embedding.openai_client import OpenAIEmbedder
embedder = OpenAIEmbedder()
mcp = MCPServer("helix-mcp", client, embedder=embedder)

MCP Tools

  • Session: init, next, collect, reset
  • Traversal: n_from_type, e_from_type, out_step, out_e_step, in_step, in_e_step
  • Filter: filter_items
  • Search: search_vector (needs embedder), search_vector_text (in-house embedding), search_keyword (BM25)
For more information about the MCP tools, see our MCP Guide.
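To make the session tools concrete, here is a small, purely illustrative Python sketch of the init/next/collect/reset pattern they expose: a cursor is initialized over a traversal, next pages through results, collect drains whatever remains, and reset starts over. The class below is a standalone analogy, not the actual HelixDB implementation.

```python
class SessionCursor:
    """Illustrative analogue of the MCP session tools (not Helix internals)."""

    def __init__(self, items):
        self._items = list(items)
        self._pos = 0  # init: start a fresh traversal

    def next(self, n=2):
        """Return the next batch of up to n items."""
        batch = self._items[self._pos:self._pos + n]
        self._pos += len(batch)
        return batch

    def collect(self):
        """Return all remaining items and exhaust the cursor."""
        rest = self._items[self._pos:]
        self._pos = len(self._items)
        return rest

    def reset(self):
        """Rewind to the beginning of the traversal."""
        self._pos = 0


cursor = SessionCursor(["alice", "bob", "carol", "dave"])
print(cursor.next())     # first batch of results
print(cursor.collect())  # everything that is left
```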

Enable MCP in Your LLM Provider

OpenAI:
from helix.providers.openai_client import OpenAIProvider
llm = OpenAIProvider(name="openai-llm", instructions="You are helpful...", model="gpt-5-nano", history=True)
llm.enable_mcps("helix-mcp") # defaults to http://localhost:8000/mcp/

Gemini:
from helix.providers.gemini_client import GeminiProvider
llm = GeminiProvider(model="gemini-2.5-flash", history=True)
llm.enable_mcps("helix-mcp") # defaults to http://localhost:8000/mcp/

Anthropic:
from helix.providers.anthropic_client import AnthropicProvider
llm = AnthropicProvider(model="claude-3-5-haiku-20241022", history=True)
llm.enable_mcps("helix-mcp", url="https://your-remote-mcp.example.com/mcp/")
Note: Anthropic does not support local MCP servers over streamable-http, so you must point it at a remotely reachable MCP server URL.

Use With Claude Desktop

  1. Get the server script:
curl -O https://raw.githubusercontent.com/HelixDB/helix-py/refs/heads/main/apps/mcp_server.py
  2. Create a uv project and install:
uv init project
cp mcp_server.py project
cd project
uv venv && source .venv/bin/activate
uv add helix-py
  3. Add your embedder to the server script:
from helix.embedding.openai_client import OpenAIEmbedder
embedder = OpenAIEmbedder()
mcp_server = MCPServer("helix-mcp", client, embedder=embedder)
  4. Configure Claude Desktop by adding the following to claude_desktop_config.json (on macOS: ~/Library/Application Support/Claude/claude_desktop_config.json):
{
  "mcpServers": {
    "helix-mcp": {
      "command": "uv",
      "args": [
        "--directory",
        "/absolute/path/to/project",
        "run",
        "mcp_server.py"
      ]
    }
  }
}
Don’t forget to replace /absolute/path/to/project in the snippet with the actual path to your project directory.