
Getting Started With Helix-MCP

HelixDB has custom MCP endpoints built in, making it easy for AI agents to interface with it. We also provide an MCP server that you can hook up to any LLM provider you’re using. This guide shows how to run the server, enable tools, and connect from LLM providers, Cursor, Windsurf, Claude Desktop, and Codex.

Prerequisites

  • A running HelixDB instance (local or remote)
  • API keys as needed: OPENAI_API_KEY, GEMINI_API_KEY, VOYAGEAI_API_KEY, ANTHROPIC_API_KEY

Setting up HelixDB

Before running the MCP server, make sure you have a running HelixDB instance. See our installation guide for that.

Run the MCP Server

from helix.client import Client
from helix.mcp import MCPServer

client = Client(local=True)
mcp = MCPServer("helix-mcp", client)
mcp.run() # runs on http://localhost:8000/mcp/
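Once the server is running, MCP clients talk to the endpoint using JSON-RPC 2.0 messages; the provider integrations below handle this for you. As a rough illustration of the wire format only (this is a sketch of the MCP protocol envelope, not a helix-py API), a tools/list request body looks like:

```python
import json

# JSON-RPC 2.0 envelope an MCP client would POST to http://localhost:8000/mcp/
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}
print(json.dumps(request))
```

In practice a client first sends an initialize request and then calls tools with tools/call; you never construct these payloads by hand when using the providers below.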

ToolConfig is used to enable or disable tools:

from helix.mcp import ToolConfig

# Disable search_keyword tool
tool_config = ToolConfig(search_keyword=False)
mcp = MCPServer("helix-mcp", client, tool_config=tool_config)

Embedder is used to embed text into a vector:

from helix.embedding.openai_client import OpenAIEmbedder
embedder = OpenAIEmbedder()
mcp = MCPServer("helix-mcp", client, embedder=embedder)

For more information about the embedder, see our Helix Embedders guide.
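The embedder turns text into vectors so that vector search can rank items by similarity. As a rough, stdlib-only illustration of the idea (not HelixDB’s implementation), cosine similarity ranks vectors like this:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity: dot(a, b) / (|a| * |b|)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings": the query vector is closest to doc1.
query = [1.0, 0.0, 1.0]
doc1 = [0.9, 0.1, 0.8]
doc2 = [0.0, 1.0, 0.0]
print(cosine_similarity(query, doc1) > cosine_similarity(query, doc2))  # True
```

Real embedders produce high-dimensional vectors from text; the ranking principle is the same.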

MCP Tools

  • Session: init, next, collect, reset
  • Traversal: n_from_type, e_from_type, out_step, out_e_step, in_step, in_e_step
  • Filter: filter_items
  • Search: search_vector (needs embedder), search_vector_text (in-house embedding), search_keyword (BM25)

For more information about the MCP tools, see our MCP Guide.
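search_keyword ranks results with BM25. As a rough, self-contained sketch of how BM25 scores query terms against a document (an illustration of the formula, not HelixDB’s implementation):

```python
import math

def bm25_score(query_terms, doc, corpus, k1=1.5, b=0.75):
    """Score one tokenized document against query terms with the BM25 formula."""
    avg_len = sum(len(d) for d in corpus) / len(corpus)
    n = len(corpus)
    score = 0.0
    for term in query_terms:
        df = sum(1 for d in corpus if term in d)        # document frequency
        idf = math.log((n - df + 0.5) / (df + 0.5) + 1)
        tf = doc.count(term)                            # term frequency
        score += idf * (tf * (k1 + 1)) / (tf + k1 * (1 - b + b * len(doc) / avg_len))
    return score

corpus = [
    ["helix", "graph", "database"],
    ["vector", "search", "database"],
    ["mcp", "server", "tools"],
]
# "graph" appears only in the first doc, so it scores highest there.
scores = [bm25_score(["graph"], d, corpus) for d in corpus]
print(scores.index(max(scores)))  # 0
```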

Custom MCP Tools

You can also add your own custom MCP tools to the MCP server:

from fastmcp.tools.tool import FunctionTool
def get_user_name(id: str):
    """
    Get the name of a user

    Args:
        id: The ID of the user

    Returns:
        The name of the user
    """
    return client.query("get_user_name", {"id": id})

mcp.add_tool(FunctionTool.from_function(get_user_name))
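FunctionTool.from_function derives the tool’s name, description, and parameter schema from the function itself, which is why the docstring and type hints above matter. A rough sketch of that idea using only the standard library (this is not fastmcp’s actual implementation):

```python
import inspect

def get_user_name(id: str):
    """Get the name of a user"""
    return {"name": "example"}

def describe_tool(fn):
    """Derive a minimal tool description from a function's signature
    and docstring, roughly how MCP tool wrappers work."""
    sig = inspect.signature(fn)
    params = {
        name: (p.annotation.__name__
               if p.annotation is not inspect.Parameter.empty else "any")
        for name, p in sig.parameters.items()
    }
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "parameters": params,
    }

tool = describe_tool(get_user_name)
print(tool["name"], tool["parameters"])  # get_user_name {'id': 'str'}
```

The better your docstrings and annotations, the better the LLM understands when and how to call your tool.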

Enable MCP in Your LLM Provider

OpenAI:
from helix.providers.openai_client import OpenAIProvider
llm = OpenAIProvider(name="openai-llm", instructions="You are helpful...", model="gpt-5-nano", history=True)
llm.enable_mcps("helix-mcp") # defaults to http://localhost:8000/mcp/

Gemini:
from helix.providers.gemini_client import GeminiProvider
llm = GeminiProvider(model="gemini-2.5-flash", history=True)
llm.enable_mcps("helix-mcp") # defaults to http://localhost:8000/mcp/

Anthropic:
from helix.providers.anthropic_client import AnthropicProvider
llm = AnthropicProvider(model="claude-3-5-haiku-20241022", history=True)
llm.enable_mcps("helix-mcp", url="https://your-remote-mcp.example.com/mcp/")

Note: Anthropic does not support local MCP servers over streamable-http; you must point it at a URL-hosted (remote) MCP server.

For more information about LLM providers, see our Helix LLM Providers guide.

Usage

  1. Download the server script into the directory where you want to run the MCP server:

cd /absolute/path/to/project
curl -O https://raw.githubusercontent.com/HelixDB/helix-py/refs/heads/main/apps/mcp_server.py

  2. Configure your MCP server in the server script:
  • Point the client at where your HelixDB instance is running.
  • Set the embedder to your preferred embedder, or remove it if you don’t want to use one.
  • Set the tool_config to your preferred set of tools.
  3. Configure the MCP server on your platform below.

Notes:
  • For stdio transport, adjust the path to your mcp_server.py in the snippet.
  • For streamable-http transport, start the MCP server before using it.
  • For stdio transport, you may need to run which uv and set the command to the full path of the uv binary in the MCP server configuration.

Cursor

Go to Cursor’s settings, click on MCP & Integrations -> New MCP Server, then enter the following.

For streamable-http transport (adjust the port accordingly):
{
  "mcpServers": {
    "helix-mcp": {
      "url": "http://localhost:8000/mcp/"
    }
  }
}
For stdio transport:
{
  "mcpServers": {
    "helix-mcp": {
      "command": "uv",
      "args": [
        "--directory",
        "/absolute/path/to/project",
        "run",
        "--python", "3.11",
        "--with", "helix-py",
        "mcp_server.py"
      ]
    }
  }
}

Windsurf

Go to Windsurf’s Cascade settings, navigate to MCP Servers, click on Manage MCPs -> View raw config, then enter the following.

For streamable-http transport (adjust the port accordingly):
{
  "mcpServers": {
    "helix-mcp": {
      "serverUrl": "http://localhost:8000/mcp/"
    }
  }
}
For stdio transport:
{
  "mcpServers": {
    "helix-mcp": {
      "command": "uv",
      "args": [
        "--directory",
        "/absolute/path/to/project",
        "run",
        "--python", "3.11",
        "--with", "helix-py",
        "mcp_server.py"
      ]
    }
  }
}

Claude Desktop

Go to Claude Desktop’s settings, click on Developer -> Edit Config, then enter the following into claude_desktop_config.json:
{
  "mcpServers": {
    "helix-mcp": {
      "command": "uv",
      "args": [
        "--directory",
        "/absolute/path/to/project",
        "run",
        "--python", "3.11",
        "--with", "helix-py",
        "mcp_server.py"
      ]
    }
  }
}

Codex

Go to your home directory and open the .codex folder (you may need to enable hidden files), then enter the following into the config.toml file:
[mcp_servers.helix-mcp]
command = "uv"
args = [
    "--directory",
    "/absolute/path/to/project",
    "run",
    "--python", "3.11",
    "--with", "helix-py",
    "mcp_server.py"
]
startup_timeout_ms = 20_000