
Usage

The #[mcp] macro enables you to expose any HelixQL query as an MCP (Model Context Protocol) endpoint, making it directly accessible to AI agents and LLM applications.
#[mcp]
QUERY QueryName(param1: Type, param2: Type) =>
    result <- traversal_expression
    RETURN result
You MUST return only a single object/value from the query; otherwise you will get a compile error (see E401).
Make sure to set mcp = true under the instance you are using in your helix.toml file.
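For example, the instance entry in helix.toml might look like the following (a sketch only; the instance name and the other keys depend on your setup):

```toml
[instances.dev]
mcp = true
```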

How it works

When you add the #[mcp] attribute to a query:
  1. HelixDB automatically registers the query as an MCP tool
  2. The query parameters become the tool’s input schema
  3. The query’s return type becomes the tool’s output schema
  4. AI agents can call this tool directly through the MCP server
The MCP macro automatically converts your HelixQL query into a callable MCP tool that can be used with LLM providers such as OpenAI, Anthropic, and Google Gemini.
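Conceptually, steps 2 and 3 amount to translating the query signature into a JSON-Schema-style tool definition. The sketch below illustrates that mapping; it is not HelixDB's actual implementation, and the type mapping and schema layout are assumptions for illustration only:

```python
# Conceptual sketch: how an #[mcp] query signature could map to an MCP
# tool's input schema. (Illustrative only; names/types are assumptions.)
HELIXQL_TO_JSON = {"ID": "string", "String": "string", "F64": "number", "I64": "integer"}

def tool_schema(query_name, params):
    """Build a JSON-Schema-style input schema from HelixQL parameter types."""
    return {
        "name": query_name,
        "inputSchema": {
            "type": "object",
            "properties": {p: {"type": HELIXQL_TO_JSON[t]} for p, t in params},
            "required": [p for p, _ in params],
        },
    }

schema = tool_schema("get_user", [("user_id", "ID")])
print(schema["inputSchema"])
# {'type': 'object', 'properties': {'user_id': {'type': 'string'}}, 'required': ['user_id']}
```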

Using MCP Queries with LLM Providers

Once you’ve defined your MCP queries, you can use them with any LLM provider that supports MCP:
#[mcp]
QUERY get_user(user_id: ID) =>
    user <- N<User>(user_id)
    RETURN user
from helix.client import Client
from helix.mcp import MCPServer
from helix.providers.openai_client import OpenAIProvider
from fastmcp.tools.tool import FunctionTool

# Initialize MCP server
client = Client(local=True)
mcp = MCPServer("helix-mcp", client)

# Add your custom tool to the MCP server
def get_user(connection_id: str, user_id: str):
    """
    Get the name of a user.

    Args:
        connection_id: The MCP connection ID
        user_id: The ID of the user

    Returns:
        The name of the user
    """
    return client.query("get_user", {"connection_id": connection_id, "user_id": user_id})

mcp.add_tool(FunctionTool.from_function(get_user))

# Start MCP server
mcp.run_bg()

# Enable MCP in your LLM provider
llm = OpenAIProvider(
    name="openai-llm",
    instructions="You are a helpful assistant with access to user data.",
    model="gpt-4o",
    history=True
)
llm.enable_mcps("helix-mcp")

# The AI can now call your MCP queries
user_id = "some-user-id"  # replace with a real user ID
response = llm.generate(f"What is the name of the user with ID {user_id}?")
print(response)

Best Practices

Choose query names that clearly describe what the tool does. AI agents rely on function names to understand capabilities.
// Good
#[mcp]
QUERY get_user_purchase_history(user_id: ID) => ...

// Less clear
#[mcp]
QUERY query1(id: ID) => ...
Use clear parameter types and avoid overly complex signatures. AI agents work best with straightforward interfaces.
// Good
#[mcp]
QUERY search_products(name: String, max_price: F64) => ...

// More complex - consider breaking into multiple queries
#[mcp]
QUERY complex_search(filters: [FilterType], options: SearchOptions) => ...
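One way to break a complex search apart is to expose each concern as its own narrow tool, following the same lookup pattern shown earlier. A sketch (the Product node and its fields are assumptions for illustration):

```
// Sketch: instead of one complex_search, expose narrower tools
#[mcp]
QUERY get_product(product_id: ID) =>
    product <- N<Product>(product_id)
    RETURN product
```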
