The #[mcp] macro enables you to expose any HelixQL query as an MCP (Model Context Protocol) endpoint, making it directly accessible to AI agents and LLM applications.
```
#[mcp]
QUERY QueryName(param1: Type, param2: Type) =>
    result <- traversal_expression
    RETURN result
```
An MCP query must return exactly one object or value; returning more than one is a compile error (see E401).
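For example, a query like the following would violate the single-return rule (this is a hypothetical sketch to illustrate the error; the query name and parameters are placeholders):

```
#[mcp]
QUERY get_user_pair(id1: ID, id2: ID) =>
    a <- N<User>(id1)
    b <- N<User>(id2)
    RETURN a, b
```

Returning both `a` and `b` here should be rejected with E401; returning just one of them compiles.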
Make sure to set `mcp = true` under the instance you are using in your `helix.toml` file.
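As a rough sketch, the relevant part of `helix.toml` might look like this (the instance name `dev` and any other fields are placeholders; the exact layout of your config may differ, and only the `mcp` flag is the point here):

```toml
# helix.toml (fragment, hypothetical instance name)
[instances.dev]
mcp = true  # expose #[mcp] queries from this instance
```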
Once you’ve defined your MCP queries, you can use them with any LLM provider that supports MCP:
```
#[mcp]
QUERY get_user(user_id: ID) =>
    user <- N<User>(user_id)
    RETURN user
```
```python
from helix.client import Client
from helix.mcp import MCPServer
from helix.providers.openai_client import OpenAIProvider
from fastmcp.tools.tool import FunctionTool

# Initialize MCP server
client = Client(local=True)
mcp = MCPServer("helix-mcp", client)

# Add your custom tool to the MCP server
def get_user(connection_id: str, user_id: str):
    """
    Get the name of a user by their ID

    Args:
        connection_id: The connection ID
        user_id: The ID of the user

    Returns:
        The user object
    """
    return client.query(
        "mcp/get_userMcp",
        {"connection_id": connection_id, "data": {"user_id": user_id}}
    )

mcp.add_tool(FunctionTool.from_function(get_user))

# Start MCP server
mcp.run_bg()

# Enable MCP in your LLM provider
llm = OpenAIProvider(
    name="openai-llm",
    instructions="You are a helpful assistant with access to user data.",
    model="gpt-4o",
    history=True
)
llm.enable_mcps("helix-mcp")

# The AI can now call your MCP queries
user_id = "user-123"  # example ID; replace with a real user's ID
response = llm.generate(f"What is the name of user with ID {user_id}?")
print(response)
```