Python SDK helix-py 
@tags: python, sdk, client, queries, pytorch, llm, provider, embedding, chunking, mcp, model context protocol, pydantic, cloud
TL;DR
- Install: uv add helix-py or pip install helix-py
- Connect: db = helix.Client(local=True)
- Query: db.query('add_user', {"name": "John"})
- Embed: OpenAIEmbedder().embed("text")
- Chunk: Chunk.semantic_chunk(text)
- LLM Ready: Built-in support for OpenAI, Gemini, and Anthropic providers with MCP tools
- Vector Operations: Native embedding support with OpenAI, Gemini, and VoyageAI embedders
- Text Processing: Integrated chunking with Chonkie for document processing
- Instance Management: Programmatic control over HelixDB instances (start/stop/deploy)
- Type Safe: Pydantic models for structured data and responses
- Local & Cloud: Works with both local Docker instances and cloud deployments
Installation
uv add helix-py or pip install helix-py
Client
Connect to a running helix instance:
- Default port: 6969
- Change port: pass the port parameter
- Cloud instances: set local=False and pass the api_endpoint parameter (and optionally api_key)
Custom Queries (not recommended for simple queries)
PyTorch-like Query Definition
Given a HelixQL query:
- The Query.query method must return a list of objects
- The query name is case sensitive
Instance Management
Set up and manage helix instances programmatically:
- helixdb-cfg: directory for configuration files
- The instance auto-stops on script exit
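A minimal lifecycle sketch; Instance and its method names are assumptions based on the bullets above:

```python
import helix

# Point the instance at a config directory ("helixdb-cfg" holds the
# schema and query files); the constructor signature is assumed
instance = helix.Instance("helixdb-cfg", 6969)

instance.deploy()  # assumed: compile and deploy the queries in helixdb-cfg

# ... run queries against the instance here ...

instance.stop()  # optional; the instance also auto-stops on script exit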
LLM Providers
Available providers:
- OpenAIProvider
- GeminiProvider
- AnthropicProvider
Environment variables required:
- OPENAI_API_KEY
- GEMINI_API_KEY
- ANTHROPIC_API_KEY
Provider methods:
- enable_mcps(name: str, url: str = ...) -> bool
- generate(messages, response_model: BaseModel | None = None) -> str | BaseModel
Message formats supported:
- Free-form text: string
- Message lists: list of dicts or provider-specific Message models
- Structured outputs: Pydantic model validation
Example usage:
MCP tools setup:
Model notes:
- OpenAI GPT-5 family: supports reasoning
- Anthropic: local streamable MCP not supported, use URL-based MCP
Embedders
Available embedders:
- OpenAIEmbedder
- GeminiEmbedder
- VoyageAIEmbedder
Embedder interface:
- embed(text: str, **kwargs) -> [F64]
- embed_batch(texts: List[str], **kwargs) -> [[F64]]
Usage examples:
Chunking
Uses Chonkie for text processing.

TypeScript SDK helix-ts
@tags: typescript, sdk, client, queries, type-safe, graph, vector, knowledge-graph, search, llm-pipeline
TL;DR
- Install: npm install helix-ts
- Connect: new HelixDB("http://localhost:6969")
- Query: client.query("QueryName", params)
- Type Safe: Full TypeScript support with schema validation
- Graph & Vector: Native support for both graph and vector operations
- Knowledge Graphs: Built for knowledge graph construction
- LLM Pipelines: Ideal for search systems and LLM integrations
- Async/Await and Batch operations supported
Installation
npm install helix-ts
Configuration
Basic connection:
Cloud endpoint:
Error handling:
Quick Start
Advanced Usage
Async/await pattern:
Batch operations:
Rust SDK helix-rs 
@tags: rust, sdk, client, queries, type-safe, graph, vector, knowledge-graph, search, llm-pipeline, async, serde, tokio
TL;DR
- Install: cargo add helix-rs serde tokio
- Connect: HelixDB::new(Some("http://localhost"), Some(6969), None)
- Query: client.query("QueryName", &payload).await?
- Type Safe: Full Rust type safety with serde_json
- Async/Await: Built on tokio for async operations
- Graph & Vector: Native support for both graph and vector operations
- Knowledge Graphs: Ideal for knowledge graph construction
- LLM Pipelines: Perfect for search systems and LLM integrations
Quick Start
Installation
Cargo CLI: cargo add helix-rs serde tokio
Cargo.toml:
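Equivalent Cargo.toml entries (version numbers are placeholders; check crates.io for current releases):

```toml
[dependencies]
helix-rs = "0.1"                                   # placeholder version
serde = { version = "1", features = ["derive"] }
tokio = { version = "1", features = ["full"] }
```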
Configuration
Basic connection:
Custom endpoint:
Advanced Usage
Async patterns:
Batch operations:
Go SDK helix-go 
@tags: go, sdk, client, queries, type-safe, graph, vector, knowledge-graph, search, llm-pipeline, goroutines, context
TL;DR
- Install: go get github.com/HelixDB/helix-go
- Connect: helix.NewClient("http://localhost:6969")
- Query: client.Query("QueryName", helix.WithData(payload)).Scan(&result)
- Type Safe: Full Go type safety with map[string]any
- Goroutines: Built for concurrent operations
- Graph & Vector: Native support for both graph and vector operations
- Knowledge Graphs: Ideal for knowledge graph construction
- LLM Pipelines: Perfect for search systems and LLM integrations