
Exporting SQL Data and Ingesting into Helix

This guide walks you through exporting your SQL data, uploading it via the Helix Console to Amazon S3, and ingesting it into a Helix instance, where it is written to a graph database. This process currently supports PostgreSQL.

Steps

  1. SSH into the Instance: First, SSH into the instance that has access to your PostgreSQL database, and confirm that it can reach the database you want to ingest.
    ssh user@your-instance-ip
    
  2. Install the Helix CLI: Install it using the following command:
    curl -sSL "https://install.helix-db.com" | bash
    
  3. Export Data from PostgreSQL: The Helix CLI does not currently include a SQL ingestion command, so export your data with pg_dump so it can be uploaded in the Helix Console:
    pg_dump "postgres://<username>:<password>@<host>:<port>/<dbname>" > helix-export.sql
    
    • Replace <username>, <password>, <host>, <port>, and <dbname> with your PostgreSQL credentials and connection details.
  4. Upload Data to S3: After exporting your data, go to the Helix Console and upload helix-export.sql; the Console stores the file in Amazon S3.
  5. Helix Instance Pulls Data: Once the file has been uploaded, the Helix instance pulls the data and writes it to the graph database.
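Before uploading in step 4, it can help to sanity-check the export on the instance. The sketch below is illustrative, not part of the Helix CLI: the `check_dump` helper is a hypothetical name, and it simply verifies that the file is non-empty and begins with the `--` comment header that pg_dump writes at the top of a plain-format dump:

```shell
#!/usr/bin/env sh
# Minimal sketch: sanity-check a pg_dump export before uploading it
# in the Helix Console. check_dump is an illustrative helper, not a
# Helix command.

check_dump() {
  # Fail if the file is missing or empty.
  [ -s "$1" ] || { echo "empty or missing dump: $1" >&2; return 1; }
  # Plain-format pg_dump output starts with a "--" comment header.
  head -n 1 "$1" | grep -q '^--' || { echo "not a plain-format dump: $1" >&2; return 1; }
  echo "dump looks OK: $1"
}
```

Run it as `check_dump helix-export.sql` on the instance; a non-zero exit status means the export should be regenerated before uploading.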

Future Enhancements

  • Vector Support: Support for ingesting vector data will be added soon, expanding what can be ingested and processed.

This process ensures that your data is securely transferred and ingested into the Helix graph database, ready for analysis and use.