
Upload a file using the terminal

Upload CSV, MySQL, and Parquet files from your machine into your Tiger Cloud service using the terminal

This guide shows you how to upload CSV, MySQL, and Parquet files from a source machine into your Tiger Cloud service using the terminal. Use the tab below that matches your file type. Terminal-based imports are ideal for large files, scripting, and automation, and they avoid the size limits of the Console upload.

To follow the steps on this page:

  • Create a Tiger Cloud service and find your connection details.
  • Prepare your data files and tools on a machine that can reach your service (your laptop, a jump host, or a server):
    • For CSV: psql (or another PostgreSQL client) installed.
    • For MySQL: Access to the MySQL source (for example, mysqldump, MySQL client) and psql for the target.
    • For Parquet: A way to convert Parquet to CSV if you use COPY (for example, Python, parquet-tools), or a loader that supports Parquet; plus psql or timescaledb-parallel-copy for bulk load.
  • CSV: Use PostgreSQL COPY or \copy for high-throughput bulk load. UTF-8 recommended; set delimiter and header as needed.
  • MySQL: Export from MySQL (for example, mysqldump or CSV export), then import into your service with psql or pg_restore (for dumps). For ongoing sync, see Sync from PostgreSQL or migration guides.
  • Parquet: Convert to CSV then use COPY or timescaledb-parallel-copy, or use a Parquet-capable ETL tool that can write to PostgreSQL.
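As a rough sketch of the MySQL-to-CSV path described above: export rows with the mysql client, then convert the tab-separated output to CSV. The mysql invocation is a commented-out template (SOURCE_HOST, USER, my_database, and my_table are placeholders for your source); the conversion step runs on any TSV.

```shell
# Template for the export step (placeholders; commented out):
# mysql --host=SOURCE_HOST --user=USER -p --batch --silent \
#   -e "SELECT * FROM my_table" my_database > my_table.tsv

# Stand-in TSV so the conversion step can be exercised locally:
printf 'time\tsymbol\tprice\n2024-01-01\tAAPL\t185.5\n' > my_table.tsv

# Naive tab-to-comma conversion. Fields that themselves contain commas,
# quotes, or tabs need a real CSV writer instead of tr.
tr '\t' ',' < my_table.tsv > my_table.csv
cat my_table.csv
```

For production exports with quoting-sensitive data, prefer a tool that emits proper CSV (for example, `mysqldump --tab` with field options, or a small script) rather than the `tr` shortcut.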
Tips

Run pg_dump and pg_restore with the same PostgreSQL major version as your target service. For large CSV or Parquet loads, timescaledb-parallel-copy can speed up imports.
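For example, a parallel load with timescaledb-parallel-copy might look like the following. The connection values are placeholders, and flag names can vary between releases, so check `timescaledb-parallel-copy --help` for your version:

```shell
timescaledb-parallel-copy \
  --connection "host=HOST port=PORT user=USER password=PASSWORD sslmode=require" \
  --db-name DATABASE \
  --table my_data \
  --file path/to/file.csv \
  --workers 4 \
  --skip-header \
  --copy-options "CSV"
```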

Choose the tab that matches your source file type and follow the steps.

To load a CSV file into your service from the terminal:

  1. Get your connection string

    Use your Tiger Cloud connection details and set a connection string, for example: postgres://USER:PASSWORD@HOST:PORT/DATABASE?sslmode=require

  2. Create the target table (if needed)

    Define a table that matches your CSV columns and types. Example:

    CREATE TABLE my_data (
        time TIMESTAMPTZ,
        symbol TEXT,
        price DOUBLE PRECISION,
        volume BIGINT
    );
  3. Load the CSV with COPY or \copy
    • From your local machine: Use \copy in psql (client-side; file must be on your machine):
    psql "postgres://USER:PASSWORD@HOST:PORT/DATABASE?sslmode=require" -c "\copy my_data FROM 'path/to/file.csv' WITH (FORMAT csv, HEADER true, DELIMITER ',');"
    • From a server that can reach the DB: Use COPY in SQL (server-side; file path is on the server) or run \copy from a client that has the file. For very large files, consider timescaledb-parallel-copy for parallel loading.
  4. Verify the import

    Query the table in psql or the SQL editor to confirm row counts and sample data.
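Before you load a large file, you can sanity-check the CSV locally. A minimal sketch, assuming a throwaway sample.csv that matches the my_data table from step 2:

```shell
# Build a tiny CSV matching the my_data table, then check its shape.
cat > sample.csv <<'EOF'
time,symbol,price,volume
2024-01-01T00:00:00Z,AAPL,185.50,1200
2024-01-01T00:01:00Z,MSFT,370.25,800
EOF

head -1 sample.csv                      # header row
tail -n +2 sample.csv | wc -l           # number of data rows
awk -F',' 'NF != 4 { print "bad row: " NR }' sample.csv   # column-count check
```

Loading the sample first with the same `\copy` command confirms your table definition and options before you commit to a multi-hour import.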

Your data is now in your Tiger Cloud service. For time-series tables, consider converting them to hypertables for better performance.
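For example, converting the my_data table from step 2 into a hypertable partitioned on the time column might look like this; the connection string is a placeholder, and `migrate_data => true` moves any already-imported rows into chunks:

```shell
psql "postgres://USER:PASSWORD@HOST:PORT/DATABASE?sslmode=require" \
  -c "SELECT create_hypertable('my_data', 'time', migrate_data => true);"
```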

  • Permission denied: Ensure your database user has INSERT on the target tables and USAGE on their schema (plus CREATE on the schema if the import creates tables).
  • Encoding errors: Save CSV files as UTF-8 and set the client encoding to UTF-8 to avoid corrupted characters.
  • Connection timeouts: For large imports, use a stable network; increase timeouts if your client supports it.
  • Foreign key errors: Create and load tables in dependency order, or temporarily defer constraint checks during import.
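As a sketch of deferring constraint checks during an import: this only affects foreign keys that were declared DEFERRABLE, and child_table and child.csv are placeholder names.

```shell
psql "postgres://USER:PASSWORD@HOST:PORT/DATABASE?sslmode=require" <<'SQL'
BEGIN;
SET CONSTRAINTS ALL DEFERRED;   -- only applies to DEFERRABLE constraints
\copy child_table FROM 'child.csv' WITH (FORMAT csv, HEADER true)
COMMIT;
SQL
```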

You can upload CSV, MySQL, and Parquet data into your Tiger Cloud service from the terminal: from CSV with COPY or \copy, from MySQL by exporting to CSV or using a migration tool, and from Parquet by converting to CSV and then loading. For a UI-based upload, see Upload a file using the Console; for continuous sync from S3, see Sync from S3.