SnowSQL – Unload Snowflake table to Amazon S3

Using the SnowSQL COPY INTO statement, you can unload a Snowflake table in Parquet or CSV file format directly to an external Amazon S3 bucket location, without using an internal stage, and then use AWS utilities to download the files from the S3 bucket to your local file system.


Prerequisite

Install the Snowflake CLI to run SnowSQL commands.

The URL below takes you to the Snowflake download index page; navigate to the binary for your operating system, then download and install it.


https://sfc-repo.snowflakecomputing.com/snowsql/bootstrap/index.html
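
As a quick sanity check once installed, you can run snowsql -v to print the installed version; the output should look something like the below.


user>snowsql -v
Version: 1.2.5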

Connect to Snowflake database and schema

Syntax


user>snowsql -a accountname -u user -d databasename -s schemaname

For example, the below SnowSQL command connects to account “oea82.us-east-1” as user “nkmr”, using database “employee” and schema “public”.


user>snowsql -a oea82.us-east-1 -u nkmr -d employee -s public

Password:
* SnowSQL * v1.2.5
Type SQL statements or !help
nkmr#[email protected]>
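
Once connected, you can optionally confirm the session context from the SnowSQL prompt using standard Snowflake context functions, for example:


nkmr#[email protected]>select current_database(), current_schema();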

Unload Snowflake table to S3 bucket

Using the SnowSQL COPY INTO statement, you can unload a Snowflake table directly to an external Amazon S3 bucket location. In the example below, we export from the table EMP.

To connect to AWS, you need to provide the AWS key, secret key, and token; use the credentials property to define them: credentials = (aws_key_id='xxxx' aws_secret_key='xxxxx' aws_token='xxxxxx').


COPY INTO 's3://mybucket/unload/'
  from EMP
  credentials = (aws_key_id='xxxx' aws_secret_key='xxxxx' aws_token='xxxxxx');
1 Row(s) produced. Time Elapsed: 1.300s

As you can see above, we have 5 records in the EMP table, and all records have been unloaded to the specified S3 bucket with the file name data_0_0_0.csv.gz.

By default, COPY INTO unloads the data to a CSV file and compresses it with gzip.
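
For example, assuming your AWS CLI is configured with the same credentials, you can list the unloaded files in the hypothetical bucket used above:


user>aws s3 ls s3://mybucket/unload/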

Unload without column names in the header

Use the optional HEADER parameter to specify whether to include the table column headings in the output files; set it to FALSE if you do not want column names as a header row in the output file. For example:


COPY INTO 's3://mybucket/unload/'
  from EMP HEADER = FALSE;
  
1 Row(s) produced. Time Elapsed: 1.300s

Unload with different compression

Use COMPRESSION to change the default compression. By default, it unloads the file in GZIP format; however, you can change it to one of the following compressions: AUTO | GZIP | BZ2 | BROTLI | ZSTD | DEFLATE | RAW_DEFLATE | NONE.


COPY INTO 's3://mybucket/unload/'
  from EMP file_format = (type=CSV COMPRESSION = BZ2);
  
1 Row(s) produced. Time Elapsed: 1.300s

Use NONE if you want the file unloaded as a plain, raw CSV file without the default compression.
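
For example, the below statement (using the same hypothetical bucket as above) unloads EMP as a plain, uncompressed CSV file:


COPY INTO 's3://mybucket/unload/'
  from EMP file_format = (type=CSV COMPRESSION = NONE);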

Unload with different delimiter

Use FIELD_DELIMITER to change the default field delimiter; by default it uses the ',' character, but you can change it to any character. For example, let’s see how to use a pipe delimiter.


COPY INTO 's3://mybucket/unload/'
  from EMP file_format = (type=CSV COMPRESSION = NONE FIELD_DELIMITER='|');
  
1 Row(s) produced. Time Elapsed: 1.300s

You can find more options in the Snowflake COPY INTO documentation.
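
For example, COPY INTO also supports copy options such as OVERWRITE, SINGLE, and MAX_FILE_SIZE; the below sketch overwrites any existing files at the location and unloads everything into a single file of up to ~100 MB:


COPY INTO 's3://mybucket/unload/'   -- a sketch; same hypothetical bucket as above
  from EMP file_format = (type=CSV)
  OVERWRITE = TRUE SINGLE = TRUE MAX_FILE_SIZE = 104857600;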

Unload Snowflake table in Parquet format

Use file_format = (type = parquet) to export in the Parquet file format.


COPY INTO 's3://mybucket/unload/'
  from EMP file_format = (type = parquet) 
  credentials = (aws_key_id='xxxx' aws_secret_key='xxxxx' aws_token='xxxxxx');

1 Row(s) produced. Time Elapsed: 1.300s

To download the unloaded files from the Amazon S3 bucket to your local file system, use the AWS utilities.
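
For example, assuming the AWS CLI is installed and configured, the below command copies the unloaded files from the hypothetical bucket to a local directory:


user>aws s3 cp s3://mybucket/unload/ ./unload/ --recursive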

Unload Parquet with different compression

Use COMPRESSION to change the default compression. By default, it unloads a Parquet file using snappy compression; however, you can change it to one of the following: AUTO | LZO | SNAPPY | NONE.


COPY INTO 's3://mybucket/unload/'
  from EMP file_format = (type = parquet COMPRESSION = LZO);

1 Row(s) produced. Time Elapsed: 1.300s

Use NONE if you want the Parquet file unloaded as a raw Parquet file without the default compression.
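
For example, the below statement (same hypothetical bucket) unloads EMP as uncompressed Parquet files:


COPY INTO 's3://mybucket/unload/'
  from EMP file_format = (type = parquet COMPRESSION = NONE);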

Conclusion

In this article, you have learned how to use SnowSQL to unload a Snowflake table to an external Amazon S3 bucket location without using an internal stage.


Happy Learning !!
