What happens if you upload a 22.8 GB file using the PUT command in Snowflake Data Warehouse?

Vidit tyagi
2 min read · Feb 14, 2022


huge file load into Snowflake warehouse

Hi,

I am sharing an experience of loading a huge file into the Snowflake Data Warehouse. My file is in CSV format, one of the best formats for Snowflake in terms of load speed. The file size was 22.8 GB uncompressed; however, as we know, when we stage data into Snowflake stages it gets compressed automatically (gzip, by default).

Let’s see what the results are.

First, I merged hundreds of CSV files into a single merge.csv file using the Windows copy command:

copy *.csv merge.csv
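
If you are on Linux instead, an equivalent would be cat (an assumption, not part of the original run; note it simply concatenates the files, headers included):

cat *.csv > merge.csv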

Since I already had a table named PART in my Snowflake database, I used its table stage to upload the data into Snowflake with the PUT command.
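
As a minimal sketch, the PUT from SnowSQL would look roughly like this; the local path is an assumption, @%part refers to the table stage of the PART table, and AUTO_COMPRESS is TRUE by default:

put file://C:\data\merge.csv @%part auto_compress=true;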

Interestingly, the Snowflake UI History showed the command as successfully executed in only 1 second :) presumably because History logs just the statement, while the actual file transfer happens client-side through SnowSQL.

But the SnowSQL command line kept running for a long time, and after 3,292.532 seconds (about 55 minutes) I got a 403 Forbidden error from AWS S3 (Snowflake's default storage backend), most likely because the temporary credentials issued for the upload expired before such a large transfer could finish.

snowflake put command response with huge file

When I tried again, the file did get loaded into the stage this time, but it took a very long time: 10,522.019 seconds (almost 3 hours).

So, the conclusion is: do not push one huge file into Snowflake. Break it into multiple parts; Snowflake itself recommends staging compressed files of roughly 100–250 MB, so loads can be parallelized.
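
As a sketch of that advice, on Linux you could split the merged file into line-aligned chunks of about 200 MB with GNU split and then upload them in parallel; the chunk size, the /tmp/parts directory, the file prefix, and the PARALLEL value are assumptions, not the exact commands from this experiment:

mkdir -p /tmp/parts
split -C 200m -d --additional-suffix=.csv merge.csv /tmp/parts/part_
put file:///tmp/parts/part_*.csv @%part parallel=8;

Each compressed chunk then lands in Snowflake's recommended size band, and a single COPY INTO part can load all of them from the table stage in parallel.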

Happy learning 😊😊😊
