Snowflake Loader

PreClarity’s Snowflake Loader

With PreClarity’s Snowflake loader we take all the headaches away and make the load process fast and easy.

  • No S3 required! No buckets to create, manage, upload to, or pay for!
  • Easily loads files from local or remote mounted file systems with a single command!
  • Automatically detects the file format
  • Creates the database table structure with the correct column names & datatypes, saving you time & eliminating manual effort!
  • Automatically removes or archives your files
  • Works with uncompressed files or any popular compression/archive format (zip, gzip, 7z, bz2, tar)
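The format detection the loader advertises can be sketched with a few magic-byte checks. The function below is illustrative only, not PreClarity's actual implementation; it covers the formats named in the list above:

```python
def detect_compression(header: bytes) -> str:
    """Guess a file's compression/archive format from its leading bytes.

    Illustrative sketch: checks the well-known magic numbers for zip,
    gzip, 7z, and bz2. POSIX tar has no magic at offset 0, but places
    "ustar" at offset 257, so pass at least the first 512 bytes.
    """
    if header.startswith(b"\x1f\x8b"):
        return "gzip"
    if header.startswith(b"PK\x03\x04"):
        return "zip"
    if header.startswith(b"BZh"):
        return "bz2"
    if header.startswith(b"7z\xbc\xaf\x27\x1c"):
        return "7z"
    if len(header) > 262 and header[257:262] == b"ustar":
        return "tar"
    return "uncompressed"
```

In practice you would read the first 512 bytes of the file and pass them in; a real loader would also need to handle nested cases such as a gzip-compressed tar.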

PreClarity no longer offers this as a free service. If you would like to inquire about implementing this as a paid service, please Contact us.

A Quick Use Case

So you’ve just got your shiny new Snowflake instance and you’re ready to load data.

If you use PreClarity’s Snowflake Loader: you’ll be automatically loading your data in minutes.

If you don’t…
You’ll have a few things to think about and work through…
Did you set up an S3 bucket to use as an interim staging area for the files you want to load? You’re going to need one; that’s the only way Snowflake can load your file.
You’ll need to configure your S3 bucket and make sure it is in the same region as your database. You should probably create a dedicated user, set up proper permissions, and generate a new access key pair (key ID and secret). Once that is done and you’re ready to move files, you’ll have to research the SDK and write a program to upload your files to S3 — but as Amazon says in their documentation, “this can be cumbersome because it requires you to write code to authenticate your requests,” so they suggest an alternative way that requires still more authentication keys and another few pages of instructions.
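The “write code to authenticate your requests” step Amazon mentions is real work: every raw S3 request must be signed with Signature Version 4. Just deriving the signing key follows AWS's documented HMAC chain, sketched below with only the standard library (in practice the SDK does all of this for you):

```python
import hashlib
import hmac


def _sign(key: bytes, msg: str) -> bytes:
    """One link in the SigV4 HMAC-SHA256 chain."""
    return hmac.new(key, msg.encode("utf-8"), hashlib.sha256).digest()


def sigv4_signing_key(secret_key: str, date_stamp: str,
                      region: str, service: str = "s3") -> bytes:
    """Derive the AWS Signature Version 4 signing key.

    date_stamp is YYYYMMDD, e.g. "20240115". The returned key is then
    used to HMAC-sign a "string to sign" built from a canonical request.
    """
    k_date = _sign(("AWS4" + secret_key).encode("utf-8"), date_stamp)
    k_region = _sign(k_date, region)
    k_service = _sign(k_region, service)
    return _sign(k_service, "aws4_request")
```

And that is only the key derivation; building the canonical request, hashing the payload, and assembling the signed headers add several more steps, which is exactly why Amazon steers you toward the SDK.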
Hours, if not days, later you might have managed to upload a file or two into the S3 interim staging area. Now you’re headed over to the Snowflake side: time to visually inspect all the data in the file you are trying to load and manually create the DDL that matches your file layout, with appropriate column names and the appropriate Snowflake column data types. Once all of that is accomplished, it’s back to S3 to collect your bucket root, object prefixes, and AWS auth arguments so you can assemble the Snowflake command to load your file.
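The manual DDL-plus-COPY step amounts to hand-writing something like the SQL the helpers below generate. The table, bucket, and column names are made up for illustration, and a real version would still have to infer the column types by inspecting the file:

```python
def create_table_sql(table: str, columns: dict) -> str:
    """Build the CREATE TABLE DDL you'd otherwise hand-write per file.

    columns maps column name -> Snowflake data type.
    """
    cols = ",\n  ".join(f"{name} {dtype}" for name, dtype in columns.items())
    return f"CREATE TABLE {table} (\n  {cols}\n);"


def copy_into_sql(table: str, bucket: str, prefix: str) -> str:
    """Build Snowflake's COPY INTO command pointing at the S3 staging area.

    Credentials are left as placeholders; Snowflake also accepts a named
    stage or a storage integration instead of inline keys.
    """
    return (
        f"COPY INTO {table}\n"
        f"FROM 's3://{bucket}/{prefix}'\n"
        "CREDENTIALS = (AWS_KEY_ID = '<key>' AWS_SECRET_KEY = '<secret>')\n"
        "FILE_FORMAT = (TYPE = 'CSV' FIELD_DELIMITER = ',' SKIP_HEADER = 1);"
    )


# Hypothetical example: an "orders" CSV staged under staging/orders/
print(create_table_sql("orders", {"order_id": "NUMBER",
                                  "placed_at": "TIMESTAMP_NTZ"}))
print(copy_into_sql("orders", "my-bucket", "staging/orders/"))
```

Every new file layout means repeating this by hand, which is the effort the loader's automatic schema detection is meant to remove.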
If the stars align and this all goes well you’ll have a single file loaded into Snowflake. Now on to cleanup to remove the file from S3 and ensure that it doesn’t get re-uploaded and loaded again. This is left as an exercise for the reader.