
Table compression


I want to compress a table from MSSQL and load it to Redshift because of the large amount of data.

Can you please suggest an approach for this process using Matillion?


1 Community Answer

Matillion Agent  

Kalyan Arangam —

Hi Mathan,

There are no direct options to compress data. Please check for any compression options supported by the SQL Server JDBC driver. You may then set these options under 'JDBC Options' when using the RDS Query component, or under 'Connection Options' for the Database Query component.

Typically, all load components will compress the data prior to writing it to S3. However, I cannot say the same for data flowing from your source (MSSQL) to Matillion.

When importing very large datasets, we'd recommend bringing data in batches rather than all in one go. For example, if your source table has a date column, you may write a WHERE clause and bring in data for a year at a time, or use any other approach that lets you break/partition your dataset into manageable chunks.
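The batching idea above can be sketched as a small helper that generates one query per year. This is a minimal illustration, not Matillion-specific code; the table name `dbo.orders` and column `order_date` are hypothetical placeholders for your own schema:

```python
def yearly_batches(table, date_col, start_year, end_year):
    """Yield one SELECT per calendar year so each extract stays a
    manageable size. Each query could drive one Database Query run."""
    for year in range(start_year, end_year + 1):
        yield (
            f"SELECT * FROM {table} "
            f"WHERE {date_col} >= '{year}-01-01' "
            f"AND {date_col} < '{year + 1}-01-01'"
        )

# Hypothetical example: three yearly extracts from a sales table.
queries = list(yearly_batches("dbo.orders", "order_date", 2015, 2017))
```

In Matillion itself you would typically achieve the same effect with an iterator component feeding a year variable into the component's SQL, rather than generating the statements externally.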

This is usually a concern only for the initial load of large tables. If you are unable to determine field(s) by which to suitably partition your data, then I recommend exporting the relevant data to one or more delimited files, compressing them, uploading them to an S3 bucket, and loading them with a COPY command or the S3 Load orchestration component.
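As a rough sketch of the export-and-compress step, the snippet below serializes rows to a pipe-delimited file and gzips it in memory, which is the format a Redshift COPY with the GZIP option can ingest. The bucket, file, and table names in the comment are placeholders, not values from this thread:

```python
import csv
import gzip
import io

def gzip_delimited(rows, delimiter="|"):
    """Serialize rows to a delimited text file and gzip it,
    ready for upload to S3 and a Redshift COPY ... GZIP load."""
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter=delimiter)
    writer.writerows(rows)
    return gzip.compress(buf.getvalue().encode("utf-8"))

payload = gzip_delimited([("1", "alice"), ("2", "bob")])

# A matching Redshift load (all identifiers hypothetical) would be:
# COPY my_table FROM 's3://my-bucket/export.csv.gz'
#   IAM_ROLE 'arn:...' DELIMITER '|' GZIP;
```

For genuinely large tables you would stream to disk in chunks rather than build the file in memory, but the file format and COPY options are the same.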

Hope that makes sense.

