
Loading data from Redshift into RDS (SQL Server)

We have a scenario where some data needs to be transferred out of Redshift and loaded into a SQL Server RDS instance. Is there a Transformation to do this?

3 Community Answers

Matillion Agent  

Kalyan Arangam —

Hi Tejas,

We are able to perform bulk output to certain RDS databases as they support Java bulk-import interfaces and the respective utilities are either open source or allow the bulk-import drivers to be bundled with Matillion.

Unfortunately the bcp utility, which (I think) is the only bulk-load option for SQL Server, cannot be run from Linux. Also, Microsoft’s licensing restrictions do not allow us to bundle their software with Matillion.

Other customers have requested this feature in the past, but we are unable to support it for the reasons listed above. Our suggestion for SQL Server is to export the data via S3 Unload, then perform a bulk import from a Windows machine (optionally hosted in AWS). You may use the S3 Get component to copy files from S3 to a Windows share on that machine.

I think the latest version of SQL Server does run on Linux, so I suspect it will be possible to run bcp on Linux in future.

Hope that gives some context.


Derik Hammer —

Laura, you said, "Another option is to use the S3 Unload component to send the data to files in S3. You can then import it to MS SQL server from there." How do you go about importing it to MS SQL Server from there?


Matillion Agent  

Kalyan Arangam —

Hi Derik,

You may need to use a Windows workstation (EC2) with access to these files. You can use the AWS CLI to copy the files locally and then the bcp utility to load them into MS SQL Server.
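Those two steps can be sketched as a small script on the Windows workstation (e.g. under Git Bash or WSL). This is a dry run: it prints the commands rather than executing them, since `aws` and `bcp` may not be installed; drop the `echo` prefixes to run it for real. Every name here (bucket, staging folder, table, server, user) is a placeholder assumption, not a value from this thread.

```shell
#!/bin/sh
# Placeholder values -- substitute your own. None of these names come
# from the thread above.
BUCKET="s3://my-unload-bucket/unload/"          # where S3 Unload wrote the files
LOCALDIR="C:/staging"                           # local landing folder
TABLE="dbo.my_table"                            # target SQL Server table
SERVER="my-rds-endpoint.rds.amazonaws.com"      # RDS SQL Server endpoint

# 1. Copy the unloaded files from S3 to the local staging folder.
echo aws s3 cp "$BUCKET" "$LOCALDIR" --recursive

# 2. Bulk-load an unloaded file with bcp:
#    -S server, -U user (bcp will prompt for the password),
#    -c character mode, -t field terminator (match your unload delimiter).
echo bcp "$TABLE" in "$LOCALDIR/part_0000" -S "$SERVER" -U my_user -c -t "|"
```

If S3 Unload produced many file parts, wrap step 2 in a loop over the staging folder, one `bcp ... in` invocation per file.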

If a Windows file share is visible from the Matillion instance, you may use the File Iterator and S3 Get components to copy files from S3 to the Windows share.

Hope that helps.

