
Python to Redshift

Hi, I am trying to write a Pandas dataframe to a Redshift table via a Python component. Currently we pass the dataframe data through a Matillion Grid variable, which significantly slows down the process and leads to out-of-memory issues.
Can you please suggest a better approach?

TIA.

1 Community Answer

Matillion Agent  

Kalyan Arangam —

Hi Sanket,

One option is to increase your instance size, which may give you enough memory for the operation.
Alternatively, have you tried writing the dataframe to a file on disk, uploading that file to S3, and then loading it into Redshift?
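A minimal sketch of that approach, assuming `boto3` and `psycopg2` are available in the Python environment; the bucket, table, IAM role, and connection details below are placeholders, not values from this thread:

```python
def build_copy_sql(table, bucket, key, iam_role):
    """Build the Redshift COPY statement for a CSV file staged in S3."""
    return (
        f"COPY {table} FROM 's3://{bucket}/{key}' "
        f"IAM_ROLE '{iam_role}' "
        "FORMAT AS CSV IGNOREHEADER 1"
    )

def load_dataframe(df, table, bucket, key, iam_role, conn_params):
    """Stage a Pandas dataframe on disk, upload it to S3, and COPY into Redshift."""
    # Imported here so the assumed libraries are only needed at run time.
    import boto3
    import psycopg2

    path = "/tmp/stage.csv"
    df.to_csv(path, index=False)                        # write to disk, not a Grid variable
    boto3.client("s3").upload_file(path, bucket, key)   # stage the file in S3
    with psycopg2.connect(**conn_params) as conn:       # conn_params: hypothetical dict of creds
        with conn.cursor() as cur:
            cur.execute(build_copy_sql(table, bucket, key, iam_role))
```

This keeps the data out of the job's memory entirely: Redshift's COPY reads the file from S3 in parallel, which is far faster than row-by-row inserts from Python.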

Best
Kalyan
