


I'm creating a JSON array from a Redshift table using a Python script.

After that, I want to save the array as a file in an S3 bucket.

Is there any way to move it to S3 as a JSON file (or a CSV file)?



3 Community Answers

Matillion Agent  

Kalyan Arangam —

Hi Mathan,

Please explore the Python boto/boto3 API for this purpose. These packages are already installed and configured on the Matillion instance.

There should be plenty of examples on the web. I would consider writing to a local file in the appropriate format (JSON/CSV) and then transferring it to S3. There may also be options to write directly to S3.


Mathan Selvaraj —

Hi Kalyan,

I have moved files to S3, but the file contains only one row of the array.

How do I write all the data to a file in S3?



Matillion Agent  

Kalyan Arangam —

Hi Mathan,

It's hard to comment without seeing your Python code.

It looks like your array holds just one row at the moment you write it to the file. This is now a pure-Python question rather than a Matillion one; please review the logic that populates the array and the way it is written to the file.
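Without seeing the script, one common cause of a one-row output matches this symptom: overwriting the array on each iteration instead of appending to it. A sketch of the fix, with hypothetical row data:

```python
import json


def rows_to_json_array(rows):
    """Accumulate every row, then serialize the full list once."""
    data = []
    for row in rows:
        # Appending keeps all rows. Assigning `data = row` here instead
        # would keep only the last row, producing a one-row file.
        data.append(row)
    return json.dumps(data)
```

A related pitfall is opening the output file with `open(path, "w")` inside the loop: each `open` truncates the file, so only the final row survives. Open the file once outside the loop, or serialize the whole list in one call as above.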

