
s3 file access slow

I am reading JSON file data from an S3 folder using a Python component and loading it into a Redshift table.

import json
from datetime import datetime, timedelta
import boto3
import re

cursor = context.cursor()
s3 = boto3.resource('s3')

bucket = s3.Bucket('folder')  # bucket name

objects = bucket.objects.all()

for obj in objects:
    if obj.key.startswith('folder/') and obj.key.endswith('filename'):
        print(obj.key)

But it takes about 3 minutes just to list the filenames. Is that expected, or do I need to optimize my script? Please advise.
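One likely cause of the slowness is that `bucket.objects.all()` lists every object in the bucket and applies the `startswith` check client-side. A common optimization is to pass the prefix to S3 itself via `objects.filter(Prefix=...)`, so only matching keys are returned. A minimal sketch, assuming the same `'folder/'` prefix and `'filename'` suffix as above (the bucket name and helper names here are illustrative, not from the original job):

```python
def matches(key, prefix, suffix):
    # Pure helper: the same startswith/endswith check as the original loop.
    return key.startswith(prefix) and key.endswith(suffix)

def list_matching_keys(bucket_name, prefix, suffix):
    # boto3 imported lazily so this sketch stays importable without AWS set up.
    import boto3
    s3 = boto3.resource('s3')
    bucket = s3.Bucket(bucket_name)
    # Prefix= pushes the startswith filter to S3 itself, so far fewer
    # keys are transferred and iterated client-side.
    for obj in bucket.objects.filter(Prefix=prefix):
        if obj.key.endswith(suffix):
            yield obj.key
```

With a large bucket this typically cuts listing time substantially, since only keys under `folder/` are paginated back rather than the whole bucket.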


1 Community Answer

Matillion Agent  

Ian Funnell —

Hi Ganesh,

Regarding your question on JSON handling, could you please email us directly, attaching an export of the Matillion job you have developed so far?

Many thanks,

Best regards,
