
S3 file access slow

Hi,
I am reading JSON file data from an S3 folder into a Redshift table using a Python component.

import json
import re
from datetime import datetime, timedelta

import boto3

# Cursor provided by the Matillion Python component context
cursor = context.cursor()
s3 = boto3.resource('s3')

bucket = s3.Bucket('folder')

objects = bucket.objects.all()

for obj in objects:
    if obj.key.startswith('folder/') and obj.key.endswith('filename'):
        datas = obj.get()['Body']
        filename = obj.key
        print(filename)


But it takes about 3 minutes just to read the filenames. Is that normal, or does my script need optimizing? Please advise.
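(Editor's note, not from the original thread: one likely cause of the slowness is that `bucket.objects.all()` enumerates every key in the bucket, one paged API call per 1000 keys. If the files of interest all live under the `folder/` prefix, boto3's `objects.filter(Prefix=...)` asks S3 to do the prefix match server-side. The sketch below assumes that layout; `matches_target` and `list_matching_keys` are hypothetical helper names, not part of the original script.)

```python
def matches_target(key, prefix="folder/", suffix="filename"):
    # Pure helper mirroring the original startswith/endswith check.
    return key.startswith(prefix) and key.endswith(suffix)

def list_matching_keys(bucket_name, prefix="folder/", suffix="filename"):
    # bucket.objects.all() pages through the whole bucket; filter(Prefix=...)
    # returns only keys under the prefix, so far fewer objects are fetched.
    import boto3  # imported here so matches_target stays usable without boto3
    s3 = boto3.resource("s3")
    bucket = s3.Bucket(bucket_name)
    return [obj.key
            for obj in bucket.objects.filter(Prefix=prefix)
            if matches_target(obj.key, prefix, suffix)]
```

With many objects, pagination cost still applies within the prefix, but the bucket-wide scan is avoided, which is usually where the minutes go.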

Thanks

1 Community Answer

Matillion Agent  

Ian Funnell —

Hi Ganesh,

Regarding your question on JSON handling, please can you email support@matillion.com directly, attaching an export of the Matillion job that you have developed so far?

Many thanks,

Best regards,
Ian
