I am doing ETL of Salesforce data into a Redshift cluster. As you may know, Salesforce stores timestamps in UTC in its database. Is there a way to transform the Salesforce timestamp fields during ETL so that when I load the data into Redshift it is stored in my local timezone instead of UTC?
4 Community Answers
Laura Malins —
You can use the CONVERT_TIMEZONE function in a Calculator component to convert to a timezone of your choice.
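For reference, a minimal sketch of what that CONVERT_TIMEZONE expression looks like in Redshift SQL. The table name, column name, and target timezone here are illustrative, not from your actual schema:

```sql
-- Convert a Salesforce UTC timestamp to a local timezone in Redshift.
-- salesforce_opportunity / created_date are placeholder names.
SELECT
    id,
    created_date AS created_date_utc,
    CONVERT_TIMEZONE('UTC', 'Australia/Sydney', created_date) AS created_date_local
FROM salesforce_opportunity;
```

CONVERT_TIMEZONE accepts either a full timezone name (e.g. 'Australia/Sydney', which handles daylight saving) or a fixed offset abbreviation.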
Let me give you some perspective on what I am trying to achieve. I am using a Salesforce orchestration component to extract and load a few tables from Salesforce, and then I am using those tables as sources in transformation jobs built with the SQL Query component, which create new tables used for analysis. The SQL script in the SQL Query component uses quite a lot of joins and aggregations across data from different Salesforce tables. Given this use case, I am trying to figure out a way to convert the Salesforce timestamp columns to my timezone while I am loading the Salesforce tables during the orchestration job.
I hope I have managed to explain my use case. That being said, it would be great if you could suggest an optimal solution for this problem.
You’re right that the Calculator component is only available in a Transformation job. The only way to perform a calculation at the source is to use the advanced mode in the component, and that calculation depends on the source database. For example, if the source is a MySQL database you can use the CONVERT_TZ() function. I don’t think there’s a way to do the conversion on the Salesforce side, though, so what I would recommend is staging your Salesforce data into a table within Redshift, then using a Transformation job to write it to a new table with the times converted, and using that new Redshift table in your SQL script. You can always drop the staging table afterwards with the Drop Tables component.
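The stage-then-transform step above could be expressed as a single CREATE TABLE AS statement in the Transformation job. This is a sketch with placeholder table, column, and timezone names; adjust them to your actual staged schema:

```sql
-- Transformation step: materialise a copy of the staged Salesforce data
-- with timestamps converted from UTC to local time.
-- Table/column names and the timezone are placeholders.
CREATE TABLE salesforce_opportunity_local AS
SELECT
    id,
    amount,
    CONVERT_TIMEZONE('UTC', 'Australia/Sydney', created_date) AS created_date,
    CONVERT_TIMEZONE('UTC', 'Australia/Sydney', last_modified_date) AS last_modified_date
FROM salesforce_opportunity_staging;
```

Your existing joins and aggregations can then reference salesforce_opportunity_local instead of the raw staged table, and the staging table can be dropped once the transformation has run.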