
Table update in SQL server database to trigger Matillion job

How can we go about triggering a Matillion job when certain columns are updated in a SQL Server database table? We have a scenario where we want near-real-time updates in our Redshift model whenever certain columns in a SQL Server table change. Any high-level pointers would be much appreciated.

3 Community Answers

Matillion Agent  

Kalyan Arangam —

Hi Devang,

Here’s an article on using microbatching to repeatedly query a source database and pull data into Redshift. Please check whether it suits your requirement.

Besides the above, there are two ways to remotely launch jobs in Matillion:

  1. Using our REST API
  2. Writing a message to an SQS queue monitored by the Matillion instance
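As a rough illustration of the two launch options, here is a minimal Python sketch. The URL layout and the SQS message fields shown are assumptions based on the general shape of Matillion's v1 REST API and queue listener; the helper names (`build_run_url`, `build_sqs_message`) are hypothetical, so verify both against the documentation for your Matillion version before relying on them.

```python
import json

def build_run_url(base_url, group, project, version, job):
    """Hypothetical helper: builds the v1 REST endpoint that queues a job run.
    The exact path layout may differ between Matillion versions."""
    return (f"{base_url}/rest/v1/group/name/{group}/project/name/{project}"
            f"/version/name/{version}/job/name/{job}/run")

def build_sqs_message(group, project, version, environment, job):
    """Hypothetical helper: builds the JSON body a Matillion SQS listener
    would read; check the queue-listener docs for the exact field names."""
    return json.dumps({
        "group": group,
        "project": project,
        "version": version,
        "environment": environment,
        "job": job,
    })
```

You would then POST the built URL with your Matillion credentials (e.g. via `requests.post(url, auth=(user, password))`), or send the JSON body to the monitored queue with `boto3.client("sqs").send_message(...)`.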

I am not aware of what features SQL Server offers to support your requirements. Please check with your DBA (or equivalent) whether SQL Server triggers can launch an external process by some means. I am not sure they can, but it’s been a while since I’ve dealt with SQL Server.


Gaurav Tripathi —

Hi Kalyan,

Could we get a sample example of microbatching with Redshift? The above link doesn’t include one.


Matillion Agent  

Ian Funnell —

Hi Gaurav,

Thanks for bringing that documentation problem to our attention. We are correcting the Snowflake example job on the Redshift page.

Regarding the architecture, and mainly to avoid doubt: the microbatching approach is not triggered by a change in SQL Server; it simply queries the source system repeatedly to check for changes. In other words, it’s a pull model rather than a push.
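One microbatch cycle of that pull model can be sketched as below. This is a minimal illustration, not Matillion's actual implementation: `fetch_changed` and `load_batch` are hypothetical callables standing in for the source query (e.g. `SELECT ... WHERE modified_at > ?` against SQL Server) and the load step into Redshift, and `modified_at` is an assumed change-tracking column.

```python
def poll_once(fetch_changed, load_batch, watermark):
    """Run one microbatch cycle: pull rows modified since `watermark`,
    load them into the target, and return the new high-water mark."""
    rows = fetch_changed(watermark)   # pull: query the source for changes
    if not rows:
        return watermark              # nothing new; keep the old watermark
    load_batch(rows)                  # load the batch into the target
    return max(r["modified_at"] for r in rows)
```

A scheduler (for example a Matillion orchestration job running every minute or two) would call this repeatedly, persisting the returned watermark between runs.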

Best regards,
