I have created a number of UDFs on Redshift, but it seems they cannot be called from Matillion. The error is the typical one: ERROR: function FUNCTION_NAME(bigint) does not exist. Hint: No function matches the given name and argument types. You may need to add explicit type casts.
I have tried type casting, but no luck. I thought it was a permissions issue, but when I connect directly via a client with the same credentials, it works.
Is there anything I am missing?
15 Community Answers
Ian Funnell —
It’s certainly possible to do this, yes. Please find attached some Matillion example jobs which create and use a UDF.
My guess would be that you really do need some datatype casts, since Redshift is quite picky about exact matches. In the attached example the sequence outputs BIGINT, and you have to cast it to INT to make the data work with the UDF.
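A minimal sketch of the cast issue, assuming a hypothetical Python UDF `f_double` declared with an INT parameter and a hypothetical table `my_table` whose `seq_col` column is BIGINT (all names here are illustrative, not from the original thread):

```sql
-- Hypothetical UDF declared to take an INT argument
CREATE OR REPLACE FUNCTION f_double (x INT)
  RETURNS INT STABLE
AS $$
  return x * 2
$$ LANGUAGE plpythonu;

-- Calling it with a BIGINT column fails with
-- "function f_double(bigint) does not exist";
-- an explicit cast makes the signatures match:
SELECT f_double(seq_col::INT) FROM my_table;
-- equivalently: f_double(CAST(seq_col AS INT))
```

Redshift does not implicitly narrow BIGINT to INT when resolving function overloads, so the cast has to be explicit.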
Just checked again and it's still the case. Moreover, under the "Functions" listing in the Calculator component there's a "User Defined Functions" folder. I suppose all the UDFs should be available there, but they aren't?
It seems as if something is different about the Redshift connection Matillion is using, even though the account is the same?
Whilst the username/password may be the same, it's possible the Matillion environment is pointing at a different schema. All components in a transformation job default to the schema defined in your environment. Your UDFs would be visible if the environment was pointing at the schema in which they were created.
Alternatively, you may explicitly prefix the function name with the schema name to use your UDF, e.g. public.functionname().
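For example (schema and function names here are placeholders), either qualify the call or adjust the session's search path:

```sql
-- Option 1: schema-qualify the UDF call
SELECT public.f_double(col::INT) FROM some_table;

-- Option 2: inspect and widen the search path so
-- unqualified names resolve to the UDF's schema
SHOW search_path;
SET search_path TO temp_schema, public;
```

Function resolution follows the same search_path rules as table resolution, so if `public` is not on the path and the call is unqualified, Redshift reports the function as non-existent.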
Yep, you are right. If I change the default schema to public, it works. But if I add 'public.' in front of the function it does not work, although it works again in my own connection. Is there anything else that restricts the search path, especially for functions? I can't seem to find anything on the web.
That's a bit of a problem, as I cannot use the functions in the Calculator component, and it is not good for all other applications to have to add UDFs into a specific schema. Is there a way to set just the schema of the Calculator, for example?
You can refer to a function in a different schema from the Calculator component by prefixing the function with the schema name. So if your function is in the public schema but your Matillion environment is set against the test schema, you can still call it with public.function_name() in the Calculator component. Matillion will only list functions available in the environment schema.
Provided your Matillion user in the environment has access to the schema that the function is in, you should be able to call the function. Your schema would need to be in the same database as your Matillion environment. Do you get an error message when you try to reference the function in a different schema? Have you tried calling the function using a Redshift query analyzer tool from the same schema as you're using in Matillion?
If you refer to the previous discussion I had with Ian:
- If I change the default schema to public, it can use the UDF. But I cannot set it this way, as I do not want to see temporary data in public.
- If I change the default schema to our temp schema and then prefix with 'public.', it does not work, saying that there is no such function.
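One way to confirm which schema Redshift thinks the function lives in is to query the catalog (the function name below is a placeholder for whatever UDF is failing):

```sql
-- Find the schema(s) containing a UDF by name
SELECT n.nspname AS schema_name,
       p.proname AS function_name
FROM pg_proc p
JOIN pg_namespace n ON n.oid = p.pronamespace
WHERE p.proname = 'function_name';
```

If the function appears under `public` here but `public.function_name()` still fails from the Matillion connection, that points at a privilege difference between the two connections rather than the search path.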
I managed to get it working on our Redshift, so it sounds like a permissions issue. If it works for you, could we have a call to take a look at this? Let me know some times which are best for you and we'll set something up.
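If it does turn out to be permissions, a sketch of the grants involved, assuming a hypothetical UDF `public.f_double(INT)` and a user `matillion_user` (both names illustrative): the caller needs USAGE on the schema and EXECUTE on the function's exact signature.

```sql
-- Allow the user to resolve objects in the schema
GRANT USAGE ON SCHEMA public TO matillion_user;

-- Allow the user to execute this specific UDF signature
GRANT EXECUTE ON FUNCTION public.f_double(INT) TO matillion_user;
```

Note that EXECUTE is granted per signature, so overloads of the same function name must each be granted separately.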