You can call built-in Hive UDFs, UDAFs, and UDTFs from Spark SQL applications, as long as the functions are available in the standard Hive .jar file.
When using Hive UDFs, use HiveContext rather than SQLContext; only HiveContext can resolve functions from Hive's registry.
You can register custom functions written in Python, Java, or Scala and then call them within SQL statements.
When using a custom UDF, make sure that the JAR containing your UDF classes is packaged with your application, or pass it to spark-submit with the --jars command-line option.
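A sketch of the spark-submit invocation; the JAR path and application file names here are hypothetical placeholders:

```shell
# Ship a hypothetical UDF jar alongside the application with --jars
spark-submit --jars /path/to/my-udfs.jar my_app.py
```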