Wednesday, November 13, 2013

UDFs Part 2: Custom GenericUDF in Hive (NVL2)

1.0. What's in this blog?

In my previous blog on creating custom UDFs in Hive, I covered a basic sample UDF.  This blog covers creating a generic UDF that mimics the NVL2 functionality from the previous post.  It includes sample data, the Java code for the UDF, the expected results, the commands to execute, and the output.
[Hive 0.10]

About UDFs:
UDF stands for User-Defined Function.  In Hive, there are (a) reusable functions available as part of core Hive (out of the box) that can be used in Hive queries; these are called UDFs even though they are not user-defined.  And then there are (b) functions that one can write in Java, also called UDFs, and use in Hive queries.  The focus of this blog is custom UDFs (b), specifically generic UDFs.

About generic UDF:
UDFs in Hive are extensions of either the UDF or the GenericUDF class.  GenericUDFs perform better because they use short-circuit and lazy evaluation, compared to plain UDFs, which rely on reflection.  GenericUDFs also support non-primitive Hive types such as arrays, structs, and maps in addition to primitive types, whereas plain UDFs support only primitive types.

About NVL2:
NVL2 takes three parameters, which we will refer to as expr1, expr2, and expr3.
NVL2 lets you determine the value returned by a query based on whether a specified expression is null or not null.  If expr1 is not null, NVL2 returns expr2; if expr1 is null, NVL2 returns expr3.
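Before turning to the Hive version, the semantics can be sketched in plain Java.  This helper is purely illustrative (class and method names are mine, not from the post) and is not the Hive UDF itself:

```java
// Illustrative sketch of NVL2 semantics in plain Java:
// nvl2(expr1, expr2, expr3) returns expr2 when expr1 is non-null, else expr3.
public class Nvl2Demo {
    public static <T> T nvl2(Object expr1, T expr2, T expr3) {
        return (expr1 != null) ? expr2 : expr3;
    }

    public static void main(String[] args) {
        System.out.println(nvl2("marketing", "has-name", "no-name")); // has-name
        System.out.println(nvl2(null, "has-name", "no-name"));        // no-name
    }
}
```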

2.0. NVL2 generic UDF in Hive

1: Create the test data file for a Hive external table

2: Create the Hive table

3: Create the UDF in Java

4: Expected results

5: Try out the UDF
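The code gists for the steps above were embedded in the original page and do not appear inline here.  As a hedged reconstruction of steps 3 and 5, the following is a minimal GenericUDF sketch for NVL2, assuming Hive 0.10-era APIs; the package name, jar path, and table/column names are illustrative, not the author's originals.  Note that the resolver is initialized before it is used, which addresses the NullPointerException discussed in the comments.

```java
package com.mycompany.hiveudfs; // hypothetical package name

import org.apache.hadoop.hive.ql.exec.Description;
import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
import org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException;
import org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException;
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDF;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDFUtils;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;

@Description(name = "nvl2",
    value = "_FUNC_(expr1, expr2, expr3) - returns expr2 if expr1 is not null, else expr3")
public class GenericUDFNvl2 extends GenericUDF {

  private GenericUDFUtils.ReturnObjectInspectorResolver returnOIResolver;
  private ObjectInspector[] argumentOIs;

  @Override
  public ObjectInspector initialize(ObjectInspector[] arguments) throws UDFArgumentException {
    argumentOIs = arguments;
    if (arguments.length != 3) {
      throw new UDFArgumentLengthException("NVL2() requires exactly three arguments");
    }
    // Initialize the resolver BEFORE using it; referencing it earlier
    // is the cause of the NullPointerException noted in the comments.
    returnOIResolver = new GenericUDFUtils.ReturnObjectInspectorResolver(true);
    if (!(returnOIResolver.update(arguments[1]) && returnOIResolver.update(arguments[2]))) {
      throw new UDFArgumentTypeException(2, "expr2 and expr3 must have a common type");
    }
    return returnOIResolver.get();
  }

  @Override
  public Object evaluate(DeferredObject[] arguments) throws HiveException {
    // Short-circuit evaluation: only one of expr2/expr3 is materialized.
    if (arguments[0].get() != null) {
      return returnOIResolver.convertIfNecessary(arguments[1].get(), argumentOIs[1]);
    }
    return returnOIResolver.convertIfNecessary(arguments[2].get(), argumentOIs[2]);
  }

  @Override
  public String getDisplayString(String[] children) {
    return "nvl2(" + children[0] + ", " + children[1] + ", " + children[2] + ")";
  }
}
```

After packaging the class into a jar, trying it out would look something like this (jar path and function name are illustrative; the table and columns follow the example in the comment thread):

```sql
ADD JAR /usr/local/hive/lib/my-hive-udfs.jar;
CREATE TEMPORARY FUNCTION nvl2 AS 'com.mycompany.hiveudfs.GenericUDFNvl2';
SELECT deptno, nvl2(deptname, deptname, 'Undefined') FROM departments_udftest;
```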

3.0. Making the UDF permanently available when you launch the hive shell

There are several ways to make a custom UDF available when you launch the Hive shell, without having to execute the "add jar..." statement before each use.  A couple of them are listed below.

Option 1:
From "Programming Hive"

Your function may also be added permanently to Hive; however, this requires a small modification to a Hive Java file and then rebuilding Hive.
Inside the Hive source code, a one-line change is required to the FunctionRegistry class found at ql/src/java/org/apache/hadoop/hive/ql/exec/. Then you rebuild Hive following the instructions that come with the source distribution.
While it is recommended that you redeploy the entire new build, only the hive-exec-*.jar, where * is the version number, needs to be replaced.
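For illustration, the one-line change inside FunctionRegistry's static registration block would look roughly like this (method name as in the Hive 0.x source; the UDF class name is hypothetical, so verify against your own source tree):

```java
// Hypothetical one-line addition inside FunctionRegistry's static block:
registerGenericUDF("nvl2", GenericUDFNvl2.class);
```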
Option 2:
Add the "add jar" and "create temporary function" statements to the .hiverc file on each node from which Hive queries will be run.
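For example, a .hiverc along these lines would register the function at shell startup (the jar path and class name here are illustrative, not from the original post):

```sql
-- $HOME/.hiverc  (illustrative paths and names)
ADD JAR /usr/local/hive/lib/my-hive-udfs.jar;
CREATE TEMPORARY FUNCTION nvl2 AS 'com.mycompany.hiveudfs.GenericUDFNvl2';
```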

4.0. References

Apache documentation:

A good article on creating a UDF that involves non-primitive types - link

Programming Hive - from O'Reilly

That's it for this blog.  Do share any additional insights with me.


  1. Great work. Thanks for sharing your work.

  2. Saved me a full week's work. Where on earth are you based?

  3. Glad the post helped you, Joseph.
    I am based out of Chicago, IL.

  4. thank you so much for the post !!!

  5. Hi,

    I am trying to execute the same generic UDF as above and I am facing a "NullPointerException".

    hive> select * from departments_udftest;
    d001 marketing
    d002 finance
    d003 hr
    d006 testing
    Time taken: 0.238 seconds, Fetched: 6 row(s)

    hive> select deptno,nvl2(deptname,deptname,'test') from departments_udftest;
    FAILED: NullPointerException null

    Can you please let me know if I am missing something here?

  6. I guess you have already figured this out yourself, but for all others who struggle with this as I did today: line 52 in the Java code (returnOIResolver = new GenericUDF...) should be moved before the third check. Otherwise, the returnOIResolver object is used before it is initialized.

  7. This comment has been removed by the author.

  8. Thank you, it is a very nice blog for beginners.

  9. Good Post! Thank you so much for sharing this pretty post, it was so good to read and useful to improve my knowledge as updated one, keep blogging.

  10. Excellent article. Very interesting to read. I really love to read such a nice article. Thanks! Keep rocking.
