
I created the following Java class, packaged it into a jar, and added it to Hive:

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

public class MakeCap extends UDF{
  private Text t;

  public Text evaluate(Text input){
    if(null==input){
      t.set("Invalid input");
    }else{
      t.set(input.toString().toUpperCase());
    }
    return t;
  }
}

Next, I created a temporary function:

CREATE TEMPORARY FUNCTION CAP AS 'com.iris.MakeCap';

But when I run

SELECT CAP('hello');

I get the following error:

Error: Error while compiling statement: FAILED: SemanticException [Error 10014]: Line 1:7 Wrong arguments ''hello'': org.apache.hadoop.hive.ql.metadata.HiveException: 
Unable to execute method public org.apache.hadoop.io.Text com.iris.MakeCap.evaluate(org.apache.hadoop.io.Text) 
with arguments {hello}:null (state=42000,code=10014)

I tried using String instead of Text as the argument type for evaluate(), but got the same result. Then I also tried this:

SELECT CAP(e.name) FROM default.emp e;

and got the same error. Could someone help me with this?

Amber

1 Answer


Try replacing Hadoop's Text type with a plain Java String for both the input and the return type; for a UDF class that works fine. If you want to stick with Text, I think you need to initialize your private variable t, e.g.

private final transient Text t = new Text();

Here is example code for a Hive UDF.
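
As a minimal sketch (assuming the com.iris package implied by your CREATE TEMPORARY FUNCTION statement and keeping the same old-style UDF API from your question), the corrected class could look like this:

package com.iris;

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

public class MakeCap extends UDF {
  // the field is initialized up front, so evaluate() no longer dereferences null
  private final Text t = new Text();

  public Text evaluate(Text input) {
    if (input == null) {
      t.set("Invalid input");
    } else {
      t.set(input.toString().toUpperCase());
    }
    return t;
  }
}

With plain String the method shrinks to public String evaluate(String input) that simply returns input.toUpperCase() (or null for null input), and no reusable Text field is needed. After rebuilding the jar and re-creating the temporary function, SELECT CAP('hello'); should return HELLO.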

serge_k