
I have written a UDF that decodes a cookie and returns a list of strings. Unfortunately, when the UDF runs in Hive I get a "Hive Runtime Error while processing row" - a Java String ClassCastException.

Here is my code:

@Override 
public ObjectInspector initialize(ObjectInspector[] input) throws UDFArgumentException { 

    ObjectInspector cookieContent = input[0]; 
    if (!(isStringOI(cookieContent))){ 
     throw new UDFArgumentException("only string"); 
    } 
    this.cookieValue = (StringObjectInspector) cookieContent; 
    return ObjectInspectorFactory.getStandardListObjectInspector(
        PrimitiveObjectInspectorFactory.javaStringObjectInspector);
} 


public Object evaluate(DeferredObject[] input) throws HiveException { 

    String encoded = cookieValue.getPrimitiveJavaObject(input[0].get()); 
    try { 
     result = decode(encoded); 
    } catch (CodeException e) { 
     throw new UDFArgumentException(); 
    } 

    return result; 
} 
public List<String> decode(String encoded) throws CodeException { 

    decodedBase64 = Base64.decodeBase64(encoded); 
    String decompressedArray = new String(getKadrs(decodedBase64)); 
    String kadr= decompressedArray.substring(decompressedArray.indexOf("|") + 1); 
    List<String> kadrsList = new ArrayList<>(Arrays.asList(kadr.split(",")));
    return kadrsList; 
} 

private byte[] getKadrs(byte[] compressed) throws CodeException { 
    Inflater decompressor = new Inflater(); 
    decompressor.setInput(compressed); 
    ByteArrayOutputStream outPutStream = new ByteArrayOutputStream(compressed.length); 
    byte[] temp = new byte[1024];
    while (!decompressor.finished()) { 
     try { 
      int count = decompressor.inflate(temp); 
      outPutStream.write(temp, 0, count); 
     } 
     catch (DataFormatException e) { 
      throw new CodeException ("Wrong data format", e); 
     } 
    } 
    try { 
     outPutStream.close(); 
    } catch (IOException e) { 
     throw new CodeException ("Cant close outPutStream ", e); 
    } 
    return outPutStream.toByteArray(); 
} 

Its result is, say:

"kadr1,kadr20,kadr35,kadr12". The unit tests work fine, but when I try to use this function in Hive I get this:

Caused by: java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.hadoop.io.Text 
  at org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableStringObjectInspector.getPrimitiveWritableObject(WritableStringObjectInspector.java:41) 

It is hard for me to debug because someone else has to run my jar for me to see the results, so any advice would be appreciated.
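
The original unit tests are not shown; the snippet below is only a hypothetical sketch of the kind of test that passes here (assuming JUnit 4, commons-codec, and a class name CookieDecoder standing in for the UDF above). A test like this exercises only decode(); it never goes through the ObjectInspector that Hive later uses to read the returned list, which is one reason a mismatch there would show up only inside Hive.

import static org.junit.Assert.assertEquals;

import java.io.ByteArrayOutputStream;
import java.util.Arrays;
import java.util.zip.Deflater;

import org.apache.commons.codec.binary.Base64;
import org.junit.Test;

public class CookieDecoderTest {

    // CookieDecoder stands in for the UDF class from the question (its real name is not shown)
    @Test
    public void decodeReturnsListOfKadrs() throws Exception {
        CookieDecoder udf = new CookieDecoder();
        String encodedCookie = encodeTestCookie("header|kadr1,kadr20,kadr35,kadr12");
        assertEquals(Arrays.asList("kadr1", "kadr20", "kadr35", "kadr12"),
                udf.decode(encodedCookie));
    }

    // Builds a cookie the way the UDF expects to find it: deflate-compressed, then Base64-encoded
    private static String encodeTestCookie(String plain) {
        Deflater deflater = new Deflater();
        deflater.setInput(plain.getBytes());
        deflater.finish();
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[1024];
        while (!deflater.finished()) {
            out.write(buf, 0, deflater.deflate(buf));
        }
        return Base64.encodeBase64String(out.toByteArray());
    }
}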

Comments:

I suspect input[0].get() returns a Text and you are casting it to a String.

@ravindra - I think you have that backwards, since the error says a String cannot be cast to Text.

Yes, you are right. That was my mistake.

Answers


Your evaluate method currently returns a String, which is not a Hadoop data type. You should wrap the string in a Text object, i.e. return new Text(result).
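
Since evaluate in this question returns a list of strings rather than a single string, a sketch of that approach would wrap each element. This is only a sketch, assuming initialize() keeps writableStringObjectInspector for the list elements and reusing the cookieValue field and decode() method from the question:

// Inside the GenericUDF subclass from the question, with initialize() returning
// getStandardListObjectInspector(PrimitiveObjectInspectorFactory.writableStringObjectInspector):
public Object evaluate(DeferredObject[] input) throws HiveException {
    String encoded = cookieValue.getPrimitiveJavaObject(input[0].get());
    List<org.apache.hadoop.io.Text> wrapped = new ArrayList<>();
    try {
        for (String s : decode(encoded)) {
            // wrap each Java String in a Hadoop Text so it matches the
            // writable element inspector Hive uses to read the list
            wrapped.add(new org.apache.hadoop.io.Text(s));
        }
    } catch (CodeException e) {
        throw new UDFArgumentException("could not decode cookie");
    }
    return wrapped;
}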


Ravindra was right.

In my initialize I had

    return ObjectInspectorFactory.getStandardListObjectInspector(PrimitiveObjectInspectorFactory.writableStringObjectInspector);

and WritableStringObjectInspector returns Text.

I changed it to javaStringObjectInspector, which returns String, and now everything works fine. Thanks!
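
For reference, a minimal sketch of the matching pair this ends up as: the element inspector declared in initialize() and the element type returned by evaluate() must agree (field and helper names are taken from the question):

@Override
public ObjectInspector initialize(ObjectInspector[] input) throws UDFArgumentException {
    ObjectInspector cookieContent = input[0];
    if (!(cookieContent instanceof StringObjectInspector)) {
        throw new UDFArgumentException("only string");
    }
    this.cookieValue = (StringObjectInspector) cookieContent;
    // javaStringObjectInspector: the list elements handed back by evaluate()
    // are plain java.lang.String objects
    return ObjectInspectorFactory.getStandardListObjectInspector(
            PrimitiveObjectInspectorFactory.javaStringObjectInspector);
}

public Object evaluate(DeferredObject[] input) throws HiveException {
    String encoded = cookieValue.getPrimitiveJavaObject(input[0].get());
    try {
        // decode() returns a List<String>, which matches the declared element inspector
        return decode(encoded);
    } catch (CodeException e) {
        throw new UDFArgumentException("could not decode cookie");
    }
}

Keeping writableStringObjectInspector instead would require evaluate() to return a list of Text objects, as in the other answer.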