
Example 31 with UDFArgumentLengthException

Use of org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException in project hive by apache.

From class GenericUDFTimestamp, method initialize:

@Override
public ObjectInspector initialize(ObjectInspector[] arguments) throws UDFArgumentException {
    if (arguments.length < 1) {
        throw new UDFArgumentLengthException("The function TIMESTAMP requires at least one argument, got " + arguments.length);
    }
    SessionState ss = SessionState.get();
    if (ss != null) {
        intToTimestampInSeconds = ss.getConf().getBoolVar(ConfVars.HIVE_INT_TIMESTAMP_CONVERSION_IN_SECONDS);
    }
    try {
        argumentOI = (PrimitiveObjectInspector) arguments[0];
    } catch (ClassCastException e) {
        throw new UDFArgumentException("The function TIMESTAMP takes only primitive types");
    }
    tc = new TimestampConverter(argumentOI, PrimitiveObjectInspectorFactory.writableTimestampObjectInspector);
    tc.setIntToTimestampInSeconds(intToTimestampInSeconds);
    return PrimitiveObjectInspectorFactory.writableTimestampObjectInspector;
}
Also used : UDFArgumentException(org.apache.hadoop.hive.ql.exec.UDFArgumentException) SessionState(org.apache.hadoop.hive.ql.session.SessionState) UDFArgumentLengthException(org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException) TimestampConverter(org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorConverter.TimestampConverter)
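
For reference, a minimal sketch (hypothetical driver class, assuming Hive's ql and serde2 jars are on the classpath) of how this arity check surfaces when initialize is called with no arguments; the check runs before any SessionState access, so no session setup is needed:

import org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDFTimestamp;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;

public class TimestampArityCheckDemo {
    public static void main(String[] args) throws Exception {
        GenericUDFTimestamp udf = new GenericUDFTimestamp();
        try {
            // Zero arguments fails the length check immediately.
            udf.initialize(new ObjectInspector[0]);
        } catch (UDFArgumentLengthException e) {
            // Expected message: "The function TIMESTAMP requires at least one argument, got 0"
            System.out.println(e.getMessage());
        }
    }
}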

Example 32 with UDFArgumentLengthException

Use of org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException in project hive by apache.

From class GenericUDFToDecimal, method initialize:

@Override
public ObjectInspector initialize(ObjectInspector[] arguments) throws UDFArgumentException {
    if (arguments.length < 1) {
        throw new UDFArgumentLengthException("The function DECIMAL requires at least one argument, got " + arguments.length);
    }
    try {
        argumentOI = (PrimitiveObjectInspector) arguments[0];
    } catch (ClassCastException e) {
        throw new UDFArgumentException("The function DECIMAL takes only primitive types");
    }
    // Check if this UDF has been provided with type params for the output decimal type
    SettableHiveDecimalObjectInspector outputOI = (SettableHiveDecimalObjectInspector) PrimitiveObjectInspectorFactory.getPrimitiveWritableObjectInspector(typeInfo);
    bdConverter = new HiveDecimalConverter(argumentOI, outputOI);
    return outputOI;
}
Also used : UDFArgumentException(org.apache.hadoop.hive.ql.exec.UDFArgumentException) HiveDecimalConverter(org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorConverter.HiveDecimalConverter) UDFArgumentLengthException(org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException) SettableHiveDecimalObjectInspector(org.apache.hadoop.hive.serde2.objectinspector.primitive.SettableHiveDecimalObjectInspector)

Example 33 with UDFArgumentLengthException

Use of org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException in project hive by apache.

From class GenericUDFSQCountCheck, method initialize:

@Override
public ObjectInspector initialize(ObjectInspector[] arguments) throws UDFArgumentException {
    if (arguments.length > 2) {
        throw new UDFArgumentLengthException("Invalid scalar subquery expression. Subquery count check expected two argument but received: " + arguments.length);
    }
    converters[0] = ObjectInspectorConverters.getConverter(arguments[0], PrimitiveObjectInspectorFactory.writableLongObjectInspector);
    ObjectInspector outputOI = PrimitiveObjectInspectorFactory.writableLongObjectInspector;
    return outputOI;
}
Also used : ObjectInspector(org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector) UDFArgumentLengthException(org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException)

Example 34 with UDFArgumentLengthException

Use of org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException in project hive by apache.

From class GenericUDFSize, method initialize:

@Override
public ObjectInspector initialize(ObjectInspector[] arguments) throws UDFArgumentException {
    if (arguments.length != 1) {
        throw new UDFArgumentLengthException("The function SIZE only accepts 1 argument.");
    }
    Category category = arguments[0].getCategory();
    String typeName = arguments[0].getTypeName();
    if (category != Category.MAP && category != Category.LIST && !typeName.equals(serdeConstants.VOID_TYPE_NAME)) {
        throw new UDFArgumentTypeException(0, "\"" + Category.MAP.toString().toLowerCase() + "\" or \"" + Category.LIST.toString().toLowerCase() + "\" is expected at function SIZE, " + "but \"" + arguments[0].getTypeName() + "\" is found");
    }
    returnOI = arguments[0];
    return PrimitiveObjectInspectorFactory.writableIntObjectInspector;
}
Also used : Category(org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector.Category) UDFArgumentLengthException(org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException) UDFArgumentTypeException(org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException)
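
A minimal sketch (hypothetical driver class, assuming Hive's serde2 jar is on the classpath) of exercising this check: a list&lt;string&gt; argument passes the category test and yields a writable int output inspector, while a primitive argument would trip the UDFArgumentTypeException above:

import org.apache.hadoop.hive.ql.udf.generic.GenericUDFSize;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorFactory;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;

public class SizeInitializeDemo {
    public static void main(String[] args) throws Exception {
        GenericUDFSize size = new GenericUDFSize();
        // list<string> satisfies the MAP-or-LIST category check.
        ObjectInspector listOI = ObjectInspectorFactory.getStandardListObjectInspector(
                PrimitiveObjectInspectorFactory.javaStringObjectInspector);
        ObjectInspector out = size.initialize(new ObjectInspector[] { listOI });
        // Expected to print "int", the writable int inspector returned above.
        System.out.println(out.getTypeName());
    }
}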

Example 35 with UDFArgumentLengthException

Use of org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException in project hive by apache.

From class GenericUDFAbs, method initialize:

@Override
public ObjectInspector initialize(ObjectInspector[] arguments) throws UDFArgumentException {
    if (arguments.length != 1) {
        throw new UDFArgumentLengthException("ABS() requires 1 argument, got " + arguments.length);
    }
    if (arguments[0].getCategory() != Category.PRIMITIVE) {
        throw new UDFArgumentException("ABS only takes primitive types, got " + arguments[0].getTypeName());
    }
    argumentOI = (PrimitiveObjectInspector) arguments[0];
    inputType = argumentOI.getPrimitiveCategory();
    ObjectInspector outputOI = null;
    switch(inputType) {
        case SHORT:
        case BYTE:
        case INT:
            inputConverter = ObjectInspectorConverters.getConverter(arguments[0], PrimitiveObjectInspectorFactory.writableIntObjectInspector);
            outputOI = PrimitiveObjectInspectorFactory.writableIntObjectInspector;
            break;
        case LONG:
            inputConverter = ObjectInspectorConverters.getConverter(arguments[0], PrimitiveObjectInspectorFactory.writableLongObjectInspector);
            outputOI = PrimitiveObjectInspectorFactory.writableLongObjectInspector;
            break;
        case FLOAT:
        case STRING:
        case DOUBLE:
            inputConverter = ObjectInspectorConverters.getConverter(arguments[0], PrimitiveObjectInspectorFactory.writableDoubleObjectInspector);
            outputOI = PrimitiveObjectInspectorFactory.writableDoubleObjectInspector;
            break;
        case DECIMAL:
            outputOI = PrimitiveObjectInspectorFactory.getPrimitiveWritableObjectInspector(((PrimitiveObjectInspector) arguments[0]).getTypeInfo());
            inputConverter = ObjectInspectorConverters.getConverter(arguments[0], outputOI);
            break;
        default:
            throw new UDFArgumentException("ABS only takes SHORT/BYTE/INT/LONG/DOUBLE/FLOAT/STRING/DECIMAL types, got " + inputType);
    }
    return outputOI;
}
Also used : UDFArgumentException(org.apache.hadoop.hive.ql.exec.UDFArgumentException) HiveDecimalObjectInspector(org.apache.hadoop.hive.serde2.objectinspector.primitive.HiveDecimalObjectInspector) PrimitiveObjectInspector(org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector) ObjectInspector(org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector) UDFArgumentLengthException(org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException) PrimitiveObjectInspector(org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector)
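
Taken together, these initialize methods follow one pattern: check the argument count first and report mismatches with UDFArgumentLengthException, then check argument categories or types (UDFArgumentTypeException / UDFArgumentException), and only then set up converters and return the output ObjectInspector. A minimal sketch of that pattern in a hypothetical custom UDF (example_negate is not part of Hive; it simply negates an int):

import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
import org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException;
import org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException;
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDF;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector.Category;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorConverters;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorConverters.Converter;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;
import org.apache.hadoop.io.IntWritable;

// Hypothetical example UDF, not from the Hive codebase.
public class GenericUDFExampleNegate extends GenericUDF {
    private transient Converter inputConverter;
    private final IntWritable result = new IntWritable();

    @Override
    public ObjectInspector initialize(ObjectInspector[] arguments) throws UDFArgumentException {
        // 1. Arity check: a wrong argument count is a UDFArgumentLengthException.
        if (arguments.length != 1) {
            throw new UDFArgumentLengthException(
                "example_negate() requires 1 argument, got " + arguments.length);
        }
        // 2. Category/type check: a wrong argument type is a UDFArgumentTypeException.
        if (arguments[0].getCategory() != Category.PRIMITIVE) {
            throw new UDFArgumentTypeException(0,
                "example_negate() only takes primitive types, got " + arguments[0].getTypeName());
        }
        // 3. Set up converters and return the output ObjectInspector.
        inputConverter = ObjectInspectorConverters.getConverter(arguments[0],
            PrimitiveObjectInspectorFactory.writableIntObjectInspector);
        return PrimitiveObjectInspectorFactory.writableIntObjectInspector;
    }

    @Override
    public Object evaluate(DeferredObject[] arguments) throws HiveException {
        Object converted = inputConverter.convert(arguments[0].get());
        if (converted == null) {
            return null;
        }
        result.set(-((IntWritable) converted).get());
        return result;
    }

    @Override
    public String getDisplayString(String[] children) {
        return "example_negate(" + children[0] + ")";
    }
}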

Aggregations

UDFArgumentLengthException (org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException) 47
ObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector) 29
PrimitiveObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector) 26
UDFArgumentException (org.apache.hadoop.hive.ql.exec.UDFArgumentException) 21
UDFArgumentTypeException (org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException) 21
ConstantObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.ConstantObjectInspector) 13
PrimitiveCategory (org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector.PrimitiveCategory) 6
PrimitiveObjectInspectorConverter (org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorConverter) 5
TimestampConverter (org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorConverter.TimestampConverter) 5
ListObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.ListObjectInspector) 4
MapObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.MapObjectInspector) 4
StructObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector) 4
HiveException (org.apache.hadoop.hive.ql.metadata.HiveException) 3
StringObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.primitive.StringObjectInspector) 3
ParseException (java.text.ParseException) 2
ArrayList (java.util.ArrayList) 2
DateWritable (org.apache.hadoop.hive.serde2.io.DateWritable) 2
ObjectInspectorConverters (org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorConverters) 2
HiveDecimalObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.primitive.HiveDecimalObjectInspector) 2
StringConverter (org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorConverter.StringConverter) 2