
Example 21 with UDFArgumentTypeException

Use of org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException in project hive by apache.

From the class GenericUDFDateFormat, method initialize:

@Override
public ObjectInspector initialize(ObjectInspector[] arguments) throws UDFArgumentException {
    checkArgsSize(arguments, 2, 2);
    checkArgPrimitive(arguments, 0);
    checkArgPrimitive(arguments, 1);
    // the function should support both short date and full timestamp format
    // time part of the timestamp should not be skipped
    checkArgGroups(arguments, 0, tsInputTypes, STRING_GROUP, DATE_GROUP);
    checkArgGroups(arguments, 0, dtInputTypes, STRING_GROUP, DATE_GROUP);
    checkArgGroups(arguments, 1, tsInputTypes, STRING_GROUP);
    obtainTimestampConverter(arguments, 0, tsInputTypes, tsConverters);
    obtainDateConverter(arguments, 0, dtInputTypes, dtConverters);
    if (arguments[1] instanceof ConstantObjectInspector) {
        String fmtStr = getConstantStringValue(arguments, 1);
        if (fmtStr != null) {
            try {
                formatter = new SimpleDateFormat(fmtStr);
            } catch (IllegalArgumentException e) {
                // ignore an invalid format pattern; the formatter stays unset
            }
        }
    } else {
        throw new UDFArgumentTypeException(1, getFuncName() + " only takes constant as " + getArgOrder(1) + " argument");
    }
    ObjectInspector outputOI = PrimitiveObjectInspectorFactory.writableStringObjectInspector;
    return outputOI;
}
Also used: ConstantObjectInspector(org.apache.hadoop.hive.serde2.objectinspector.ConstantObjectInspector), ObjectInspector(org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector), UDFArgumentTypeException(org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException), SimpleDateFormat(java.text.SimpleDateFormat)
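The try/catch around the SimpleDateFormat constructor above relies on the JDK rejecting invalid pattern strings at construction time with an IllegalArgumentException. A minimal, Hive-free sketch of that check (the class name FormatArgCheck and the helper tryPattern are hypothetical, not from Hive):

```java
import java.text.SimpleDateFormat;

public class FormatArgCheck {
    // Mirrors the constant-format handling above: a valid pattern yields a
    // formatter; an invalid one is swallowed and the formatter stays unset.
    static SimpleDateFormat tryPattern(String fmtStr) {
        try {
            return new SimpleDateFormat(fmtStr);
        } catch (IllegalArgumentException e) {
            // e.g. an unquoted letter that is not a pattern character
            return null;
        }
    }

    public static void main(String[] args) {
        System.out.println(tryPattern("yyyy-MM-dd") != null); // valid pattern
        System.out.println(tryPattern("p") != null);          // illegal pattern letter
    }
}
```

Because the pattern is a constant argument, this validation can happen once in initialize rather than per row.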

Example 22 with UDFArgumentTypeException

Use of org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException in project hive by apache.

From the class GenericUDFDecode, method initialize:

@Override
public ObjectInspector initialize(ObjectInspector[] arguments) throws UDFArgumentException {
    if (arguments.length != 2) {
        throw new UDFArgumentLengthException("Decode() requires exactly two arguments");
    }
    if (arguments[0].getCategory() != Category.PRIMITIVE) {
        throw new UDFArgumentTypeException(0, "The first argument to Decode() must be primitive");
    }
    PrimitiveCategory category = ((PrimitiveObjectInspector) arguments[0]).getPrimitiveCategory();
    if (category == PrimitiveCategory.BINARY) {
        bytesOI = (BinaryObjectInspector) arguments[0];
    } else if (category == PrimitiveCategory.VOID) {
        bytesOI = (VoidObjectInspector) arguments[0];
    } else {
        throw new UDFArgumentTypeException(0, "The first argument to Decode() must be binary");
    }
    if (arguments[1].getCategory() != Category.PRIMITIVE) {
        throw new UDFArgumentTypeException(1, "The second argument to Decode() must be primitive");
    }
    charsetOI = (PrimitiveObjectInspector) arguments[1];
    if (PrimitiveGrouping.STRING_GROUP != PrimitiveObjectInspectorUtils.getPrimitiveGrouping(charsetOI.getPrimitiveCategory())) {
        throw new UDFArgumentTypeException(1, "The second argument to Decode() must be from string group");
    }
    // If the character set for decoding is constant, we can optimize that
    if (arguments[1] instanceof ConstantObjectInspector) {
        String charSetName = ((ConstantObjectInspector) arguments[1]).getWritableConstantValue().toString();
        decoder = Charset.forName(charSetName).newDecoder().onMalformedInput(CodingErrorAction.REPORT).onUnmappableCharacter(CodingErrorAction.REPORT);
    }
    return PrimitiveObjectInspectorFactory.javaStringObjectInspector;
}
Also used: UDFArgumentLengthException(org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException), UDFArgumentTypeException(org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException), PrimitiveObjectInspector(org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector), VoidObjectInspector(org.apache.hadoop.hive.serde2.objectinspector.primitive.VoidObjectInspector), ConstantObjectInspector(org.apache.hadoop.hive.serde2.objectinspector.ConstantObjectInspector), PrimitiveCategory(org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector.PrimitiveCategory)
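The decoder configured above uses CodingErrorAction.REPORT on both malformed input and unmappable characters, so bad bytes raise a CharacterCodingException instead of being silently replaced with U+FFFD. A self-contained sketch of that behavior using only the JDK (the class name StrictDecodeSketch and the helper strictDecode are illustrative, not Hive names):

```java
import java.nio.ByteBuffer;
import java.nio.charset.CharacterCodingException;
import java.nio.charset.Charset;
import java.nio.charset.CharsetDecoder;
import java.nio.charset.CodingErrorAction;

public class StrictDecodeSketch {
    // Decode bytes strictly: malformed or unmappable input throws a
    // CharacterCodingException rather than producing replacement chars.
    static String strictDecode(byte[] bytes, String charsetName)
            throws CharacterCodingException {
        CharsetDecoder decoder = Charset.forName(charsetName)
                .newDecoder()
                .onMalformedInput(CodingErrorAction.REPORT)
                .onUnmappableCharacter(CodingErrorAction.REPORT);
        return decoder.decode(ByteBuffer.wrap(bytes)).toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(strictDecode("hive".getBytes("UTF-8"), "UTF-8"));
        try {
            // a lone continuation byte is malformed UTF-8
            strictDecode(new byte[] { (byte) 0x80 }, "UTF-8");
        } catch (CharacterCodingException e) {
            System.out.println("malformed input rejected");
        }
    }
}
```

Note that String(byte[], Charset), by contrast, uses CodingErrorAction.REPLACE and never throws, which is why the UDF builds an explicit CharsetDecoder.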

Example 23 with UDFArgumentTypeException

Use of org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException in project hive by apache.

From the class GenericUDFBaseNumeric, method initialize:

@Override
public ObjectInspector initialize(ObjectInspector[] arguments) throws UDFArgumentException {
    if (arguments.length != 2) {
        throw new UDFArgumentException(opName + " requires two arguments.");
    }
    for (int i = 0; i < 2; i++) {
        Category category = arguments[i].getCategory();
        if (category != Category.PRIMITIVE) {
            throw new UDFArgumentTypeException(i, "The " + GenericUDFUtils.getOrdinal(i + 1) + " argument of " + opName + " is expected to be a " + Category.PRIMITIVE.toString().toLowerCase() + " type, but " + category.toString().toLowerCase() + " was found");
        }
    }
    // Look up the compatibility level from the session conf only once, so that
    // we have access to these values in the map/reduce tasks.
    if (confLookupNeeded) {
        CompatLevel compatLevel = HiveCompat.getCompatLevel(SessionState.get().getConf());
        ansiSqlArithmetic = compatLevel.ordinal() > CompatLevel.HIVE_0_12.ordinal();
        confLookupNeeded = false;
    }
    leftOI = (PrimitiveObjectInspector) arguments[0];
    rightOI = (PrimitiveObjectInspector) arguments[1];
    resultOI = PrimitiveObjectInspectorFactory.getPrimitiveWritableObjectInspector(deriveResultTypeInfo());
    converterLeft = ObjectInspectorConverters.getConverter(leftOI, resultOI);
    converterRight = ObjectInspectorConverters.getConverter(rightOI, resultOI);
    return resultOI;
}
Also used: UDFArgumentException(org.apache.hadoop.hive.ql.exec.UDFArgumentException), PrimitiveCategory(org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector.PrimitiveCategory), Category(org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector.Category), CompatLevel(org.apache.hive.common.HiveCompat.CompatLevel), UDFArgumentTypeException(org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException)

Example 24 with UDFArgumentTypeException

Use of org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException in project hive by apache.

From the class GenericUDAFContextNGrams, method getEvaluator:

@Override
public GenericUDAFEvaluator getEvaluator(TypeInfo[] parameters) throws SemanticException {
    if (parameters.length != 3 && parameters.length != 4) {
        throw new UDFArgumentTypeException(parameters.length - 1, "Please specify either three or four arguments.");
    }
    // Validate the first parameter, which is the expression to compute over. This should be an
    // array of strings type, or an array of arrays of strings.
    PrimitiveTypeInfo pti;
    if (parameters[0].getCategory() != ObjectInspector.Category.LIST) {
        throw new UDFArgumentTypeException(0, "Only list type arguments are accepted but " + parameters[0].getTypeName() + " was passed as parameter 1.");
    }
    switch(((ListTypeInfo) parameters[0]).getListElementTypeInfo().getCategory()) {
        case PRIMITIVE:
            // Parameter 1 was an array of primitives, so make sure the primitives are strings.
            pti = (PrimitiveTypeInfo) ((ListTypeInfo) parameters[0]).getListElementTypeInfo();
            break;
        case LIST:
            // Parameter 1 was an array of arrays, so make sure that the inner arrays contain
            // primitive strings.
            ListTypeInfo lti = (ListTypeInfo) ((ListTypeInfo) parameters[0]).getListElementTypeInfo();
            pti = (PrimitiveTypeInfo) lti.getListElementTypeInfo();
            break;
        default:
            throw new UDFArgumentTypeException(0, "Only arrays of strings or arrays of arrays of strings are accepted but " + parameters[0].getTypeName() + " was passed as parameter 1.");
    }
    if (pti.getPrimitiveCategory() != PrimitiveObjectInspector.PrimitiveCategory.STRING) {
        throw new UDFArgumentTypeException(0, "Only array<string> or array<array<string>> is allowed, but " + parameters[0].getTypeName() + " was passed as parameter 1.");
    }
    // Validate the second parameter, which should be an array of strings
    if (parameters[1].getCategory() != ObjectInspector.Category.LIST || ((ListTypeInfo) parameters[1]).getListElementTypeInfo().getCategory() != ObjectInspector.Category.PRIMITIVE) {
        throw new UDFArgumentTypeException(1, "Only arrays of strings are accepted but " + parameters[1].getTypeName() + " was passed as parameter 2.");
    }
    if (((PrimitiveTypeInfo) ((ListTypeInfo) parameters[1]).getListElementTypeInfo()).getPrimitiveCategory() != PrimitiveObjectInspector.PrimitiveCategory.STRING) {
        throw new UDFArgumentTypeException(1, "Only arrays of strings are accepted but " + parameters[1].getTypeName() + " was passed as parameter 2.");
    }
    // Validate the third parameter, which should be an integer to represent 'k'
    if (parameters[2].getCategory() != ObjectInspector.Category.PRIMITIVE) {
        throw new UDFArgumentTypeException(2, "Only integers are accepted but " + parameters[2].getTypeName() + " was passed as parameter 3.");
    }
    switch(((PrimitiveTypeInfo) parameters[2]).getPrimitiveCategory()) {
        case BYTE:
        case SHORT:
        case INT:
        case LONG:
        case TIMESTAMP:
            break;
        default:
            throw new UDFArgumentTypeException(2, "Only integers are accepted but " + parameters[2].getTypeName() + " was passed as parameter 3.");
    }
    // If the optional fourth parameter is present, validate that it is also
    // an integer.
    if (parameters.length == 4) {
        if (parameters[3].getCategory() != ObjectInspector.Category.PRIMITIVE) {
            throw new UDFArgumentTypeException(3, "Only integers are accepted but " + parameters[3].getTypeName() + " was passed as parameter 4.");
        }
        switch(((PrimitiveTypeInfo) parameters[3]).getPrimitiveCategory()) {
            case BYTE:
            case SHORT:
            case INT:
            case LONG:
            case TIMESTAMP:
                break;
            default:
                throw new UDFArgumentTypeException(3, "Only integers are accepted but " + parameters[3].getTypeName() + " was passed as parameter 4.");
        }
    }
    return new GenericUDAFContextNGramEvaluator();
}
Also used: ListTypeInfo(org.apache.hadoop.hive.serde2.typeinfo.ListTypeInfo), UDFArgumentTypeException(org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException), PrimitiveTypeInfo(org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo)
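Every check above passes the zero-based position of the offending argument as the first constructor parameter of UDFArgumentTypeException, which lets Hive point the user at the exact bad expression. A dependency-free sketch of the same pattern (ArgTypeException and ArgCheckSketch.checkIntegral are hypothetical stand-ins, not Hive classes):

```java
// Hypothetical stand-in for UDFArgumentTypeException: it records the
// zero-based position of the offending argument alongside the message.
class ArgTypeException extends Exception {
    final int position;

    ArgTypeException(int position, String message) {
        super(message);
        this.position = position;
    }
}

public class ArgCheckSketch {
    // Mirrors the integer-group validation above: reject any parameter
    // whose type is not an integral wrapper, reporting its position.
    static void checkIntegral(Class<?>[] paramTypes, int i) throws ArgTypeException {
        Class<?> t = paramTypes[i];
        if (t != Byte.class && t != Short.class && t != Integer.class && t != Long.class) {
            throw new ArgTypeException(i, "Only integers are accepted but "
                    + t.getSimpleName() + " was passed as parameter " + (i + 1) + ".");
        }
    }
}
```

Carrying the position in the exception, rather than only in the message text, is what allows the Hive planner to attach the error to the right argument in the query.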

Example 25 with UDFArgumentTypeException

Use of org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException in project hive by apache.

From the class GenericUDAFLeadLag, method getEvaluator:

@Override
public GenericUDAFEvaluator getEvaluator(GenericUDAFParameterInfo parameters) throws SemanticException {
    ObjectInspector[] paramOIs = parameters.getParameterObjectInspectors();
    String fNm = functionName();
    if (!(paramOIs.length >= 1 && paramOIs.length <= 3)) {
        throw new UDFArgumentTypeException(paramOIs.length - 1, "Incorrect invocation of " + fNm + ": _FUNC_(expr, amt, default)");
    }
    int amt = 1;
    if (paramOIs.length > 1) {
        ObjectInspector amtOI = paramOIs[1];
        if (!ObjectInspectorUtils.isConstantObjectInspector(amtOI) || (amtOI.getCategory() != ObjectInspector.Category.PRIMITIVE) || ((PrimitiveObjectInspector) amtOI).getPrimitiveCategory() != PrimitiveObjectInspector.PrimitiveCategory.INT) {
            throw new UDFArgumentTypeException(1, fNm + " amount must be an integer value; " + amtOI.getTypeName() + " was passed as parameter 2.");
        }
        Object o = ((ConstantObjectInspector) amtOI).getWritableConstantValue();
        amt = ((IntWritable) o).get();
        if (amt < 0) {
            throw new UDFArgumentTypeException(1, fNm + " amount cannot be negative. Specified: " + amt);
        }
    }
    if (paramOIs.length == 3) {
        ObjectInspectorConverters.getConverter(paramOIs[2], paramOIs[0]);
    }
    GenericUDAFLeadLagEvaluator eval = createLLEvaluator();
    eval.setAmt(amt);
    return eval;
}
Also used: PrimitiveObjectInspector(org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector), ConstantObjectInspector(org.apache.hadoop.hive.serde2.objectinspector.ConstantObjectInspector), ObjectInspector(org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector), UDFArgumentTypeException(org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException)

Aggregations

UDFArgumentTypeException (org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException) — 56
ObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector) — 29
PrimitiveObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector) — 28
UDFArgumentLengthException (org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException) — 20
ConstantObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.ConstantObjectInspector) — 15
UDFArgumentException (org.apache.hadoop.hive.ql.exec.UDFArgumentException) — 14
PrimitiveCategory (org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector.PrimitiveCategory) — 12
Category (org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector.Category) — 8
ListObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.ListObjectInspector) — 5
PrimitiveTypeInfo (org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo) — 5
Converter (org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorConverters.Converter) — 4
ParseException (java.text.ParseException) — 3
ArrayList (java.util.ArrayList) — 3
HiveException (org.apache.hadoop.hive.ql.metadata.HiveException) — 3
MapObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.MapObjectInspector) — 3
StructObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector) — 3
TimestampConverter (org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorConverter.TimestampConverter) — 3
VoidObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.primitive.VoidObjectInspector) — 3
Timestamp (java.sql.Timestamp) — 2
Date (java.util.Date) — 2