Example 16 with UDFArgumentTypeException

Use of org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException in project hive by apache.

From the class GenericUDFSortArrayByField, the method initialize:

@Override
public ObjectInspector initialize(ObjectInspector[] arguments) throws UDFArgumentException {
    GenericUDFUtils.ReturnObjectInspectorResolver returnOIResolver = new GenericUDFUtils.ReturnObjectInspectorResolver(true);
    // This UDF requires at least 2 arguments: the array, then one or more field names.
    if (arguments.length < 2) {
        throw new UDFArgumentLengthException("SORT_ARRAY_BY requires minimum 2 arguments, got " + arguments.length);
    }
    // The first argument must be an array.
    switch(arguments[0].getCategory()) {
        case LIST:
            listObjectInspector = (ListObjectInspector) arguments[0];
            break;
        default:
            throw new UDFArgumentTypeException(0, "Argument 1 of function SORT_ARRAY_BY must be " + serdeConstants.LIST_TYPE_NAME + ", but " + arguments[0].getTypeName() + " was found.");
    }
    // Elements of the first argument (the array) must be structs (tuples).
    switch(listObjectInspector.getListElementObjectInspector().getCategory()) {
        case STRUCT:
            structObjectInspector = (StructObjectInspector) listObjectInspector.getListElementObjectInspector();
            break;
        default:
            throw new UDFArgumentTypeException(0, "Element[s] of first argument array in function SORT_ARRAY_BY must be " + serdeConstants.STRUCT_TYPE_NAME + ", but " + listObjectInspector.getTypeName() + " was found.");
    }
    // All field-name arguments, and the optional sort-order argument, must be strings.
    converters = new Converter[arguments.length];
    inputTypes = new PrimitiveCategory[arguments.length];
    fields = new StructField[arguments.length - 1];
    noOfInputFields = arguments.length - 1;
    for (int i = 1; i < arguments.length; i++) {
        checkArgPrimitive(arguments, i);
        checkArgGroups(arguments, i, inputTypes, PrimitiveGrouping.STRING_GROUP);
        if (arguments[i] instanceof ConstantObjectInspector) {
            String fieldName = getConstantStringValue(arguments, i);
            // Check whether a sort order (ASC, DESC) was specified as the last argument.
            if (i != 1 && (i == arguments.length - 1) && (fieldName.trim().toUpperCase().equals(SORT_ORDER_TYPE.ASC.name()) || fieldName.trim().toUpperCase().equals(SORT_ORDER_TYPE.DESC.name()))) {
                sortOrder = SORT_ORDER_TYPE.valueOf(fieldName.trim().toUpperCase());
                noOfInputFields -= 1;
                continue;
            }
            fields[i - 1] = structObjectInspector.getStructFieldRef(getConstantStringValue(arguments, i));
        }
        obtainStringConverter(arguments, i, inputTypes, converters);
    }
    ObjectInspector returnOI = returnOIResolver.get(structObjectInspector);
    converters[0] = ObjectInspectorConverters.getConverter(structObjectInspector, returnOI);
    return ObjectInspectorFactory.getStandardListObjectInspector(structObjectInspector);
}
Also used: ListObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.ListObjectInspector), StructObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector), ConstantObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.ConstantObjectInspector), ObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector), UDFArgumentLengthException (org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException), UDFArgumentTypeException (org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException)
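
As a usage sketch, the type check above can be exercised by handing initialize a primitive first argument. This harness is not from the Hive sources: the class name SortArrayByTypeCheckSketch is invented, and the UDF's package is assumed to be org.apache.hadoop.hive.ql.udf.generic.

import org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDFSortArrayByField;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;

public class SortArrayByTypeCheckSketch {
    public static void main(String[] args) throws Exception {
        GenericUDFSortArrayByField udf = new GenericUDFSortArrayByField();
        // Both arguments are strings; the first should have been array<struct<...>>.
        ObjectInspector[] ois = {
            PrimitiveObjectInspectorFactory.javaStringObjectInspector,
            PrimitiveObjectInspectorFactory.javaStringObjectInspector
        };
        try {
            udf.initialize(ois);
        } catch (UDFArgumentTypeException e) {
            // The switch on arguments[0].getCategory() hit the default branch.
            System.out.println(e.getMessage());
        }
    }
}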

Example 17 with UDFArgumentTypeException

Use of org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException in project hive by apache.

From the class GenericUDFMapValues, the method initialize:

@Override
public ObjectInspector initialize(ObjectInspector[] arguments) throws UDFArgumentException {
    if (arguments.length != 1) {
        throw new UDFArgumentLengthException("The function MAP_VALUES only accepts 1 argument.");
    } else if (!(arguments[0] instanceof MapObjectInspector)) {
        throw new UDFArgumentTypeException(0, "\"" + Category.MAP.toString().toLowerCase() + "\" is expected at function MAP_VALUES, " + "but \"" + arguments[0].getTypeName() + "\" is found");
    }
    mapOI = (MapObjectInspector) arguments[0];
    ObjectInspector mapValueOI = mapOI.getMapValueObjectInspector();
    return ObjectInspectorFactory.getStandardListObjectInspector(mapValueOI);
}
Also used: MapObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.MapObjectInspector), ObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector), UDFArgumentLengthException (org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException), UDFArgumentTypeException (org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException)
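
For the happy path, a small sketch (the harness class name is invented; the UDF package is assumed) showing that a map<string,int> input yields an array<int> return inspector:

import org.apache.hadoop.hive.ql.udf.generic.GenericUDFMapValues;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorFactory;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;

public class MapValuesInitSketch {
    public static void main(String[] args) throws Exception {
        GenericUDFMapValues udf = new GenericUDFMapValues();
        // map<string,int> input; initialize should return a list over the value inspector.
        ObjectInspector mapOI = ObjectInspectorFactory.getStandardMapObjectInspector(
            PrimitiveObjectInspectorFactory.javaStringObjectInspector,
            PrimitiveObjectInspectorFactory.javaIntObjectInspector);
        ObjectInspector listOI = udf.initialize(new ObjectInspector[] { mapOI });
        System.out.println(listOI.getTypeName()); // expected: array<int>
    }
}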

Example 18 with UDFArgumentTypeException

Use of org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException in project hive by apache.

From the class GenericUDFOPDTIPlus, the method initialize:

@Override
public ObjectInspector initialize(ObjectInspector[] arguments) throws UDFArgumentException {
    if (arguments.length != 2) {
        throw new UDFArgumentException(opName + " requires two arguments.");
    }
    PrimitiveObjectInspector resultOI = null;
    for (int i = 0; i < 2; i++) {
        Category category = arguments[i].getCategory();
        if (category != Category.PRIMITIVE) {
            throw new UDFArgumentTypeException(i, "The " + GenericUDFUtils.getOrdinal(i + 1) + " argument of " + opName + " is expected to be a " + Category.PRIMITIVE.toString().toLowerCase() + " type, but " + category.toString().toLowerCase() + " is found");
        }
    }
    inputOIs = new PrimitiveObjectInspector[] { (PrimitiveObjectInspector) arguments[0], (PrimitiveObjectInspector) arguments[1] };
    PrimitiveObjectInspector leftOI = inputOIs[0];
    PrimitiveObjectInspector rightOI = inputOIs[1];
    // Resolve the operation type and result inspector from the operand type pairing (operands are reversible).
    if (checkArgs(PrimitiveCategory.INTERVAL_YEAR_MONTH, PrimitiveCategory.INTERVAL_YEAR_MONTH)) {
        plusOpType = OperationType.INTERVALYM_PLUS_INTERVALYM;
        intervalArg1Idx = 0;
        intervalArg2Idx = 1;
        resultOI = PrimitiveObjectInspectorFactory.getPrimitiveWritableObjectInspector(TypeInfoFactory.intervalYearMonthTypeInfo);
    } else if (checkArgs(PrimitiveCategory.DATE, PrimitiveCategory.INTERVAL_YEAR_MONTH)) {
        plusOpType = OperationType.INTERVALYM_PLUS_DATE;
        dtArgIdx = 0;
        intervalArg1Idx = 1;
        resultOI = PrimitiveObjectInspectorFactory.getPrimitiveWritableObjectInspector(TypeInfoFactory.dateTypeInfo);
    } else if (checkArgs(PrimitiveCategory.INTERVAL_YEAR_MONTH, PrimitiveCategory.DATE)) {
        plusOpType = OperationType.INTERVALYM_PLUS_DATE;
        intervalArg1Idx = 0;
        dtArgIdx = 1;
        resultOI = PrimitiveObjectInspectorFactory.getPrimitiveWritableObjectInspector(TypeInfoFactory.dateTypeInfo);
    } else if (checkArgs(PrimitiveCategory.TIMESTAMP, PrimitiveCategory.INTERVAL_YEAR_MONTH)) {
        plusOpType = OperationType.INTERVALYM_PLUS_TIMESTAMP;
        dtArgIdx = 0;
        intervalArg1Idx = 1;
        resultOI = PrimitiveObjectInspectorFactory.getPrimitiveWritableObjectInspector(TypeInfoFactory.timestampTypeInfo);
    } else if (checkArgs(PrimitiveCategory.INTERVAL_YEAR_MONTH, PrimitiveCategory.TIMESTAMP)) {
        plusOpType = OperationType.INTERVALYM_PLUS_TIMESTAMP;
        intervalArg1Idx = 0;
        dtArgIdx = 1;
        resultOI = PrimitiveObjectInspectorFactory.getPrimitiveWritableObjectInspector(TypeInfoFactory.timestampTypeInfo);
    } else if (checkArgs(PrimitiveCategory.INTERVAL_DAY_TIME, PrimitiveCategory.INTERVAL_DAY_TIME)) {
        plusOpType = OperationType.INTERVALDT_PLUS_INTERVALDT;
        intervalArg1Idx = 0;
        intervalArg2Idx = 1;
        resultOI = PrimitiveObjectInspectorFactory.getPrimitiveWritableObjectInspector(TypeInfoFactory.intervalDayTimeTypeInfo);
    } else if (checkArgs(PrimitiveCategory.INTERVAL_DAY_TIME, PrimitiveCategory.DATE) || checkArgs(PrimitiveCategory.INTERVAL_DAY_TIME, PrimitiveCategory.TIMESTAMP)) {
        plusOpType = OperationType.INTERVALDT_PLUS_TIMESTAMP;
        intervalArg1Idx = 0;
        dtArgIdx = 1;
        resultOI = PrimitiveObjectInspectorFactory.getPrimitiveWritableObjectInspector(TypeInfoFactory.timestampTypeInfo);
        dtConverter = ObjectInspectorConverters.getConverter(leftOI, resultOI);
    } else if (checkArgs(PrimitiveCategory.DATE, PrimitiveCategory.INTERVAL_DAY_TIME) || checkArgs(PrimitiveCategory.TIMESTAMP, PrimitiveCategory.INTERVAL_DAY_TIME)) {
        plusOpType = OperationType.INTERVALDT_PLUS_TIMESTAMP;
        intervalArg1Idx = 1;
        dtArgIdx = 0;
        resultOI = PrimitiveObjectInspectorFactory.getPrimitiveWritableObjectInspector(TypeInfoFactory.timestampTypeInfo);
        dtConverter = ObjectInspectorConverters.getConverter(leftOI, resultOI);
    } else {
        // Unsupported types - error
        List<TypeInfo> argTypeInfos = new ArrayList<TypeInfo>(2);
        argTypeInfos.add(leftOI.getTypeInfo());
        argTypeInfos.add(rightOI.getTypeInfo());
        throw new NoMatchingMethodException(this.getClass(), argTypeInfos, null);
    }
    return resultOI;
}
Also used: UDFArgumentException (org.apache.hadoop.hive.ql.exec.UDFArgumentException), NoMatchingMethodException (org.apache.hadoop.hive.ql.exec.NoMatchingMethodException), PrimitiveCategory (org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector.PrimitiveCategory), Category (org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector.Category), UDFArgumentTypeException (org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException), PrimitiveObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector), ArrayList (java.util.ArrayList), List (java.util.List), TypeInfo (org.apache.hadoop.hive.serde2.typeinfo.TypeInfo)
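
A hedged sketch of the type resolution above: DATE plus INTERVAL_YEAR_MONTH should resolve to a DATE result inspector. The harness class is invented, and the writableHiveIntervalYearMonthObjectInspector field name is an assumption based on Hive's interval support.

import org.apache.hadoop.hive.ql.udf.generic.GenericUDFOPDTIPlus;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;

public class DTIPlusInitSketch {
    public static void main(String[] args) throws Exception {
        GenericUDFOPDTIPlus udf = new GenericUDFOPDTIPlus();
        // DATE + INTERVAL_YEAR_MONTH matches the INTERVALYM_PLUS_DATE branch above.
        ObjectInspector[] ois = {
            PrimitiveObjectInspectorFactory.writableDateObjectInspector,
            PrimitiveObjectInspectorFactory.writableHiveIntervalYearMonthObjectInspector
        };
        ObjectInspector out = udf.initialize(ois);
        System.out.println(out.getTypeName()); // expected: date
    }
}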

Example 19 with UDFArgumentTypeException

Use of org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException in project hive by apache.

From the class GenericUDAFSumList, the method getEvaluator:

@Override
public GenericUDAFEvaluator getEvaluator(GenericUDAFParameterInfo info) throws SemanticException {
    ObjectInspector[] inspectors = info.getParameterObjectInspectors();
    if (inspectors.length != 1) {
        throw new UDFArgumentTypeException(inspectors.length - 1, "Exactly one argument is expected.");
    }
    if (inspectors[0].getCategory() != ObjectInspector.Category.LIST) {
        throw new UDFArgumentTypeException(0, "Argument should be a list type");
    }
    ListObjectInspector listOI = (ListObjectInspector) inspectors[0];
    ObjectInspector elementOI = listOI.getListElementObjectInspector();
    if (elementOI.getCategory() != ObjectInspector.Category.PRIMITIVE) {
        throw new UDFArgumentTypeException(0, "Only primitive type arguments are accepted but " + elementOI.getTypeName() + " is passed.");
    }
    PrimitiveObjectInspector.PrimitiveCategory pcat = ((PrimitiveObjectInspector) elementOI).getPrimitiveCategory();
    // Regardless of the element's primitive category, values are summed as longs.
    return new GenericUDAFSumLong();
}
Also used: ListObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.ListObjectInspector), PrimitiveObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector), ObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector), UDFArgumentTypeException (org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException)
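
Note the index convention: UDFArgumentTypeException takes a 0-based argument position. The checks above can be restated as a standalone helper, shown here as a sketch (the class and method names are invented):

import org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException;
import org.apache.hadoop.hive.serde2.objectinspector.ListObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;

public class SumListValidationSketch {
    // Mirrors the checks above: exactly one argument, of list category,
    // whose elements are primitive. The index given to the exception is 0-based.
    static void validate(ObjectInspector[] inspectors) throws UDFArgumentTypeException {
        if (inspectors.length != 1) {
            throw new UDFArgumentTypeException(inspectors.length - 1, "Exactly one argument is expected.");
        }
        if (inspectors[0].getCategory() != ObjectInspector.Category.LIST) {
            throw new UDFArgumentTypeException(0, "Argument should be a list type");
        }
        ObjectInspector elementOI = ((ListObjectInspector) inspectors[0]).getListElementObjectInspector();
        if (elementOI.getCategory() != ObjectInspector.Category.PRIMITIVE) {
            throw new UDFArgumentTypeException(0, "Only primitive type arguments are accepted but " + elementOI.getTypeName() + " is passed.");
        }
    }
}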

Example 20 with UDFArgumentTypeException

Use of org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException in project hive by apache.

From the class GenericUDFDateAdd, the method initialize:

@Override
public ObjectInspector initialize(ObjectInspector[] arguments) throws UDFArgumentException {
    if (arguments.length != 2) {
        throw new UDFArgumentLengthException("date_add() requires 2 argument, got " + arguments.length);
    }
    if (arguments[0].getCategory() != ObjectInspector.Category.PRIMITIVE) {
        throw new UDFArgumentTypeException(0, "Only primitive type arguments are accepted but " + arguments[0].getTypeName() + " is passed as the first argument");
    }
    if (arguments[1].getCategory() != ObjectInspector.Category.PRIMITIVE) {
        throw new UDFArgumentTypeException(1, "Only primitive type arguments are accepted but " + arguments[1].getTypeName() + " is passed as the second argument");
    }
    inputType1 = ((PrimitiveObjectInspector) arguments[0]).getPrimitiveCategory();
    ObjectInspector outputOI = PrimitiveObjectInspectorFactory.writableDateObjectInspector;
    switch(inputType1) {
        case STRING:
        case VARCHAR:
        case CHAR:
            inputType1 = PrimitiveCategory.STRING;
            dateConverter = ObjectInspectorConverters.getConverter((PrimitiveObjectInspector) arguments[0], PrimitiveObjectInspectorFactory.writableStringObjectInspector);
            break;
        case TIMESTAMP:
            dateConverter = new TimestampConverter((PrimitiveObjectInspector) arguments[0], PrimitiveObjectInspectorFactory.writableTimestampObjectInspector);
            break;
        case DATE:
            dateConverter = ObjectInspectorConverters.getConverter((PrimitiveObjectInspector) arguments[0], PrimitiveObjectInspectorFactory.writableDateObjectInspector);
            break;
        default:
            throw new UDFArgumentException(" DATE_ADD() only takes STRING/TIMESTAMP/DATEWRITABLE types as first argument, got " + inputType1);
    }
    inputType2 = ((PrimitiveObjectInspector) arguments[1]).getPrimitiveCategory();
    switch(inputType2) {
        case BYTE:
            daysConverter = ObjectInspectorConverters.getConverter((PrimitiveObjectInspector) arguments[1], PrimitiveObjectInspectorFactory.writableByteObjectInspector);
            break;
        case SHORT:
            daysConverter = ObjectInspectorConverters.getConverter((PrimitiveObjectInspector) arguments[1], PrimitiveObjectInspectorFactory.writableShortObjectInspector);
            break;
        case INT:
            daysConverter = ObjectInspectorConverters.getConverter((PrimitiveObjectInspector) arguments[1], PrimitiveObjectInspectorFactory.writableIntObjectInspector);
            break;
        default:
            throw new UDFArgumentException(" DATE_ADD() only takes TINYINT/SMALLINT/INT types as second argument, got " + inputType2);
    }
    return outputOI;
}
Also used: UDFArgumentException (org.apache.hadoop.hive.ql.exec.UDFArgumentException), PrimitiveObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector), ObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector), UDFArgumentLengthException (org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException), UDFArgumentTypeException (org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException), TimestampConverter (org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorConverter.TimestampConverter)
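
A minimal sketch of the valid path (the harness class name is invented; the UDF package is assumed): a string date with an int day count should come back with a writable date inspector.

import org.apache.hadoop.hive.ql.udf.generic.GenericUDFDateAdd;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;

public class DateAddInitSketch {
    public static void main(String[] args) throws Exception {
        GenericUDFDateAdd udf = new GenericUDFDateAdd();
        // string date plus int days; both hit the converter branches above.
        ObjectInspector[] ois = {
            PrimitiveObjectInspectorFactory.writableStringObjectInspector,
            PrimitiveObjectInspectorFactory.writableIntObjectInspector
        };
        ObjectInspector out = udf.initialize(ois);
        System.out.println(out.getTypeName()); // expected: date
    }
}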

Aggregations

UDFArgumentTypeException (org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException): 56 usages
ObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector): 29 usages
PrimitiveObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector): 28 usages
UDFArgumentLengthException (org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException): 20 usages
ConstantObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.ConstantObjectInspector): 15 usages
UDFArgumentException (org.apache.hadoop.hive.ql.exec.UDFArgumentException): 14 usages
PrimitiveCategory (org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector.PrimitiveCategory): 12 usages
Category (org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector.Category): 8 usages
ListObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.ListObjectInspector): 5 usages
PrimitiveTypeInfo (org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo): 5 usages
Converter (org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorConverters.Converter): 4 usages
ParseException (java.text.ParseException): 3 usages
ArrayList (java.util.ArrayList): 3 usages
HiveException (org.apache.hadoop.hive.ql.metadata.HiveException): 3 usages
MapObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.MapObjectInspector): 3 usages
StructObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector): 3 usages
TimestampConverter (org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorConverter.TimestampConverter): 3 usages
VoidObjectInspector (org.apache.hadoop.hive.serde2.objectinspector.primitive.VoidObjectInspector): 3 usages
Timestamp (java.sql.Timestamp): 2 usages
Date (java.util.Date): 2 usages
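
Across these examples the pattern is consistent: validate the argument count first (UDFArgumentLengthException), then categories and types (UDFArgumentTypeException with a 0-based index), and only then build converters and return the output ObjectInspector. Below is a minimal hypothetical GenericUDF following that pattern; the class name GenericUDFEchoSketch and function name echo_sketch are invented for illustration.

import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
import org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException;
import org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException;
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDF;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;
import org.apache.hadoop.io.Text;

public class GenericUDFEchoSketch extends GenericUDF {
    private PrimitiveObjectInspector inputOI;

    @Override
    public ObjectInspector initialize(ObjectInspector[] arguments) throws UDFArgumentException {
        // Step 1: argument count.
        if (arguments.length != 1) {
            throw new UDFArgumentLengthException("echo_sketch takes exactly 1 argument, got " + arguments.length);
        }
        // Step 2: category/type. The index is 0-based even though messages usually count from 1.
        if (arguments[0].getCategory() != ObjectInspector.Category.PRIMITIVE) {
            throw new UDFArgumentTypeException(0, "Only primitive type arguments are accepted but " + arguments[0].getTypeName() + " was passed");
        }
        inputOI = (PrimitiveObjectInspector) arguments[0];
        // Step 3: declare the output inspector.
        return PrimitiveObjectInspectorFactory.writableStringObjectInspector;
    }

    @Override
    public Object evaluate(DeferredObject[] arguments) throws HiveException {
        Object value = arguments[0].get();
        if (value == null) {
            return null;
        }
        // Render the primitive through its inspector and hand back writable text.
        return new Text(String.valueOf(inputOI.getPrimitiveJavaObject(value)));
    }

    @Override
    public String getDisplayString(String[] children) {
        return "echo_sketch(" + (children.length > 0 ? children[0] : "") + ")";
    }
}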