
Example 1 with MapType

Use of io.trino.spi.type.MapType in project trino by trinodb, from the getJsonCodecForType method of the SessionPropertyManager class:

private static <T> JsonCodec<T> getJsonCodecForType(Type type) {
    if (VarcharType.VARCHAR.equals(type)) {
        return (JsonCodec<T>) JSON_CODEC_FACTORY.jsonCodec(String.class);
    }
    if (BooleanType.BOOLEAN.equals(type)) {
        return (JsonCodec<T>) JSON_CODEC_FACTORY.jsonCodec(Boolean.class);
    }
    if (BigintType.BIGINT.equals(type)) {
        return (JsonCodec<T>) JSON_CODEC_FACTORY.jsonCodec(Long.class);
    }
    if (IntegerType.INTEGER.equals(type)) {
        return (JsonCodec<T>) JSON_CODEC_FACTORY.jsonCodec(Integer.class);
    }
    if (DoubleType.DOUBLE.equals(type)) {
        return (JsonCodec<T>) JSON_CODEC_FACTORY.jsonCodec(Double.class);
    }
    if (type instanceof ArrayType) {
        Type elementType = ((ArrayType) type).getElementType();
        return (JsonCodec<T>) JSON_CODEC_FACTORY.listJsonCodec(getJsonCodecForType(elementType));
    }
    if (type instanceof MapType) {
        Type keyType = ((MapType) type).getKeyType();
        Type valueType = ((MapType) type).getValueType();
        return (JsonCodec<T>) JSON_CODEC_FACTORY.mapJsonCodec(getMapKeyType(keyType), getJsonCodecForType(valueType));
    }
    throw new TrinoException(INVALID_SESSION_PROPERTY, format("Session property type %s is not supported", type));
}
Also used: ArrayType(io.trino.spi.type.ArrayType) JsonCodec(io.airlift.json.JsonCodec) DoubleType(io.trino.spi.type.DoubleType) Type(io.trino.spi.type.Type) BooleanType(io.trino.spi.type.BooleanType) BigintType(io.trino.spi.type.BigintType) VarcharType(io.trino.spi.type.VarcharType) IntegerType(io.trino.spi.type.IntegerType) MapType(io.trino.spi.type.MapType) TrinoException(io.trino.spi.TrinoException)
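
For orientation, a hedged sketch (not from the Trino source) of what the returned codec handles for a map(varchar, bigint) property. The MapType constructor taking a TypeOperators, the explicit JsonCodecFactory usage, and the sample values are assumptions of this sketch; the real method is private to SessionPropertyManager.

private static void mapSessionPropertyCodecSketch() {
    // map(varchar, bigint); the SPI MapType is constructed with a TypeOperators instance
    MapType mapType = new MapType(VarcharType.VARCHAR, BigintType.BIGINT, new TypeOperators());
    System.out.println(mapType.getDisplayName()); // map(varchar, bigint)
    // Equivalent to the codec getJsonCodecForType(mapType) would produce: the varchar key
    // maps to String (via getMapKeyType) and the bigint value codec to Long
    JsonCodec<Map<String, Long>> codec = new JsonCodecFactory().mapJsonCodec(String.class, Long.class);
    String json = codec.toJson(ImmutableMap.of("a", 1L, "b", 2L)); // {"a":1,"b":2}
    Map<String, Long> decoded = codec.fromJson(json);
    System.out.println(decoded); // {a=1, b=2}
}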

Example 2 with MapType

Use of io.trino.spi.type.MapType in project trino by trinodb, from the specialize method of the MapUnionAggregation class:

@Override
public AggregationMetadata specialize(BoundSignature boundSignature) {
    MapType outputType = (MapType) boundSignature.getReturnType();
    Type keyType = outputType.getKeyType();
    BlockPositionEqual keyEqual = blockTypeOperators.getEqualOperator(keyType);
    BlockPositionHashCode keyHashCode = blockTypeOperators.getHashCodeOperator(keyType);
    Type valueType = outputType.getValueType();
    KeyValuePairStateSerializer stateSerializer = new KeyValuePairStateSerializer(outputType, keyEqual, keyHashCode);
    MethodHandle inputFunction = MethodHandles.insertArguments(INPUT_FUNCTION, 0, keyType, keyEqual, keyHashCode, valueType);
    inputFunction = normalizeInputMethod(inputFunction, boundSignature, STATE, INPUT_CHANNEL);
    return new AggregationMetadata(inputFunction, Optional.empty(), Optional.of(COMBINE_FUNCTION), OUTPUT_FUNCTION, ImmutableList.of(new AccumulatorStateDescriptor<>(KeyValuePairsState.class, stateSerializer, new KeyValuePairsStateFactory(keyType, valueType))));
}
Also used: BlockPositionEqual(io.trino.type.BlockTypeOperators.BlockPositionEqual) Type(io.trino.spi.type.Type) MapType(io.trino.spi.type.MapType) TypeSignature.mapType(io.trino.spi.type.TypeSignature.mapType) KeyValuePairStateSerializer(io.trino.operator.aggregation.state.KeyValuePairStateSerializer) AccumulatorStateDescriptor(io.trino.operator.aggregation.AggregationMetadata.AccumulatorStateDescriptor) BlockPositionHashCode(io.trino.type.BlockTypeOperators.BlockPositionHashCode) KeyValuePairsStateFactory(io.trino.operator.aggregation.state.KeyValuePairsStateFactory) MethodHandle(java.lang.invoke.MethodHandle)
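
As a minimal sketch under stated assumptions (a BlockTypeOperators constructed over a fresh TypeOperators, and concrete key and value types in place of the bound signature), this is how the per-key equality and hash operators that specialize binds into the input function can be obtained:

private static void mapKeyOperatorsSketch() {
    BlockTypeOperators blockTypeOperators = new BlockTypeOperators(new TypeOperators());
    MapType outputType = new MapType(VarcharType.VARCHAR, BigintType.BIGINT, new TypeOperators());
    // Same lookups as in specialize(): operators specialized to the map key type
    BlockPositionEqual keyEqual = blockTypeOperators.getEqualOperator(outputType.getKeyType());
    BlockPositionHashCode keyHashCode = blockTypeOperators.getHashCodeOperator(outputType.getKeyType());
    // During map_union, keyEqual.equal(leftBlock, leftPos, rightBlock, rightPos) and
    // keyHashCode.hashCode(block, pos) drive key deduplication across the input maps.
}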

Example 3 with MapType

Use of io.trino.spi.type.MapType in project trino by trinodb, from the specialize method of the MapAggregationFunction class:

@Override
public AggregationMetadata specialize(BoundSignature boundSignature) {
    MapType outputType = (MapType) boundSignature.getReturnType();
    Type keyType = outputType.getKeyType();
    BlockPositionEqual keyEqual = blockTypeOperators.getEqualOperator(keyType);
    BlockPositionHashCode keyHashCode = blockTypeOperators.getHashCodeOperator(keyType);
    Type valueType = outputType.getValueType();
    KeyValuePairStateSerializer stateSerializer = new KeyValuePairStateSerializer(outputType, keyEqual, keyHashCode);
    return new AggregationMetadata(MethodHandles.insertArguments(INPUT_FUNCTION, 0, keyType, keyEqual, keyHashCode, valueType), Optional.empty(), Optional.of(COMBINE_FUNCTION), OUTPUT_FUNCTION, ImmutableList.of(new AccumulatorStateDescriptor<>(KeyValuePairsState.class, stateSerializer, new KeyValuePairsStateFactory(keyType, valueType))));
}
Also used: BlockPositionEqual(io.trino.type.BlockTypeOperators.BlockPositionEqual) Type(io.trino.spi.type.Type) MapType(io.trino.spi.type.MapType) TypeSignature.mapType(io.trino.spi.type.TypeSignature.mapType) KeyValuePairStateSerializer(io.trino.operator.aggregation.state.KeyValuePairStateSerializer) AccumulatorStateDescriptor(io.trino.operator.aggregation.AggregationMetadata.AccumulatorStateDescriptor) BlockPositionHashCode(io.trino.type.BlockTypeOperators.BlockPositionHashCode) KeyValuePairsStateFactory(io.trino.operator.aggregation.state.KeyValuePairsStateFactory)
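
The TypeSignature.mapType helper in the import list above is how a declared map return type is spelled at the signature level. A brief, hedged sketch with concrete types standing in for the K and V type variables of the real map_agg signature:

private static TypeSignature mapAggReturnTypeSketch() {
    // Produces the signature map(varchar,bigint); map_agg itself declares this
    // with type variables K and V rather than concrete types.
    return mapType(VarcharType.VARCHAR.getTypeSignature(), BigintType.BIGINT.getTypeSignature());
}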

Example 4 with MapType

Use of io.trino.spi.type.MapType in project trino by trinodb, from the constructField method of the HiveParquetColumnIOConverter class:

public static Optional<Field> constructField(Type type, ColumnIO columnIO) {
    if (columnIO == null) {
        return Optional.empty();
    }
    boolean required = columnIO.getType().getRepetition() != OPTIONAL;
    int repetitionLevel = columnRepetitionLevel(columnIO);
    int definitionLevel = columnDefinitionLevel(columnIO);
    if (type instanceof RowType) {
        RowType rowType = (RowType) type;
        GroupColumnIO groupColumnIO = (GroupColumnIO) columnIO;
        ImmutableList.Builder<Optional<Field>> fieldsBuilder = ImmutableList.builder();
        List<RowType.Field> fields = rowType.getFields();
        boolean structHasParameters = false;
        for (int i = 0; i < fields.size(); i++) {
            RowType.Field rowField = fields.get(i);
            String name = rowField.getName().orElseThrow().toLowerCase(Locale.ENGLISH);
            Optional<Field> field = constructField(rowField.getType(), lookupColumnByName(groupColumnIO, name));
            structHasParameters |= field.isPresent();
            fieldsBuilder.add(field);
        }
        if (structHasParameters) {
            return Optional.of(new GroupField(type, repetitionLevel, definitionLevel, required, fieldsBuilder.build()));
        }
        return Optional.empty();
    }
    if (type instanceof MapType) {
        MapType mapType = (MapType) type;
        GroupColumnIO groupColumnIO = (GroupColumnIO) columnIO;
        GroupColumnIO keyValueColumnIO = getMapKeyValueColumn(groupColumnIO);
        if (keyValueColumnIO.getChildrenCount() != 2) {
            return Optional.empty();
        }
        Optional<Field> keyField = constructField(mapType.getKeyType(), keyValueColumnIO.getChild(0));
        Optional<Field> valueField = constructField(mapType.getValueType(), keyValueColumnIO.getChild(1));
        return Optional.of(new GroupField(type, repetitionLevel, definitionLevel, required, ImmutableList.of(keyField, valueField)));
    }
    if (type instanceof ArrayType) {
        ArrayType arrayType = (ArrayType) type;
        GroupColumnIO groupColumnIO = (GroupColumnIO) columnIO;
        if (groupColumnIO.getChildrenCount() != 1) {
            return Optional.empty();
        }
        Optional<Field> field = constructField(arrayType.getElementType(), getArrayElementColumn(groupColumnIO.getChild(0)));
        return Optional.of(new GroupField(type, repetitionLevel, definitionLevel, required, ImmutableList.of(field)));
    }
    PrimitiveColumnIO primitiveColumnIO = (PrimitiveColumnIO) columnIO;
    RichColumnDescriptor column = new RichColumnDescriptor(primitiveColumnIO.getColumnDescriptor(), columnIO.getType().asPrimitiveType());
    return Optional.of(new PrimitiveField(type, repetitionLevel, definitionLevel, required, column, primitiveColumnIO.getId()));
}
Also used: Optional(java.util.Optional) ImmutableList(com.google.common.collect.ImmutableList) GroupField(io.trino.parquet.GroupField) RichColumnDescriptor(io.trino.parquet.RichColumnDescriptor) RowType(io.trino.spi.type.RowType) MapType(io.trino.spi.type.MapType) PrimitiveColumnIO(org.apache.parquet.io.PrimitiveColumnIO) ArrayType(io.trino.spi.type.ArrayType) PrimitiveField(io.trino.parquet.PrimitiveField) Field(io.trino.parquet.Field) GroupColumnIO(org.apache.parquet.io.GroupColumnIO)
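
A hedged sketch of how constructField is typically driven (the fileSchema parameter and the "my_map" column name are hypothetical): the Parquet MessageType from the file footer is wrapped in a MessageColumnIO via ColumnIOFactory, the map column's ColumnIO is looked up by name, and the Trino MapType is converted into a reader Field.

private static Optional<Field> mapColumnFieldSketch(MessageType fileSchema) {
    // Navigate the file schema as ColumnIO nodes
    MessageColumnIO messageColumnIO = new ColumnIOFactory().getColumnIO(fileSchema);
    // Trino-side type of the hypothetical "my_map" column
    MapType mapType = new MapType(VarcharType.VARCHAR, BigintType.BIGINT, new TypeOperators());
    return constructField(mapType, lookupColumnByName(messageColumnIO, "my_map"));
}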

Example 5 with MapType

Use of io.trino.spi.type.MapType in project trino by trinodb, from the getField method of the HiveWriteUtils class:

public static Object getField(DateTimeZone localZone, Type type, Block block, int position) {
    if (block.isNull(position)) {
        return null;
    }
    if (BOOLEAN.equals(type)) {
        return type.getBoolean(block, position);
    }
    if (BIGINT.equals(type)) {
        return type.getLong(block, position);
    }
    if (INTEGER.equals(type)) {
        return toIntExact(type.getLong(block, position));
    }
    if (SMALLINT.equals(type)) {
        return Shorts.checkedCast(type.getLong(block, position));
    }
    if (TINYINT.equals(type)) {
        return SignedBytes.checkedCast(type.getLong(block, position));
    }
    if (REAL.equals(type)) {
        return intBitsToFloat((int) type.getLong(block, position));
    }
    if (DOUBLE.equals(type)) {
        return type.getDouble(block, position);
    }
    if (type instanceof VarcharType) {
        return new Text(type.getSlice(block, position).getBytes());
    }
    if (type instanceof CharType) {
        CharType charType = (CharType) type;
        return new Text(padSpaces(type.getSlice(block, position), charType).toStringUtf8());
    }
    if (VARBINARY.equals(type)) {
        return type.getSlice(block, position).getBytes();
    }
    if (DATE.equals(type)) {
        return Date.ofEpochDay(toIntExact(type.getLong(block, position)));
    }
    if (type instanceof TimestampType) {
        return getHiveTimestamp(localZone, (TimestampType) type, block, position);
    }
    if (type instanceof DecimalType) {
        DecimalType decimalType = (DecimalType) type;
        return getHiveDecimal(decimalType, block, position);
    }
    if (type instanceof ArrayType) {
        Type elementType = ((ArrayType) type).getElementType();
        Block arrayBlock = block.getObject(position, Block.class);
        List<Object> list = new ArrayList<>(arrayBlock.getPositionCount());
        for (int i = 0; i < arrayBlock.getPositionCount(); i++) {
            list.add(getField(localZone, elementType, arrayBlock, i));
        }
        return unmodifiableList(list);
    }
    if (type instanceof MapType) {
        Type keyType = ((MapType) type).getKeyType();
        Type valueType = ((MapType) type).getValueType();
        Block mapBlock = block.getObject(position, Block.class);
        Map<Object, Object> map = new HashMap<>();
        for (int i = 0; i < mapBlock.getPositionCount(); i += 2) {
            map.put(getField(localZone, keyType, mapBlock, i), getField(localZone, valueType, mapBlock, i + 1));
        }
        return unmodifiableMap(map);
    }
    if (type instanceof RowType) {
        List<Type> fieldTypes = type.getTypeParameters();
        Block rowBlock = block.getObject(position, Block.class);
        checkCondition(fieldTypes.size() == rowBlock.getPositionCount(), StandardErrorCode.GENERIC_INTERNAL_ERROR, "Expected row value field count does not match type field count");
        List<Object> row = new ArrayList<>(rowBlock.getPositionCount());
        for (int i = 0; i < rowBlock.getPositionCount(); i++) {
            row.add(getField(localZone, fieldTypes.get(i), rowBlock, i));
        }
        return unmodifiableList(row);
    }
    throw new TrinoException(NOT_SUPPORTED, "unsupported type: " + type);
}
Also used: VarcharType(io.trino.spi.type.VarcharType) HashMap(java.util.HashMap) ArrayList(java.util.ArrayList) HiveUtil.isRowType(io.trino.plugin.hive.util.HiveUtil.isRowType) RowType(io.trino.spi.type.RowType) Text(org.apache.hadoop.io.Text) HiveUtil.isMapType(io.trino.plugin.hive.util.HiveUtil.isMapType) MapType(io.trino.spi.type.MapType) ArrayType(io.trino.spi.type.ArrayType) HiveUtil.isArrayType(io.trino.plugin.hive.util.HiveUtil.isArrayType) TimestampType(io.trino.spi.type.TimestampType) HiveType(io.trino.plugin.hive.HiveType) CharType(io.trino.spi.type.CharType) DecimalType(io.trino.spi.type.DecimalType) Type(io.trino.spi.type.Type) Block(io.trino.spi.block.Block) TrinoException(io.trino.spi.TrinoException)
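
A hedged usage sketch, assuming the block-builder API of the same Trino vintage as the code above (BlockBuilder.beginBlockEntry/closeEntry), joda-time's DateTimeZone, and airlift's Slices: build a one-entry map block and convert it with getField, which returns an unmodifiable java.util.Map whose varchar key comes back as a Hadoop Text value.

private static void mapBlockToHiveFieldSketch() {
    MapType mapType = new MapType(VarcharType.VARCHAR, BigintType.BIGINT, new TypeOperators());
    BlockBuilder blockBuilder = mapType.createBlockBuilder(null, 1);
    // Append one map value with a single entry: "a" -> 1
    BlockBuilder entryBuilder = blockBuilder.beginBlockEntry();
    VarcharType.VARCHAR.writeSlice(entryBuilder, Slices.utf8Slice("a"));
    BigintType.BIGINT.writeLong(entryBuilder, 1L);
    blockBuilder.closeEntry();
    Block block = blockBuilder.build(); // holds the map value at position 0
    // Returns an unmodifiable Map with key Text("a") and value 1L
    Object hiveValue = getField(DateTimeZone.UTC, mapType, block, 0);
    System.out.println(hiveValue); // {a=1}
}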

Aggregations

MapType (io.trino.spi.type.MapType): 85
Type (io.trino.spi.type.Type): 45
ArrayType (io.trino.spi.type.ArrayType): 42
RowType (io.trino.spi.type.RowType): 38
BlockBuilder (io.trino.spi.block.BlockBuilder): 28
VarcharType (io.trino.spi.type.VarcharType): 26
Block (io.trino.spi.block.Block): 17
DecimalType (io.trino.spi.type.DecimalType): 17
Map (java.util.Map): 17
Test (org.testng.annotations.Test): 17
ImmutableList (com.google.common.collect.ImmutableList): 16
List (java.util.List): 14
TypeSignature.mapType (io.trino.spi.type.TypeSignature.mapType): 13
CharType (io.trino.spi.type.CharType): 12
ArrayList (java.util.ArrayList): 12
HashMap (java.util.HashMap): 11
TypeOperators (io.trino.spi.type.TypeOperators): 10
ImmutableMap (com.google.common.collect.ImmutableMap): 9
VarbinaryType (io.trino.spi.type.VarbinaryType): 8
Collectors.toList (java.util.stream.Collectors.toList): 8