Example 46 with MinorType

Use of org.apache.drill.common.types.TypeProtos.MinorType in project drill by apache.

Class FragmentContext, method getConstantValueHolder:

@Override
public ValueHolder getConstantValueHolder(String value, MinorType type, Function<DrillBuf, ValueHolder> holderInitializer) {
    if (!constantValueHolderCache.containsKey(value)) {
        constantValueHolderCache.put(value, Maps.<MinorType, ValueHolder>newHashMap());
    }
    Map<MinorType, ValueHolder> holdersByType = constantValueHolderCache.get(value);
    ValueHolder valueHolder = holdersByType.get(type);
    if (valueHolder == null) {
        valueHolder = holderInitializer.apply(getManagedBuffer());
        holdersByType.put(type, valueHolder);
    }
    return valueHolder;
}
Also used: MinorType (org.apache.drill.common.types.TypeProtos.MinorType), ValueHolder (org.apache.drill.exec.expr.holders.ValueHolder)
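
For context, here is a minimal usage sketch, not taken from the Drill sources: the variable context, the literal "42", and the IntHolder initializer are illustrative. The initializer only runs on a cache miss, so repeated evaluations of the same constant reuse one holder per (value, type) pair.

// Illustrative only: asks the fragment context for a cached holder for the
// constant "42" of type INT, creating it on the first request.
ValueHolder holder = context.getConstantValueHolder("42", MinorType.INT, buf -> {
    // buf is the context's managed DrillBuf; fixed-width holders such as
    // IntHolder (org.apache.drill.exec.expr.holders.IntHolder) do not need it.
    IntHolder h = new IntHolder();
    h.value = 42;
    return h;
});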

Example 47 with MinorType

Use of org.apache.drill.common.types.TypeProtos.MinorType in project drill by axbaretto.

Class MapUtility, method writeToMapFromReader:

/*
   * Function to read a value from the field reader, detect the type, construct the appropriate value holder
   * and use the value holder to write to the Map.
   */
// TODO : This should be templatized and generated using freemarker
public static void writeToMapFromReader(FieldReader fieldReader, BaseWriter.MapWriter mapWriter) {
    try {
        MajorType valueMajorType = fieldReader.getType();
        MinorType valueMinorType = valueMajorType.getMinorType();
        boolean repeated = false;
        if (valueMajorType.getMode() == TypeProtos.DataMode.REPEATED) {
            repeated = true;
        }
        switch(valueMinorType) {
            case TINYINT:
                if (repeated) {
                    fieldReader.copyAsValue(mapWriter.list(MappifyUtility.fieldValue).tinyInt());
                } else {
                    fieldReader.copyAsValue(mapWriter.tinyInt(MappifyUtility.fieldValue));
                }
                break;
            case SMALLINT:
                if (repeated) {
                    fieldReader.copyAsValue(mapWriter.list(MappifyUtility.fieldValue).smallInt());
                } else {
                    fieldReader.copyAsValue(mapWriter.smallInt(MappifyUtility.fieldValue));
                }
                break;
            case BIGINT:
                if (repeated) {
                    fieldReader.copyAsValue(mapWriter.list(MappifyUtility.fieldValue).bigInt());
                } else {
                    fieldReader.copyAsValue(mapWriter.bigInt(MappifyUtility.fieldValue));
                }
                break;
            case INT:
                if (repeated) {
                    fieldReader.copyAsValue(mapWriter.list(MappifyUtility.fieldValue).integer());
                } else {
                    fieldReader.copyAsValue(mapWriter.integer(MappifyUtility.fieldValue));
                }
                break;
            case UINT1:
                if (repeated) {
                    fieldReader.copyAsValue(mapWriter.list(MappifyUtility.fieldValue).uInt1());
                } else {
                    fieldReader.copyAsValue(mapWriter.uInt1(MappifyUtility.fieldValue));
                }
                break;
            case UINT2:
                if (repeated) {
                    fieldReader.copyAsValue(mapWriter.list(MappifyUtility.fieldValue).uInt2());
                } else {
                    fieldReader.copyAsValue(mapWriter.uInt2(MappifyUtility.fieldValue));
                }
                break;
            case UINT4:
                if (repeated) {
                    fieldReader.copyAsValue(mapWriter.list(MappifyUtility.fieldValue).uInt4());
                } else {
                    fieldReader.copyAsValue(mapWriter.uInt4(MappifyUtility.fieldValue));
                }
                break;
            case UINT8:
                if (repeated) {
                    fieldReader.copyAsValue(mapWriter.list(MappifyUtility.fieldValue).uInt8());
                } else {
                    fieldReader.copyAsValue(mapWriter.uInt8(MappifyUtility.fieldValue));
                }
                break;
            case DECIMAL9:
                if (repeated) {
                    fieldReader.copyAsValue(mapWriter.list(MappifyUtility.fieldValue).decimal9());
                } else {
                    fieldReader.copyAsValue(mapWriter.decimal9(MappifyUtility.fieldValue));
                }
                break;
            case DECIMAL18:
                if (repeated) {
                    fieldReader.copyAsValue(mapWriter.list(MappifyUtility.fieldValue).decimal18());
                } else {
                    fieldReader.copyAsValue(mapWriter.decimal18(MappifyUtility.fieldValue));
                }
                break;
            case DECIMAL28SPARSE:
                if (repeated) {
                    fieldReader.copyAsValue(mapWriter.list(MappifyUtility.fieldValue).decimal28Sparse());
                } else {
                    fieldReader.copyAsValue(mapWriter.decimal28Sparse(MappifyUtility.fieldValue));
                }
                break;
            case DECIMAL38SPARSE:
                if (repeated) {
                    fieldReader.copyAsValue(mapWriter.list(MappifyUtility.fieldValue).decimal38Sparse());
                } else {
                    fieldReader.copyAsValue(mapWriter.decimal38Sparse(MappifyUtility.fieldValue));
                }
                break;
            case DATE:
                if (repeated) {
                    fieldReader.copyAsValue(mapWriter.list(MappifyUtility.fieldValue).date());
                } else {
                    fieldReader.copyAsValue(mapWriter.date(MappifyUtility.fieldValue));
                }
                break;
            case TIME:
                if (repeated) {
                    fieldReader.copyAsValue(mapWriter.list(MappifyUtility.fieldValue).time());
                } else {
                    fieldReader.copyAsValue(mapWriter.time(MappifyUtility.fieldValue));
                }
                break;
            case TIMESTAMP:
                if (repeated) {
                    fieldReader.copyAsValue(mapWriter.list(MappifyUtility.fieldValue).timeStamp());
                } else {
                    fieldReader.copyAsValue(mapWriter.timeStamp(MappifyUtility.fieldValue));
                }
                break;
            case INTERVAL:
                if (repeated) {
                    fieldReader.copyAsValue(mapWriter.list(MappifyUtility.fieldValue).interval());
                } else {
                    fieldReader.copyAsValue(mapWriter.interval(MappifyUtility.fieldValue));
                }
                break;
            case INTERVALDAY:
                if (repeated) {
                    fieldReader.copyAsValue(mapWriter.list(MappifyUtility.fieldValue).intervalDay());
                } else {
                    fieldReader.copyAsValue(mapWriter.intervalDay(MappifyUtility.fieldValue));
                }
                break;
            case INTERVALYEAR:
                if (repeated) {
                    fieldReader.copyAsValue(mapWriter.list(MappifyUtility.fieldValue).intervalYear());
                } else {
                    fieldReader.copyAsValue(mapWriter.intervalYear(MappifyUtility.fieldValue));
                }
                break;
            case FLOAT4:
                if (repeated) {
                    fieldReader.copyAsValue(mapWriter.list(MappifyUtility.fieldValue).float4());
                } else {
                    fieldReader.copyAsValue(mapWriter.float4(MappifyUtility.fieldValue));
                }
                break;
            case FLOAT8:
                if (repeated) {
                    fieldReader.copyAsValue(mapWriter.list(MappifyUtility.fieldValue).float8());
                } else {
                    fieldReader.copyAsValue(mapWriter.float8(MappifyUtility.fieldValue));
                }
                break;
            case BIT:
                if (repeated) {
                    fieldReader.copyAsValue(mapWriter.list(MappifyUtility.fieldValue).bit());
                } else {
                    fieldReader.copyAsValue(mapWriter.bit(MappifyUtility.fieldValue));
                }
                break;
            case VARCHAR:
                if (repeated) {
                    fieldReader.copyAsValue(mapWriter.list(MappifyUtility.fieldValue).varChar());
                } else {
                    fieldReader.copyAsValue(mapWriter.varChar(MappifyUtility.fieldValue));
                }
                break;
            case VARBINARY:
                if (repeated) {
                    fieldReader.copyAsValue(mapWriter.list(MappifyUtility.fieldValue).varBinary());
                } else {
                    fieldReader.copyAsValue(mapWriter.varBinary(MappifyUtility.fieldValue));
                }
                break;
            case MAP:
                if (valueMajorType.getMode() == TypeProtos.DataMode.REPEATED) {
                    fieldReader.copyAsValue(mapWriter.list(MappifyUtility.fieldValue).map());
                } else {
                    fieldReader.copyAsValue(mapWriter.map(MappifyUtility.fieldValue));
                }
                break;
            case LIST:
                fieldReader.copyAsValue(mapWriter.list(MappifyUtility.fieldValue).list());
                break;
            default:
                throw new DrillRuntimeException(String.format("kvgen does not support input of type: %s", valueMinorType));
        }
    } catch (ClassCastException e) {
        final MaterializedField field = fieldReader.getField();
        throw new DrillRuntimeException(String.format(TYPE_MISMATCH_ERROR, field.getName(), field.getType()));
    }
}
Also used: MajorType (org.apache.drill.common.types.TypeProtos.MajorType), MinorType (org.apache.drill.common.types.TypeProtos.MinorType), MaterializedField (org.apache.drill.exec.record.MaterializedField), DrillRuntimeException (org.apache.drill.common.exceptions.DrillRuntimeException)
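
The switch above repeats the same two-branch pattern (scalar writer vs. list writer) for every scalar type, which is what the TODO about freemarker generation refers to. Purely as an illustration of one alternative, and not part of the Drill sources, the per-type dispatch could be captured in a lookup table keyed on MinorType; the hypothetical WRITERS map below covers just two types.

// Hypothetical dispatch table: MinorType -> action that copies the current
// value into the map writer under the MappifyUtility.fieldValue key.
// Assumes java.util.EnumMap, java.util.Map and java.util.function.BiConsumer.
private static final Map<MinorType, BiConsumer<FieldReader, BaseWriter.MapWriter>> WRITERS =
        new EnumMap<>(MinorType.class);
static {
    WRITERS.put(MinorType.VARCHAR,
        (reader, writer) -> reader.copyAsValue(writer.varChar(MappifyUtility.fieldValue)));
    WRITERS.put(MinorType.BIGINT,
        (reader, writer) -> reader.copyAsValue(writer.bigInt(MappifyUtility.fieldValue)));
    // ... one entry per supported scalar type; repeated fields would need a
    // parallel table targeting writer.list(MappifyUtility.fieldValue).
}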

Example 48 with MinorType

Use of org.apache.drill.common.types.TypeProtos.MinorType in project drill by axbaretto.

Class JdbcRecordReader, method setup:

@Override
public void setup(OperatorContext operatorContext, OutputMutator output) throws ExecutionSetupException {
    try {
        connection = source.getConnection();
        statement = connection.createStatement();
        resultSet = statement.executeQuery(sql);
        final ResultSetMetaData meta = resultSet.getMetaData();
        final int columns = meta.getColumnCount();
        ImmutableList.Builder<ValueVector> vectorBuilder = ImmutableList.builder();
        ImmutableList.Builder<Copier<?>> copierBuilder = ImmutableList.builder();
        for (int i = 1; i <= columns; i++) {
            final String name = meta.getColumnLabel(i);
            final int jdbcType = meta.getColumnType(i);
            final int width = meta.getPrecision(i);
            final int scale = meta.getScale(i);
            MinorType minorType = JDBC_TYPE_MAPPINGS.get(jdbcType);
            if (minorType == null) {
                logger.warn("Ignoring column that is unsupported.", UserException.unsupportedError().message("A column you queried has a data type that is not currently supported by the JDBC storage plugin. " + "The column's name was %s and its JDBC data type was %s. ", name, nameFromType(jdbcType)).addContext("sql", sql).addContext("column Name", name).addContext("plugin", storagePluginName).build(logger));
                continue;
            }
            final MajorType type = Types.optional(minorType);
            final MaterializedField field = MaterializedField.create(name, type);
            final Class<? extends ValueVector> clazz = (Class<? extends ValueVector>) TypeHelper.getValueVectorClass(minorType, type.getMode());
            ValueVector vector = output.addField(field, clazz);
            vectorBuilder.add(vector);
            copierBuilder.add(getCopier(jdbcType, i, resultSet, vector));
        }
        vectors = vectorBuilder.build();
        copiers = copierBuilder.build();
    } catch (SQLException | SchemaChangeException e) {
        throw UserException.dataReadError(e)
            .message("The JDBC storage plugin failed while trying setup the SQL query. ")
            .addContext("sql", sql)
            .addContext("plugin", storagePluginName)
            .build(logger);
    }
}
Also used: SQLException (java.sql.SQLException), ImmutableList (com.google.common.collect.ImmutableList), MajorType (org.apache.drill.common.types.TypeProtos.MajorType), MaterializedField (org.apache.drill.exec.record.MaterializedField), ResultSetMetaData (java.sql.ResultSetMetaData), ValueVector (org.apache.drill.exec.vector.ValueVector), SchemaChangeException (org.apache.drill.exec.exception.SchemaChangeException), MinorType (org.apache.drill.common.types.TypeProtos.MinorType)
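
The JDBC_TYPE_MAPPINGS lookup at the top of the loop is what decides whether a column is supported at all: java.sql.Types codes reported by the driver are translated into Drill MinorTypes, and anything without an entry is skipped with the warning above. As a rough, assumed excerpt of what that table contains (the actual map in JdbcRecordReader covers more types):

// Hypothetical excerpt of the JDBC-to-Drill type table; the real
// JDBC_TYPE_MAPPINGS in JdbcRecordReader is larger than this.
private static final ImmutableMap<Integer, MinorType> JDBC_TYPE_MAPPINGS =
        ImmutableMap.<Integer, MinorType>builder()
            .put(java.sql.Types.INTEGER, MinorType.INT)
            .put(java.sql.Types.BIGINT, MinorType.BIGINT)
            .put(java.sql.Types.DOUBLE, MinorType.FLOAT8)
            .put(java.sql.Types.VARCHAR, MinorType.VARCHAR)
            .put(java.sql.Types.DATE, MinorType.DATE)
            .put(java.sql.Types.TIMESTAMP, MinorType.TIMESTAMP)
            .build();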

Example 49 with MinorType

Use of org.apache.drill.common.types.TypeProtos.MinorType in project drill by axbaretto.

Class QueryContext, method getConstantValueHolder:

@Override
public ValueHolder getConstantValueHolder(String value, MinorType type, Function<DrillBuf, ValueHolder> holderInitializer) {
    if (!constantValueHolderCache.containsKey(value)) {
        constantValueHolderCache.put(value, Maps.<MinorType, ValueHolder>newHashMap());
    }
    Map<MinorType, ValueHolder> holdersByType = constantValueHolderCache.get(value);
    ValueHolder valueHolder = holdersByType.get(type);
    if (valueHolder == null) {
        valueHolder = holderInitializer.apply(getManagedBuffer());
        holdersByType.put(type, valueHolder);
    }
    return valueHolder;
}
Also used: MinorType (org.apache.drill.common.types.TypeProtos.MinorType), ValueHolder (org.apache.drill.exec.expr.holders.ValueHolder)

Example 50 with MinorType

Use of org.apache.drill.common.types.TypeProtos.MinorType in project drill by axbaretto.

Class FragmentContextImpl, method getConstantValueHolder:

@Override
public ValueHolder getConstantValueHolder(String value, MinorType type, Function<DrillBuf, ValueHolder> holderInitializer) {
    if (!constantValueHolderCache.containsKey(value)) {
        constantValueHolderCache.put(value, Maps.<MinorType, ValueHolder>newHashMap());
    }
    Map<MinorType, ValueHolder> holdersByType = constantValueHolderCache.get(value);
    ValueHolder valueHolder = holdersByType.get(type);
    if (valueHolder == null) {
        valueHolder = holderInitializer.apply(getManagedBuffer());
        holdersByType.put(type, valueHolder);
    }
    return valueHolder;
}
Also used: MinorType (org.apache.drill.common.types.TypeProtos.MinorType), ValueHolder (org.apache.drill.exec.expr.holders.ValueHolder)
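
The same containsKey/put/get sequence appears verbatim in FragmentContext, QueryContext, and FragmentContextImpl. Purely as an illustration, and not how the Drill sources are written, the two-level lookup can be compressed with java.util.Map.computeIfAbsent:

// Equivalent caching logic using computeIfAbsent; assumes
// constantValueHolderCache is a Map<String, Map<MinorType, ValueHolder>>.
@Override
public ValueHolder getConstantValueHolder(String value, MinorType type,
        Function<DrillBuf, ValueHolder> holderInitializer) {
    return constantValueHolderCache
        .computeIfAbsent(value, v -> new HashMap<>())
        .computeIfAbsent(type, t -> holderInitializer.apply(getManagedBuffer()));
}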

Aggregations

MinorType (org.apache.drill.common.types.TypeProtos.MinorType): 86 uses
MajorType (org.apache.drill.common.types.TypeProtos.MajorType): 32 uses
MaterializedField (org.apache.drill.exec.record.MaterializedField): 17 uses
ValueVector (org.apache.drill.exec.vector.ValueVector): 11 uses
DataMode (org.apache.drill.common.types.TypeProtos.DataMode): 10 uses
SchemaBuilder (org.apache.drill.exec.record.metadata.SchemaBuilder): 8 uses
TupleMetadata (org.apache.drill.exec.record.metadata.TupleMetadata): 7 uses
SubOperatorTest (org.apache.drill.test.SubOperatorTest): 6 uses
Test (org.junit.Test): 6 uses
ImmutableList (com.google.common.collect.ImmutableList): 5 uses
SchemaPath (org.apache.drill.common.expression.SchemaPath): 5 uses
ValueHolder (org.apache.drill.exec.expr.holders.ValueHolder): 5 uses
IOException (java.io.IOException): 4 uses
UserException (org.apache.drill.common.exceptions.UserException): 4 uses
OriginalType (org.apache.parquet.schema.OriginalType): 4 uses
PrimitiveType (org.apache.parquet.schema.PrimitiveType): 4 uses
SQLException (java.sql.SQLException): 3 uses
DrillRuntimeException (org.apache.drill.common.exceptions.DrillRuntimeException): 3 uses
SchemaChangeException (org.apache.drill.exec.exception.SchemaChangeException): 3 uses
ExtendableRowSet (org.apache.drill.exec.physical.rowSet.RowSet.ExtendableRowSet): 3 uses