
Example 1 with LegacyTypeInformationType

Use of org.apache.flink.table.types.logical.LegacyTypeInformationType in the apache/flink project.

From class TableSchema, method validateAndCreateNameToTypeMapping:

/**
 * Creates a mapping from field name to data type; a field name may refer to a nested field. This
 * is mainly used to validate whether the rowtime attribute (which might be nested) exists in the
 * schema. While building the mapping, it also validates that there are no duplicate field names.
 *
 * <p>For example, a "f0" field of ROW type has two nested fields "q1" and "q2". Then the
 * mapping will be ["f0" -> ROW, "f0.q1" -> INT, "f0.q2" -> STRING].
 *
 * <pre>{@code
 * f0 ROW<q1 INT, q2 STRING>
 * }</pre>
 *
 * @param fieldNameToType The field-name-to-type mapping to update
 * @param fieldName Name of this field, e.g. "q1" or "q2" in the above example
 * @param fieldType Data type of this field
 * @param parentFieldName Field name of parent type, e.g. "f0" in the above example
 */
private static void validateAndCreateNameToTypeMapping(
        Map<String, LogicalType> fieldNameToType,
        String fieldName,
        LogicalType fieldType,
        String parentFieldName) {
    String fullFieldName = parentFieldName.isEmpty() ? fieldName : parentFieldName + "." + fieldName;
    LogicalType oldType = fieldNameToType.put(fullFieldName, fieldType);
    if (oldType != null) {
        throw new ValidationException("Field names must be unique. Duplicate field: '" + fullFieldName + "'");
    }
    if (isCompositeType(fieldType) && !(fieldType instanceof LegacyTypeInformationType)) {
        final List<String> fieldNames = LogicalTypeChecks.getFieldNames(fieldType);
        final List<LogicalType> fieldTypes = fieldType.getChildren();
        IntStream.range(0, fieldNames.size())
                .forEach(i -> validateAndCreateNameToTypeMapping(
                        fieldNameToType, fieldNames.get(i), fieldTypes.get(i), fullFieldName));
    }
}
Also used: LogicalType (org.apache.flink.table.types.logical.LogicalType), LegacyTypeInformationType (org.apache.flink.table.types.logical.LegacyTypeInformationType)
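The recursion above can be sketched without Flink's type classes. The following self-contained example uses a hypothetical FieldType stand-in for LogicalType (the class names FieldType and NameFlattening are invented for illustration) to show how nested fields are flattened into "parent.child" keys, matching the ["f0" -> ROW, "f0.q1" -> INT, "f0.q2" -> STRING] example from the Javadoc:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical stand-in for a logical type: a type name plus optional nested fields.
final class FieldType {
    final String name;                       // e.g. "ROW", "INT", "STRING"
    final Map<String, FieldType> children;   // empty for atomic types

    FieldType(String name, Map<String, FieldType> children) {
        this.name = name;
        this.children = children;
    }

    static FieldType atomic(String name) {
        return new FieldType(name, Map.of());
    }
}

public final class NameFlattening {
    /** Recursively registers "parent.child" keys, rejecting duplicates. */
    public static void flatten(Map<String, String> out, String fieldName,
                               FieldType type, String parent) {
        String full = parent.isEmpty() ? fieldName : parent + "." + fieldName;
        if (out.put(full, type.name) != null) {
            throw new IllegalArgumentException("Duplicate field: '" + full + "'");
        }
        type.children.forEach((name, child) -> flatten(out, name, child, full));
    }

    public static void main(String[] args) {
        Map<String, FieldType> rowFields = new LinkedHashMap<>();
        rowFields.put("q1", FieldType.atomic("INT"));
        rowFields.put("q2", FieldType.atomic("STRING"));

        Map<String, String> mapping = new LinkedHashMap<>();
        flatten(mapping, "f0", new FieldType("ROW", rowFields), "");
        System.out.println(mapping); // {f0=ROW, f0.q1=INT, f0.q2=STRING}
    }
}
```

Unlike the sketch, the real method skips recursion into LegacyTypeInformationType composites, since their field structure lives in the wrapped TypeInformation rather than in LogicalType children.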

Example 2 with LegacyTypeInformationType

Use of org.apache.flink.table.types.logical.LegacyTypeInformationType in the apache/flink project.

From class TypeMappingUtilsTest, method testCheckPhysicalLogicalTypeCompatible:

@Test
public void testCheckPhysicalLogicalTypeCompatible() {
    TableSchema tableSchema =
            TableSchema.builder()
                    .field("a", DataTypes.VARCHAR(2))
                    .field("b", DataTypes.DECIMAL(20, 2))
                    .build();
    TableSink tableSink = new TestTableSink(tableSchema);
    LegacyTypeInformationType legacyDataType = (LegacyTypeInformationType) tableSink.getConsumedDataType().getLogicalType();
    TypeInformation legacyTypeInfo = ((TupleTypeInfo) legacyDataType.getTypeInformation()).getTypeAt(1);
    DataType physicalType = TypeConversions.fromLegacyInfoToDataType(legacyTypeInfo);
    ResolvedSchema physicSchema = DataTypeUtils.expandCompositeTypeToSchema(physicalType);
    DataType[] logicalDataTypes = tableSchema.getFieldDataTypes();
    List<DataType> physicalDataTypes = physicSchema.getColumnDataTypes();
    for (int i = 0; i < logicalDataTypes.length; i++) {
        TypeMappingUtils.checkPhysicalLogicalTypeCompatible(
                physicalDataTypes.get(i).getLogicalType(),
                logicalDataTypes[i].getLogicalType(),
                "physicalField",
                "logicalField",
                false);
    }
}
Also used: TableSchema (org.apache.flink.table.api.TableSchema), DataType (org.apache.flink.table.types.DataType), TableSink (org.apache.flink.table.sinks.TableSink), ResolvedSchema (org.apache.flink.table.catalog.ResolvedSchema), LegacyTypeInformationType (org.apache.flink.table.types.logical.LegacyTypeInformationType), TypeInformation (org.apache.flink.api.common.typeinfo.TypeInformation), TupleTypeInfo (org.apache.flink.api.java.typeutils.TupleTypeInfo), Test (org.junit.Test)

Example 3 with LegacyTypeInformationType

Use of org.apache.flink.table.types.logical.LegacyTypeInformationType in the apache/flink project.

From class DecimalDivideTypeStrategy, method inferType:

@Override
public Optional<DataType> inferType(CallContext callContext) {
    final List<DataType> argumentDataTypes = callContext.getArgumentDataTypes();
    final LogicalType dividend = argumentDataTypes.get(0).getLogicalType();
    final LogicalType divisor = argumentDataTypes.get(1).getLogicalType();
    // a hack to make legacy types possible until we drop them
    if (dividend instanceof LegacyTypeInformationType) {
        return Optional.of(argumentDataTypes.get(0));
    }
    if (divisor instanceof LegacyTypeInformationType) {
        return Optional.of(argumentDataTypes.get(1));
    }
    if (!isDecimalComputation(dividend, divisor)) {
        return Optional.empty();
    }
    final DecimalType decimalType =
            LogicalTypeMerging.findDivisionDecimalType(
                    getPrecision(dividend), getScale(dividend),
                    getPrecision(divisor), getScale(divisor));
    return Optional.of(fromLogicalToDataType(decimalType));
}
Also used: DataType (org.apache.flink.table.types.DataType), TypeConversions.fromLogicalToDataType (org.apache.flink.table.types.utils.TypeConversions.fromLogicalToDataType), LogicalType (org.apache.flink.table.types.logical.LogicalType), DecimalType (org.apache.flink.table.types.logical.DecimalType), LegacyTypeInformationType (org.apache.flink.table.types.logical.LegacyTypeInformationType)
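The strategy delegates the precision/scale arithmetic to LogicalTypeMerging.findDivisionDecimalType. As a rough sketch, the SQL-Server-style rule commonly used for decimal division can be computed directly; the class and method names below (DivisionDecimalSketch, divisionType) are hypothetical, and the cap at precision 38 is a simplification that omits Flink's exact overflow-time scale adjustment:

```java
public final class DivisionDecimalSketch {
    static final int MAX_PRECISION = 38;

    /** Returns {precision, scale} for dividend DECIMAL(p1, s1) / divisor DECIMAL(p2, s2). */
    public static int[] divisionType(int p1, int s1, int p2, int s2) {
        // Common SQL rule: keep at least 6 fractional digits in the result.
        int scale = Math.max(6, s1 + p2 + 1);
        int precision = p1 - s1 + s2 + scale;
        // Simplified cap; the real merge logic also trims the scale on overflow.
        precision = Math.min(precision, MAX_PRECISION);
        return new int[] {precision, scale};
    }

    public static void main(String[] args) {
        int[] t = divisionType(20, 2, 5, 0);
        System.out.println("DECIMAL(" + t[0] + ", " + t[1] + ")"); // DECIMAL(26, 8)
    }
}
```

This also shows why the legacy-type short circuit above is needed: a LegacyTypeInformationType carries no precision or scale, so none of this arithmetic is possible and the argument type is passed through unchanged.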

Example 4 with LegacyTypeInformationType

Use of org.apache.flink.table.types.logical.LegacyTypeInformationType in the apache/flink project.

From class DecimalPlusTypeStrategy, method inferType:

@Override
public Optional<DataType> inferType(CallContext callContext) {
    final List<DataType> argumentDataTypes = callContext.getArgumentDataTypes();
    final LogicalType addend1 = argumentDataTypes.get(0).getLogicalType();
    final LogicalType addend2 = argumentDataTypes.get(1).getLogicalType();
    // a hack to make legacy types possible until we drop them
    if (addend1 instanceof LegacyTypeInformationType) {
        return Optional.of(argumentDataTypes.get(0));
    }
    if (addend2 instanceof LegacyTypeInformationType) {
        return Optional.of(argumentDataTypes.get(1));
    }
    if (!isDecimalComputation(addend1, addend2)) {
        return Optional.empty();
    }
    final DecimalType decimalType =
            LogicalTypeMerging.findAdditionDecimalType(
                    getPrecision(addend1), getScale(addend1),
                    getPrecision(addend2), getScale(addend2));
    return Optional.of(fromLogicalToDataType(decimalType));
}
Also used: DataType (org.apache.flink.table.types.DataType), TypeConversions.fromLogicalToDataType (org.apache.flink.table.types.utils.TypeConversions.fromLogicalToDataType), LogicalType (org.apache.flink.table.types.logical.LogicalType), DecimalType (org.apache.flink.table.types.logical.DecimalType), LegacyTypeInformationType (org.apache.flink.table.types.logical.LegacyTypeInformationType)
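For addition, LogicalTypeMerging.findAdditionDecimalType applies a different rule: the result keeps the larger scale and reserves one extra integer digit for a carry. A hedged sketch of that rule follows; the names AdditionDecimalSketch and additionType are hypothetical, and the cap at precision 38 again simplifies the real overflow handling:

```java
public final class AdditionDecimalSketch {
    static final int MAX_PRECISION = 38;

    /** Returns {precision, scale} for DECIMAL(p1, s1) + DECIMAL(p2, s2). */
    public static int[] additionType(int p1, int s1, int p2, int s2) {
        int scale = Math.max(s1, s2);
        // Widest integer part of either operand, plus one digit for a carry.
        int precision = Math.max(p1 - s1, p2 - s2) + scale + 1;
        precision = Math.min(precision, MAX_PRECISION);
        return new int[] {precision, scale};
    }

    public static void main(String[] args) {
        int[] t = additionType(20, 2, 5, 0);
        System.out.println("DECIMAL(" + t[0] + ", " + t[1] + ")"); // DECIMAL(21, 2)
    }
}
```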

Example 5 with LegacyTypeInformationType

Use of org.apache.flink.table.types.logical.LegacyTypeInformationType in the apache/flink project.

From class CastInputTypeStrategy, method inferInputTypes:

@Override
public Optional<List<DataType>> inferInputTypes(CallContext callContext, boolean throwOnFailure) {
    // check for type literal
    if (!callContext.isArgumentLiteral(1)
            || !callContext.getArgumentValue(1, DataType.class).isPresent()) {
        return Optional.empty();
    }
    final List<DataType> argumentDataTypes = callContext.getArgumentDataTypes();
    final LogicalType fromType = argumentDataTypes.get(0).getLogicalType();
    final LogicalType toType = argumentDataTypes.get(1).getLogicalType();
    // A hack to support legacy types. To be removed when we drop the legacy types.
    if (fromType instanceof LegacyTypeInformationType) {
        return Optional.of(argumentDataTypes);
    }
    if (!supportsExplicitCast(fromType, toType)) {
        if (throwOnFailure) {
            throw callContext.newValidationError("Unsupported cast from '%s' to '%s'.", fromType, toType);
        }
        return Optional.empty();
    }
    return Optional.of(argumentDataTypes);
}
Also used: DataType (org.apache.flink.table.types.DataType), LogicalType (org.apache.flink.table.types.logical.LogicalType), LegacyTypeInformationType (org.apache.flink.table.types.logical.LegacyTypeInformationType)

Aggregations

Usage counts across the matched sources:

LegacyTypeInformationType (org.apache.flink.table.types.logical.LegacyTypeInformationType): 15
DataType (org.apache.flink.table.types.DataType): 13
LogicalType (org.apache.flink.table.types.logical.LogicalType): 13
DecimalType (org.apache.flink.table.types.logical.DecimalType): 6
TypeConversions.fromLogicalToDataType (org.apache.flink.table.types.utils.TypeConversions.fromLogicalToDataType): 6
TypeInformation (org.apache.flink.api.common.typeinfo.TypeInformation): 3
BigDecimalTypeInfo (org.apache.flink.table.runtime.typeutils.BigDecimalTypeInfo): 2
DecimalDataTypeInfo (org.apache.flink.table.runtime.typeutils.DecimalDataTypeInfo): 2
BigDecimal (java.math.BigDecimal): 1
Timestamp (java.sql.Timestamp): 1
LocalDateTime (java.time.LocalDateTime): 1
ExecutionConfig (org.apache.flink.api.common.ExecutionConfig): 1
BasicArrayTypeInfo (org.apache.flink.api.common.typeinfo.BasicArrayTypeInfo): 1
CompositeType (org.apache.flink.api.common.typeutils.CompositeType): 1
Tuple (org.apache.flink.api.java.tuple.Tuple): 1
Tuple2 (org.apache.flink.api.java.tuple.Tuple2): 1
PojoTypeInfo (org.apache.flink.api.java.typeutils.PojoTypeInfo): 1
TupleTypeInfo (org.apache.flink.api.java.typeutils.TupleTypeInfo): 1
TupleTypeInfoBase (org.apache.flink.api.java.typeutils.TupleTypeInfoBase): 1
TableSchema (org.apache.flink.table.api.TableSchema): 1