Example 6 with InvalidTypesException

Use of org.apache.flink.api.common.functions.InvalidTypesException in project flink by apache.

The class ExistingSavepoint, method readKeyedState:

/**
 * Read keyed state from an operator in a {@code Savepoint}.
 *
 * @param uid The uid of the operator.
 * @param function The {@link KeyedStateReaderFunction} that is called for each key in state.
 * @param <K> The type of the key in state.
 * @param <OUT> The output type of the transform function.
 * @return A {@code DataSet} of objects read from keyed state.
 * @throws IOException If the savepoint does not contain operator state with the given uid.
 */
public <K, OUT> DataSource<OUT> readKeyedState(String uid, KeyedStateReaderFunction<K, OUT> function) throws IOException {
    TypeInformation<K> keyTypeInfo;
    TypeInformation<OUT> outType;
    try {
        keyTypeInfo = TypeExtractor.createTypeInfo(KeyedStateReaderFunction.class, function.getClass(), 0, null, null);
    } catch (InvalidTypesException e) {
        throw new InvalidProgramException("The key type of the KeyedStateReaderFunction could not be automatically determined. Please use " + "Savepoint#readKeyedState(String, KeyedStateReaderFunction, TypeInformation, TypeInformation) instead.", e);
    }
    try {
        outType = TypeExtractor.getUnaryOperatorReturnType(function, KeyedStateReaderFunction.class, 0, 1, TypeExtractor.NO_INDEX, keyTypeInfo, Utils.getCallLocationName(), false);
    } catch (InvalidTypesException e) {
        throw new InvalidProgramException("The output type of the KeyedStateReaderFunction could not be automatically determined. Please use " + "Savepoint#readKeyedState(String, KeyedStateReaderFunction, TypeInformation, TypeInformation) instead.", e);
    }
    return readKeyedState(uid, function, keyTypeInfo, outType);
}
Also used: InvalidProgramException (org.apache.flink.api.common.InvalidProgramException), KeyedStateReaderFunction (org.apache.flink.state.api.functions.KeyedStateReaderFunction), InvalidTypesException (org.apache.flink.api.common.functions.InvalidTypesException)
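TypeExtractor.createTypeInfo can recover the key type here because a concrete (typically anonymous) subclass of a generic interface records its actual type arguments in the class file, where reflection can read them back. The following stdlib-only sketch illustrates that mechanism; ReaderFunction and typeArgumentOf are hypothetical stand-ins for KeyedStateReaderFunction and the extractor, not Flink's API:

```java
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;

public class TypeArgumentSketch {

    // Hypothetical stand-in for KeyedStateReaderFunction<K, OUT>.
    public interface ReaderFunction<K, OUT> {
        OUT read(K key);
    }

    /** Resolve the i-th type argument of the ReaderFunction interface implemented by f. */
    public static Type typeArgumentOf(ReaderFunction<?, ?> f, int i) {
        for (Type t : f.getClass().getGenericInterfaces()) {
            if (t instanceof ParameterizedType
                    && ((ParameterizedType) t).getRawType() == ReaderFunction.class) {
                return ((ParameterizedType) t).getActualTypeArguments()[i];
            }
        }
        // This is the situation the InvalidTypesException catch blocks guard against.
        throw new IllegalStateException("type arguments not recoverable (raw or lambda implementation)");
    }

    public static void main(String[] args) {
        // An anonymous class pins ReaderFunction<Long, String> into the class file.
        ReaderFunction<Long, String> f = new ReaderFunction<Long, String>() {
            @Override
            public String read(Long key) {
                return String.valueOf(key);
            }
        };
        System.out.println(typeArgumentOf(f, 0)); // class java.lang.Long
        System.out.println(typeArgumentOf(f, 1)); // class java.lang.String
    }
}
```

A lambda or a raw implementation leaves no such record, which is why the method falls back to the overload that takes explicit TypeInformation.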

Example 7 with InvalidTypesException

Use of org.apache.flink.api.common.functions.InvalidTypesException in project flink by apache.

The class SavepointReader, method readKeyedState:

/**
 * Read keyed state from an operator in a {@code Savepoint}.
 *
 * @param uid The uid of the operator.
 * @param function The {@link KeyedStateReaderFunction} that is called for each key in state.
 * @param <K> The type of the key in state.
 * @param <OUT> The output type of the transform function.
 * @return A {@code DataStream} of objects read from keyed state.
 * @throws IOException If the savepoint does not contain operator state with the given uid.
 */
public <K, OUT> DataStream<OUT> readKeyedState(String uid, KeyedStateReaderFunction<K, OUT> function) throws IOException {
    TypeInformation<K> keyTypeInfo;
    TypeInformation<OUT> outType;
    try {
        keyTypeInfo = TypeExtractor.createTypeInfo(KeyedStateReaderFunction.class, function.getClass(), 0, null, null);
    } catch (InvalidTypesException e) {
        throw new InvalidProgramException("The key type of the KeyedStateReaderFunction could not be automatically determined. Please use " + "Savepoint#readKeyedState(String, KeyedStateReaderFunction, TypeInformation, TypeInformation) instead.", e);
    }
    try {
        outType = TypeExtractor.getUnaryOperatorReturnType(function, KeyedStateReaderFunction.class, 0, 1, TypeExtractor.NO_INDEX, keyTypeInfo, Utils.getCallLocationName(), false);
    } catch (InvalidTypesException e) {
        throw new InvalidProgramException("The output type of the KeyedStateReaderFunction could not be automatically determined. Please use " + "Savepoint#readKeyedState(String, KeyedStateReaderFunction, TypeInformation, TypeInformation) instead.", e);
    }
    return readKeyedState(uid, function, keyTypeInfo, outType);
}
Also used: InvalidProgramException (org.apache.flink.api.common.InvalidProgramException), KeyedStateReaderFunction (org.apache.flink.state.api.functions.KeyedStateReaderFunction), InvalidTypesException (org.apache.flink.api.common.functions.InvalidTypesException)

Example 8 with InvalidTypesException

Use of org.apache.flink.api.common.functions.InvalidTypesException in project flink by apache.

The class Types, method POJO:

/**
 * Returns type information for a POJO (Plain Old Java Object) and allows all fields to be
 * specified manually.
 *
 * <p>A type is considered a Flink POJO type if it fulfills the conditions below.
 *
 * <ul>
 *   <li>It is a public, standalone class (not a non-static inner class).
 *   <li>It has a public no-argument constructor.
 *   <li>All non-static, non-transient fields in the class (and all superclasses) are either
 *       public (and non-final) or have a public getter and a setter method that follows the
 *       Java beans naming conventions for getters and setters.
 *   <li>It is a fixed-length, null-aware composite type with non-deterministic field order.
 *       Every field can be null independent of the field's type.
 * </ul>
 *
 * <p>The generic types for all fields of the POJO can be defined in a hierarchy of subclasses.
 *
 * <p>If Flink's type analyzer is unable to extract a POJO field, an {@link
 * org.apache.flink.api.common.functions.InvalidTypesException} is thrown.
 *
 * <p><strong>Note:</strong> In most cases the type information of fields can be determined
 * automatically; in that case we recommend using {@link Types#POJO(Class)}.
 *
 * @param pojoClass POJO class
 * @param fields map of fields that map a name to type information. The map key is the name of
 *     the field and the value is its type.
 */
public static <T> TypeInformation<T> POJO(Class<T> pojoClass, Map<String, TypeInformation<?>> fields) {
    final List<PojoField> pojoFields = new ArrayList<>(fields.size());
    for (Map.Entry<String, TypeInformation<?>> field : fields.entrySet()) {
        final Field f = TypeExtractor.getDeclaredField(pojoClass, field.getKey());
        if (f == null) {
            throw new InvalidTypesException("Field '" + field.getKey() + "' could not be accessed.");
        }
        pojoFields.add(new PojoField(f, field.getValue()));
    }
    return new PojoTypeInfo<>(pojoClass, pojoFields);
}
Also used: PojoField (org.apache.flink.api.java.typeutils.PojoField), Field (java.lang.reflect.Field), ArrayList (java.util.ArrayList), PojoTypeInfo (org.apache.flink.api.java.typeutils.PojoTypeInfo), InvalidTypesException (org.apache.flink.api.common.functions.InvalidTypesException), HashMap (java.util.HashMap), Map (java.util.Map)
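TypeExtractor.getDeclaredField differs from plain Class#getDeclaredField in that it also searches superclasses, which is why inherited POJO fields resolve here and only a genuinely absent field triggers the InvalidTypesException. A stdlib-only sketch of that lookup (findField and the sample classes are ours; only the walk-the-hierarchy idea is Flink's):

```java
import java.lang.reflect.Field;

public class FieldLookupSketch {

    public static class Base { public int id; }
    public static class Sub extends Base { public String name; }

    /** Find a declared field by name in clazz or any of its superclasses; null if absent. */
    public static Field findField(Class<?> clazz, String name) {
        for (Class<?> c = clazz; c != null; c = c.getSuperclass()) {
            for (Field f : c.getDeclaredFields()) {
                if (f.getName().equals(name)) {
                    return f;
                }
            }
        }
        return null; // Types.POJO turns this case into an InvalidTypesException
    }

    public static void main(String[] args) {
        System.out.println(findField(Sub.class, "name").getDeclaringClass().getSimpleName()); // Sub
        System.out.println(findField(Sub.class, "id").getDeclaringClass().getSimpleName());   // Base
        System.out.println(findField(Sub.class, "missing")); // null
    }
}
```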

Example 9 with InvalidTypesException

Use of org.apache.flink.api.common.functions.InvalidTypesException in project flink by apache.

The class TypeExtractorTest, method testFunctionWithMissingGenerics:

@SuppressWarnings({ "unchecked", "rawtypes" })
@Test
public void testFunctionWithMissingGenerics() {
    RichMapFunction function = new RichMapFunction() {

        private static final long serialVersionUID = 1L;

        @Override
        public String map(Object value) throws Exception {
            return null;
        }
    };
    TypeInformation<?> ti = TypeExtractor.getMapReturnTypes(function, Types.STRING, "name", true);
    Assert.assertTrue(ti instanceof MissingTypeInfo);
    try {
        TypeExtractor.getMapReturnTypes(function, Types.STRING);
        Assert.fail("Expected an exception");
    } catch (InvalidTypesException e) {
        // expected
    }
}
Also used: RichMapFunction (org.apache.flink.api.common.functions.RichMapFunction), InvalidTypesException (org.apache.flink.api.common.functions.InvalidTypesException), Test (org.junit.Test)
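The raw RichMapFunction above fails extraction because a raw supertype leaves nothing for reflection to resolve: the generic superclass of such an anonymous class is the erased Class object, not a ParameterizedType. A stdlib-only illustration of the difference, with a hypothetical MapFn standing in for RichMapFunction:

```java
import java.lang.reflect.ParameterizedType;

public class RawTypeSketch {

    // Hypothetical stand-in for RichMapFunction<IN, OUT>.
    public static abstract class MapFn<IN, OUT> {
        public abstract OUT map(IN value);
    }

    public static void main(String[] args) {
        @SuppressWarnings({"unchecked", "rawtypes"})
        MapFn raw = new MapFn() { // raw supertype: no type arguments recorded
            @Override
            public Object map(Object value) {
                return null;
            }
        };
        MapFn<Integer, String> typed = new MapFn<Integer, String>() {
            @Override
            public String map(Integer value) {
                return String.valueOf(value);
            }
        };
        // Raw subclass: the generic superclass is just the Class object for MapFn.
        System.out.println(raw.getClass().getGenericSuperclass() instanceof ParameterizedType);   // false
        // Typed subclass: the actual arguments <Integer, String> are recoverable.
        System.out.println(typed.getClass().getGenericSuperclass() instanceof ParameterizedType); // true
    }
}
```

This is why getMapReturnTypes can only produce a MissingTypeInfo (or throw InvalidTypesException) for the raw function.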

Example 10 with InvalidTypesException

Use of org.apache.flink.api.common.functions.InvalidTypesException in project flink by apache.

The class TypeExtractor, method privateGetForClass:

@SuppressWarnings({ "unchecked", "rawtypes" })
private <OUT, IN1, IN2> TypeInformation<OUT> privateGetForClass(Class<OUT> clazz, List<Type> typeHierarchy, ParameterizedType parameterizedType, TypeInformation<IN1> in1Type, TypeInformation<IN2> in2Type) {
    checkNotNull(clazz);
    // check if type information can be produced using a factory
    final TypeInformation<OUT> typeFromFactory = createTypeInfoFromFactory(clazz, typeHierarchy, in1Type, in2Type);
    if (typeFromFactory != null) {
        return typeFromFactory;
    }
    // Object is handled as generic type info
    if (clazz.equals(Object.class)) {
        return new GenericTypeInfo<>(clazz);
    }
    // Class is handled as generic type info
    if (clazz.equals(Class.class)) {
        return new GenericTypeInfo<>(clazz);
    }
    // recursive types are handled as generic type info
    if (countTypeInHierarchy(typeHierarchy, clazz) > 1) {
        return new GenericTypeInfo<>(clazz);
    }
    // check for arrays
    if (clazz.isArray()) {
        // primitive arrays: int[], byte[], ...
        PrimitiveArrayTypeInfo<OUT> primitiveArrayInfo = PrimitiveArrayTypeInfo.getInfoFor(clazz);
        if (primitiveArrayInfo != null) {
            return primitiveArrayInfo;
        }
        // basic type arrays: String[], Integer[], Double[]
        BasicArrayTypeInfo<OUT, ?> basicArrayInfo = BasicArrayTypeInfo.getInfoFor(clazz);
        if (basicArrayInfo != null) {
            return basicArrayInfo;
        } else {
            // object arrays
            TypeInformation<?> componentTypeInfo = createTypeInfoWithTypeHierarchy(typeHierarchy, clazz.getComponentType(), in1Type, in2Type);
            return ObjectArrayTypeInfo.getInfoFor(clazz, componentTypeInfo);
        }
    }
    // check for writable types
    if (isHadoopWritable(clazz)) {
        return createHadoopWritableTypeInfo(clazz);
    }
    // check for basic types
    TypeInformation<OUT> basicTypeInfo = BasicTypeInfo.getInfoFor(clazz);
    if (basicTypeInfo != null) {
        return basicTypeInfo;
    }
    // check for SQL time types
    TypeInformation<OUT> timeTypeInfo = SqlTimeTypeInfo.getInfoFor(clazz);
    if (timeTypeInfo != null) {
        return timeTypeInfo;
    }
    // check for subclasses of Value
    if (Value.class.isAssignableFrom(clazz)) {
        Class<? extends Value> valueClass = clazz.asSubclass(Value.class);
        return (TypeInformation<OUT>) ValueTypeInfo.getValueTypeInfo(valueClass);
    }
    // check for subclasses of Tuple
    if (Tuple.class.isAssignableFrom(clazz)) {
        if (clazz == Tuple0.class) {
            return new TupleTypeInfo(Tuple0.class);
        }
        throw new InvalidTypesException("Type information extraction for tuples (except Tuple0) cannot be done based on the class.");
    }
    // check for Enums
    if (Enum.class.isAssignableFrom(clazz)) {
        return new EnumTypeInfo(clazz);
    }
    // special case for POJOs generated by Avro.
    if (hasSuperclass(clazz, AVRO_SPECIFIC_RECORD_BASE_CLASS)) {
        return AvroUtils.getAvroUtils().createAvroTypeInfo(clazz);
    }
    if (Modifier.isInterface(clazz.getModifiers())) {
        // Interface has no members and is therefore not handled as POJO
        return new GenericTypeInfo<>(clazz);
    }
    try {
        Type t = parameterizedType != null ? parameterizedType : clazz;
        TypeInformation<OUT> pojoType = analyzePojo(t, new ArrayList<>(typeHierarchy), in1Type, in2Type);
        if (pojoType != null) {
            return pojoType;
        }
    } catch (InvalidTypesException e) {
        if (LOG.isDebugEnabled()) {
            LOG.debug("Unable to handle type " + clazz + " as POJO. Message: " + e.getMessage(), e);
        }
        // ignore and create generic type info
    }
    // return a generic type
    return new GenericTypeInfo<>(clazz);
}
Also used: TypeInformation (org.apache.flink.api.common.typeinfo.TypeInformation), GenericArrayType (java.lang.reflect.GenericArrayType), TypeExtractionUtils.isClassType (org.apache.flink.api.java.typeutils.TypeExtractionUtils.isClassType), Type (java.lang.reflect.Type), CompositeType (org.apache.flink.api.common.typeutils.CompositeType), ParameterizedType (java.lang.reflect.ParameterizedType), InvalidTypesException (org.apache.flink.api.common.functions.InvalidTypesException)
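Structurally, privateGetForClass is an ordered classifier: each check either returns a specific TypeInformation or falls through, and anything unmatched ends up as a generic type. This stdlib-only sketch mirrors that first-match dispatch order (the Kind enum and classify method are ours; the real method returns TypeInformation instances and covers more cases, such as Writable, Value, Tuple, and Avro types):

```java
public class ClassifySketch {

    public enum Kind { PRIMITIVE_ARRAY, OBJECT_ARRAY, BASIC, ENUM, INTERFACE, GENERIC }

    /** Classify a class in roughly the order privateGetForClass checks its cases. */
    public static Kind classify(Class<?> clazz) {
        if (clazz.isArray()) {
            // primitive arrays (int[], byte[], ...) before object arrays
            return clazz.getComponentType().isPrimitive() ? Kind.PRIMITIVE_ARRAY : Kind.OBJECT_ARRAY;
        }
        if (clazz == String.class || clazz == Integer.class || clazz.isPrimitive()) {
            return Kind.BASIC; // stand-in for BasicTypeInfo.getInfoFor
        }
        if (clazz.isEnum()) {
            return Kind.ENUM;
        }
        if (clazz.isInterface()) {
            return Kind.INTERFACE; // no members, so not handled as a POJO
        }
        return Kind.GENERIC; // fallback, like new GenericTypeInfo<>(clazz)
    }

    public static void main(String[] args) {
        System.out.println(classify(int[].class));        // PRIMITIVE_ARRAY
        System.out.println(classify(String[].class));     // OBJECT_ARRAY
        System.out.println(classify(String.class));       // BASIC
        System.out.println(classify(Thread.State.class)); // ENUM
        System.out.println(classify(Runnable.class));     // INTERFACE
        System.out.println(classify(Object.class));       // GENERIC
    }
}
```

The ordering matters in the real method too: the POJO analysis runs last among the specific checks, and an InvalidTypesException thrown there is deliberately swallowed so that the generic fallback still applies.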

Aggregations

InvalidTypesException (org.apache.flink.api.common.functions.InvalidTypesException): 31
GenericArrayType (java.lang.reflect.GenericArrayType): 19
ParameterizedType (java.lang.reflect.ParameterizedType): 19
Type (java.lang.reflect.Type): 19
CompositeType (org.apache.flink.api.common.typeutils.CompositeType): 19
TypeExtractionUtils.isClassType (org.apache.flink.api.java.typeutils.TypeExtractionUtils.isClassType): 19
ArrayList (java.util.ArrayList): 12
TypeInformation (org.apache.flink.api.common.typeinfo.TypeInformation): 12
Field (java.lang.reflect.Field): 6
TypeVariable (java.lang.reflect.TypeVariable): 6
PrimitiveArrayTypeInfo (org.apache.flink.api.common.typeinfo.PrimitiveArrayTypeInfo): 6
Method (java.lang.reflect.Method): 5
PublicEvolving (org.apache.flink.annotation.PublicEvolving): 5
BasicArrayTypeInfo (org.apache.flink.api.common.typeinfo.BasicArrayTypeInfo): 5
Tuple (org.apache.flink.api.java.tuple.Tuple): 5
TypeExtractionUtils.typeToClass (org.apache.flink.api.java.typeutils.TypeExtractionUtils.typeToClass): 5
BasicTypeInfo (org.apache.flink.api.common.typeinfo.BasicTypeInfo): 4
LambdaExecutable (org.apache.flink.api.java.typeutils.TypeExtractionUtils.LambdaExecutable): 4
SqlTimeTypeInfo (org.apache.flink.api.common.typeinfo.SqlTimeTypeInfo): 3
Constructor (java.lang.reflect.Constructor): 2