
Example 26 with InvalidProgramException

Use of org.apache.flink.api.common.InvalidProgramException in project flink by apache.

From the class CustomPartitioningTest, method testPartitionKeySelectorInvalidType:

@Test
public void testPartitionKeySelectorInvalidType() {
    try {
        final Partitioner<Integer> part = (Partitioner<Integer>) (Partitioner<?>) new TestPartitionerLong();
        final int parallelism = 4;
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
        env.setParallelism(parallelism);
        DataSet<Pojo> data = env.fromElements(new Pojo()).rebalance();
        try {
            data.partitionCustom(part, new TestKeySelectorInt<Pojo>());
            fail("Should throw an exception");
        } catch (InvalidProgramException e) {
            // expected
        }
    } catch (Exception e) {
        e.printStackTrace();
        fail(e.getMessage());
    }
}
Also used: ExecutionEnvironment(org.apache.flink.api.java.ExecutionEnvironment), InvalidProgramException(org.apache.flink.api.common.InvalidProgramException), Partitioner(org.apache.flink.api.common.functions.Partitioner), Test(org.junit.Test)
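The check this test exercises, rejecting a partitioner whose key type does not match the key selector's type, can be sketched outside Flink as a plain runtime type comparison. This is a minimal, Flink-free illustration: the class names, the `checkKeyType` method, and the exception stand-in are assumptions for this sketch, not Flink API.

```java
import java.util.Objects;

// Hypothetical stand-in for org.apache.flink.api.common.InvalidProgramException.
class InvalidProgramException extends RuntimeException {
    InvalidProgramException(String msg) { super(msg); }
}

public class PartitionerTypeCheck {

    // Illustrative check: reject a partitioner whose declared key class differs
    // from the key selector's key class. The real validation lives inside Flink;
    // this method is an assumption made for illustration only.
    static void checkKeyType(Class<?> partitionerKeyType, Class<?> selectorKeyType) {
        if (!Objects.equals(partitionerKeyType, selectorKeyType)) {
            throw new InvalidProgramException(
                    "Partitioner key type " + partitionerKeyType.getName()
                            + " does not match key selector type " + selectorKeyType.getName());
        }
    }

    public static void main(String[] args) {
        boolean thrown = false;
        try {
            // A Partitioner<Long> used with an Integer key selector, as in the test above.
            checkKeyType(Long.class, Integer.class);
        } catch (InvalidProgramException e) {
            thrown = true;
        }
        System.out.println(thrown); // prints "true"
    }
}
```

The test then only has to assert that the mismatched combination throws, which is exactly the try/fail/catch shape used in the snippet above.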

Example 27 with InvalidProgramException

Use of org.apache.flink.api.common.InvalidProgramException in project flink by apache.

From the class GroupingTupleTranslationTest, method testCustomPartitioningTupleInvalidTypeSorted:

@Test
public void testCustomPartitioningTupleInvalidTypeSorted() {
    try {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
        DataSet<Tuple3<Integer, Integer, Integer>> data = env.fromElements(new Tuple3<Integer, Integer, Integer>(0, 0, 0)).rebalance().setParallelism(4);
        try {
            data.groupBy(0).sortGroup(1, Order.ASCENDING).withPartitioner(new TestPartitionerLong());
            fail("Should throw an exception");
        } catch (InvalidProgramException e) {
            // expected
        }
    } catch (Exception e) {
        e.printStackTrace();
        fail(e.getMessage());
    }
}
Also used: ExecutionEnvironment(org.apache.flink.api.java.ExecutionEnvironment), InvalidProgramException(org.apache.flink.api.common.InvalidProgramException), Tuple3(org.apache.flink.api.java.tuple.Tuple3), Test(org.junit.Test)

Example 28 with InvalidProgramException

Use of org.apache.flink.api.common.InvalidProgramException in project flink by apache.

From the class GroupingTupleTranslationTest, method testCustomPartitioningTupleRejectCompositeKey:

@Test
public void testCustomPartitioningTupleRejectCompositeKey() {
    try {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
        DataSet<Tuple3<Integer, Integer, Integer>> data = env.fromElements(new Tuple3<Integer, Integer, Integer>(0, 0, 0)).rebalance().setParallelism(4);
        try {
            data.groupBy(0, 1).withPartitioner(new TestPartitionerInt());
            fail("Should throw an exception");
        } catch (InvalidProgramException e) {
            // expected
        }
    } catch (Exception e) {
        e.printStackTrace();
        fail(e.getMessage());
    }
}
Also used: ExecutionEnvironment(org.apache.flink.api.java.ExecutionEnvironment), InvalidProgramException(org.apache.flink.api.common.InvalidProgramException), Tuple3(org.apache.flink.api.java.tuple.Tuple3), Test(org.junit.Test)
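The rejection this test expects, a custom partitioner combined with the composite key `groupBy(0, 1)`, boils down to an arity check on the key fields. The sketch below isolates that idea; the class, method name, exception stand-in, and message wording are all assumptions for illustration, not Flink's actual implementation.

```java
// Hypothetical stand-in for org.apache.flink.api.common.InvalidProgramException.
class InvalidProgramException extends RuntimeException {
    InvalidProgramException(String msg) { super(msg); }
}

public class CompositeKeyCheck {

    // Illustrative check: a custom partitioner receives one key value per record,
    // so a composite key such as groupBy(0, 1) is rejected up front.
    static void checkSingleFieldKey(int... keyFields) {
        if (keyFields.length != 1) {
            throw new InvalidProgramException(
                    "Custom partitioners can only be used with single-field keys, got "
                            + keyFields.length + " fields.");
        }
    }

    public static void main(String[] args) {
        checkSingleFieldKey(0); // single-field key: accepted
        try {
            checkSingleFieldKey(0, 1); // composite key, as in the test above
        } catch (InvalidProgramException e) {
            System.out.println("rejected"); // prints "rejected"
        }
    }
}
```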

Example 29 with InvalidProgramException

Use of org.apache.flink.api.common.InvalidProgramException in project flink by apache.

From the class This0AccessFinder, method clean:

private static void clean(Object func, ExecutionConfig.ClosureCleanerLevel level, boolean checkSerializable, Set<Object> visited) {
    if (func == null) {
        return;
    }
    if (!visited.add(func)) {
        return;
    }
    final Class<?> cls = func.getClass();
    if (ClassUtils.isPrimitiveOrWrapper(cls)) {
        return;
    }
    if (usesCustomSerialization(cls)) {
        return;
    }
    // First find the name of the "this$0" field; depending on the
    // nesting it can be "this$x"
    boolean closureAccessed = false;
    for (Field f : cls.getDeclaredFields()) {
        if (f.getName().startsWith("this$")) {
            // found a closure referencing field - now try to clean
            closureAccessed |= cleanThis0(func, cls, f.getName());
        } else {
            Object fieldObject;
            try {
                f.setAccessible(true);
                fieldObject = f.get(func);
            } catch (IllegalAccessException e) {
                throw new RuntimeException(String.format("Cannot access the %s field in class %s", f.getName(), func.getClass()));
            }
            /*
             * We should do a deep clean when we encounter an anonymous class, an inner
             * class, or a local class, but should skip classes with a custom
             * serialization method.
             *
             * There are five kinds of classes (or interfaces):
             * a) Top-level classes
             * b) Nested classes (static member classes)
             * c) Inner classes (non-static member classes)
             * d) Local classes (named classes declared within a method)
             * e) Anonymous classes
             */
            if (level == ExecutionConfig.ClosureCleanerLevel.RECURSIVE && needsRecursion(f, fieldObject)) {
                if (LOG.isDebugEnabled()) {
                    LOG.debug("Dig to clean the {}", fieldObject.getClass().getName());
                }
                clean(fieldObject, ExecutionConfig.ClosureCleanerLevel.RECURSIVE, true, visited);
            }
        }
    }
    if (checkSerializable) {
        try {
            InstantiationUtil.serializeObject(func);
        } catch (Exception e) {
            String functionType = getSuperClassOrInterfaceName(func.getClass());
            String msg = functionType == null ? (func + " is not serializable.") : ("The implementation of the " + functionType + " is not serializable.");
            if (closureAccessed) {
                msg += " The implementation accesses fields of its enclosing class, which is " + "a common reason for non-serializability. " + "A common solution is to make the function a proper (non-inner) class, or " + "a static inner class.";
            } else {
                msg += " The object probably contains or references non serializable fields.";
            }
            throw new InvalidProgramException(msg, e);
        }
    }
}
Also used: Field(java.lang.reflect.Field), InvalidProgramException(org.apache.flink.api.common.InvalidProgramException), IOException(java.io.IOException)
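The clean() method above does two separable things: it scans the declared fields for javac's synthetic "this$x" reference to the enclosing instance, and it probes the function with Java serialization so a failure surfaces as an InvalidProgramException with a precise message. Both can be sketched with plain JDK reflection and I/O. This is a simplified illustration under that assumption; unlike Flink's closure cleaner it only detects the enclosing reference rather than nulling it out.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.lang.reflect.Field;

public class ClosureCheckSketch {

    private int marker = 42;

    // Anonymous class in an instance context that reads an enclosing field,
    // so javac must generate the synthetic "this$0" reference.
    Runnable makeInner() {
        return new Runnable() {
            @Override
            public void run() {
                System.out.println(marker);
            }
        };
    }

    // True if the object's class holds a reference to its enclosing instance
    // (a synthetic field named "this$0", "this$1", ... depending on nesting).
    static boolean capturesEnclosingInstance(Object o) {
        for (Field f : o.getClass().getDeclaredFields()) {
            if (f.getName().startsWith("this$")) {
                return true;
            }
        }
        return false;
    }

    // Mirrors the serializability probe at the end of clean(): attempt a Java
    // serialization write and report failure now instead of at job submission.
    static boolean isSerializable(Object o) {
        try (ObjectOutputStream out = new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(o);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        Runnable inner = new ClosureCheckSketch().makeInner();
        System.out.println(capturesEnclosingInstance(inner)); // prints "true": holds this$0
        System.out.println(isSerializable(inner)); // prints "false": not Serializable
    }
}
```

This also explains the two-branch error message in clean(): when a "this$" field was found, the likely culprit is the captured enclosing instance; otherwise, some other referenced field is non-serializable.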

Example 30 with InvalidProgramException

Use of org.apache.flink.api.common.InvalidProgramException in project flink by apache.

From the class IterateITCase, method testCoIteration:

@Test
public void testCoIteration() throws Exception {
    int numRetries = 5;
    int timeoutScale = 1;
    for (int numRetry = 0; numRetry < numRetries; numRetry++) {
        try {
            TestSink.collected = new ArrayList<>();
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
            env.setParallelism(2);
            DataStream<String> otherSource = env.fromElements("1000", "2000").map(noOpStrMap).name("ParallelizeMap");
            ConnectedIterativeStreams<Integer, String> coIt = env.fromElements(0, 0).map(noOpIntMap).name("ParallelizeMap").iterate(2000 * timeoutScale).withFeedbackType(Types.STRING);
            try {
                coIt.keyBy(1, 2);
                fail();
            } catch (InvalidProgramException e) {
                // this is expected
            }
            DataStream<String> head = coIt.flatMap(new RichCoFlatMapFunction<Integer, String, String>() {

                private static final long serialVersionUID = 1L;

                boolean seenFromSource = false;

                @Override
                public void flatMap1(Integer value, Collector<String> out) throws Exception {
                    out.collect(((Integer) (value + 1)).toString());
                }

                @Override
                public void flatMap2(String value, Collector<String> out) throws Exception {
                    Integer intVal = Integer.valueOf(value);
                    if (intVal < 2) {
                        out.collect(((Integer) (intVal + 1)).toString());
                    }
                    if (intVal == 1000 || intVal == 2000) {
                        seenFromSource = true;
                    }
                }

                @Override
                public void close() {
                    assertTrue(seenFromSource);
                }
            });
            coIt.map(new CoMapFunction<Integer, String, String>() {

                @Override
                public String map1(Integer value) throws Exception {
                    return value.toString();
                }

                @Override
                public String map2(String value) throws Exception {
                    return value;
                }
            }).addSink(new ReceiveCheckNoOpSink<String>());
            coIt.closeWith(head.broadcast().union(otherSource));
            head.addSink(new TestSink()).setParallelism(1);
            assertEquals(1, env.getStreamGraph(false).getIterationSourceSinkPairs().size());
            env.execute();
            Collections.sort(TestSink.collected);
            assertEquals(Arrays.asList("1", "1", "2", "2", "2", "2"), TestSink.collected);
            // success
            break;
        } catch (Throwable t) {
            LOG.info("Run " + (numRetry + 1) + "/" + numRetries + " failed", t);
            if (numRetry >= numRetries - 1) {
                throw t;
            } else {
                timeoutScale *= 2;
            }
        }
    }
}
Also used: CoMapFunction(org.apache.flink.streaming.api.functions.co.CoMapFunction), InvalidProgramException(org.apache.flink.api.common.InvalidProgramException), StreamExecutionEnvironment(org.apache.flink.streaming.api.environment.StreamExecutionEnvironment), Test(org.junit.Test)
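testCoIteration wraps the whole run in a retry loop: after each failed attempt it doubles timeoutScale, and on the last attempt it rethrows instead of retrying. That pattern can be isolated from Flink as below; the class and method names are illustrative assumptions, not part of the test above.

```java
import java.util.function.IntPredicate;

public class RetryWithBackoffSketch {

    // Runs an attempt up to maxRetries times, doubling the timeout scale after
    // each failure, as testCoIteration does for its flaky iteration timeout.
    // Returns the scale at which the attempt finally succeeded.
    static int runWithRetries(int maxRetries, IntPredicate attempt) throws Exception {
        int timeoutScale = 1;
        for (int numRetry = 0; numRetry < maxRetries; numRetry++) {
            if (attempt.test(timeoutScale)) {
                return timeoutScale; // success: stop retrying
            }
            if (numRetry >= maxRetries - 1) {
                throw new Exception("All " + maxRetries + " attempts failed");
            }
            timeoutScale *= 2; // give the next attempt twice the time
        }
        throw new IllegalStateException("unreachable");
    }

    public static void main(String[] args) throws Exception {
        // Simulated flaky run that only succeeds once the scale reaches 4,
        // i.e. on the third attempt (scales 1, 2, 4).
        int scale = runWithRetries(5, s -> s >= 4);
        System.out.println(scale); // prints "4"
    }
}
```

Doubling the scale rather than retrying with a fixed timeout keeps flaky timing-dependent tests from failing repeatedly for the same reason.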

Aggregations

InvalidProgramException (org.apache.flink.api.common.InvalidProgramException): 65
Test (org.junit.Test): 38
ExecutionEnvironment (org.apache.flink.api.java.ExecutionEnvironment): 36
Tuple2 (org.apache.flink.api.java.tuple.Tuple2): 16
ArrayList (java.util.ArrayList): 12
List (java.util.List): 12
Tuple3 (org.apache.flink.api.java.tuple.Tuple3): 11
HashMap (java.util.HashMap): 9
Map (java.util.Map): 8
IOException (java.io.IOException): 7
TaskInfo (org.apache.flink.api.common.TaskInfo): 6
RuntimeUDFContext (org.apache.flink.api.common.functions.util.RuntimeUDFContext): 6
Tuple5 (org.apache.flink.api.java.tuple.Tuple5): 6
LinkedHashSet (java.util.LinkedHashSet): 4
Aggregator (org.apache.flink.api.common.aggregators.Aggregator): 4
ConvergenceCriterion (org.apache.flink.api.common.aggregators.ConvergenceCriterion): 4
KeySelector (org.apache.flink.api.java.functions.KeySelector): 4
InvalidTypesException (org.apache.flink.api.common.functions.InvalidTypesException): 3
Configuration (org.apache.flink.configuration.Configuration): 3
MetricGroup (org.apache.flink.metrics.MetricGroup): 3