
Example 26 with IdentityMapper

Use of org.apache.flink.optimizer.testfunctions.IdentityMapper in project flink by apache.

The class BranchingPlansCompilerTest, method testBranchEachContractType.

@SuppressWarnings("unchecked")
@Test
public void testBranchEachContractType() {
    try {
        // construct the plan
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
        env.setParallelism(DEFAULT_PARALLELISM);
        DataSet<Long> sourceA = env.generateSequence(0, 1);
        DataSet<Long> sourceB = env.generateSequence(0, 1);
        DataSet<Long> sourceC = env.generateSequence(0, 1);
        DataSet<Long> map1 = sourceA.map(new IdentityMapper<Long>()).name("Map 1");
        DataSet<Long> reduce1 = map1.groupBy("*").reduceGroup(new IdentityGroupReducer<Long>()).name("Reduce 1");
        DataSet<Long> join1 = sourceB.union(sourceB).union(sourceC).join(sourceC).where("*").equalTo("*").with(new IdentityJoiner<Long>()).name("Join 1");
        DataSet<Long> coGroup1 = sourceA.coGroup(sourceB).where("*").equalTo("*").with(new IdentityCoGrouper<Long>()).name("CoGroup 1");
        DataSet<Long> cross1 = reduce1.cross(coGroup1).with(new IdentityCrosser<Long>()).name("Cross 1");
        DataSet<Long> coGroup2 = cross1.coGroup(cross1).where("*").equalTo("*").with(new IdentityCoGrouper<Long>()).name("CoGroup 2");
        DataSet<Long> coGroup3 = map1.coGroup(join1).where("*").equalTo("*").with(new IdentityCoGrouper<Long>()).name("CoGroup 3");
        DataSet<Long> map2 = coGroup3.map(new IdentityMapper<Long>()).name("Map 2");
        DataSet<Long> coGroup4 = map2.coGroup(join1).where("*").equalTo("*").with(new IdentityCoGrouper<Long>()).name("CoGroup 4");
        DataSet<Long> coGroup5 = coGroup2.coGroup(coGroup1).where("*").equalTo("*").with(new IdentityCoGrouper<Long>()).name("CoGroup 5");
        DataSet<Long> coGroup6 = reduce1.coGroup(coGroup4).where("*").equalTo("*").with(new IdentityCoGrouper<Long>()).name("CoGroup 6");
        DataSet<Long> coGroup7 = coGroup5.coGroup(coGroup6).where("*").equalTo("*").with(new IdentityCoGrouper<Long>()).name("CoGroup 7");
        // Union the branched results into a single discarding sink; several of these inputs share upstream operators.
        coGroup7.union(sourceA).union(coGroup3).union(coGroup4).union(coGroup1).output(new DiscardingOutputFormat<Long>());
        Plan plan = env.createProgramPlan();
        OptimizedPlan oPlan = compileNoStats(plan);
        JobGraphGenerator jobGen = new JobGraphGenerator();
        // Compile plan to verify that no error is thrown
        jobGen.compileJobGraph(oPlan);
    } catch (Exception e) {
        e.printStackTrace();
        Assert.fail(e.getMessage());
    }
}
Also used: ExecutionEnvironment (org.apache.flink.api.java.ExecutionEnvironment), IdentityCrosser (org.apache.flink.optimizer.testfunctions.IdentityCrosser), Plan (org.apache.flink.api.common.Plan), OptimizedPlan (org.apache.flink.optimizer.plan.OptimizedPlan), IdentityMapper (org.apache.flink.optimizer.testfunctions.IdentityMapper), JobGraphGenerator (org.apache.flink.optimizer.plantranslate.JobGraphGenerator), IdentityGroupReducer (org.apache.flink.optimizer.testfunctions.IdentityGroupReducer), IdentityJoiner (org.apache.flink.optimizer.testfunctions.IdentityJoiner), IdentityCoGrouper (org.apache.flink.optimizer.testfunctions.IdentityCoGrouper), Test (org.junit.Test)
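
The Identity* helpers above do no real data transformation; they exist so the optimizer test can build plan shapes. A minimal sketch of what two of them can look like, assuming the plain DataSet API function interfaces (the actual classes in org.apache.flink.optimizer.testfunctions may extend the rich function variants instead):

import org.apache.flink.api.common.functions.JoinFunction;
import org.apache.flink.api.common.functions.MapFunction;

// Sketch only: the real test helpers live in org.apache.flink.optimizer.testfunctions.
public class IdentityFunctionsSketch {

    // Maps every record to itself, so the plan gets a map operator without any transformation logic.
    public static class IdentityMapper<T> implements MapFunction<T, T> {
        @Override
        public T map(T value) {
            return value;
        }
    }

    // Joins two records and simply forwards the left input.
    public static class IdentityJoiner<T> implements JoinFunction<T, T, T> {
        @Override
        public T join(T first, T second) {
            return first;
        }
    }
}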

Example 27 with IdentityMapper

Use of org.apache.flink.optimizer.testfunctions.IdentityMapper in project flink by apache.

The class BranchingPlansCompilerTest, method testBranchAfterIteration.

@Test
public void testBranchAfterIteration() {
    ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
    env.setParallelism(DEFAULT_PARALLELISM);
    DataSet<Long> sourceA = env.generateSequence(0, 1);
    // Bulk iteration: feed sourceA through the identity mapper for up to 10 iterations.
    IterativeDataSet<Long> loopHead = sourceA.iterate(10);
    DataSet<Long> loopTail = loopHead.map(new IdentityMapper<Long>()).name("Mapper");
    DataSet<Long> loopRes = loopHead.closeWith(loopTail);
    // Branch after the iteration: the iteration result feeds two separate sinks.
    loopRes.output(new DiscardingOutputFormat<Long>());
    loopRes.map(new IdentityMapper<Long>()).output(new DiscardingOutputFormat<Long>());
    Plan plan = env.createProgramPlan();
    try {
        compileNoStats(plan);
    } catch (Exception e) {
        e.printStackTrace();
        Assert.fail(e.getMessage());
    }
}
Also used: ExecutionEnvironment (org.apache.flink.api.java.ExecutionEnvironment), IdentityMapper (org.apache.flink.optimizer.testfunctions.IdentityMapper), Plan (org.apache.flink.api.common.Plan), OptimizedPlan (org.apache.flink.optimizer.plan.OptimizedPlan), Test (org.junit.Test)
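
Both examples hand the program to compileNoStats(plan), a helper the test inherits from Flink's CompilerTestBase. A hedged sketch of what such a helper can boil down to, assuming the flink-optimizer classes Optimizer, DataStatistics, DefaultCostEstimator and JobGraphGenerator: optimize the Plan without data statistics, then (as Example 26 also does explicitly) translate the OptimizedPlan into a JobGraph to confirm the branched plan compiles.

import org.apache.flink.api.common.Plan;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.optimizer.DataStatistics;
import org.apache.flink.optimizer.Optimizer;
import org.apache.flink.optimizer.costs.DefaultCostEstimator;
import org.apache.flink.optimizer.plan.OptimizedPlan;
import org.apache.flink.optimizer.plantranslate.JobGraphGenerator;
import org.apache.flink.runtime.jobgraph.JobGraph;

// Sketch only: CompilerTestBase wires up the optimizer in its own way.
public class PlanCompilationSketch {

    public static JobGraph compileWithoutStatistics(Plan plan) {
        // Run the optimizer with empty data statistics and the default cost estimator.
        Optimizer optimizer =
                new Optimizer(new DataStatistics(), new DefaultCostEstimator(), new Configuration());
        OptimizedPlan optimizedPlan = optimizer.compile(plan);

        // Translating to a JobGraph exercises the same path the branching tests use to
        // verify that the branched/merged plan structure is valid.
        return new JobGraphGenerator().compileJobGraph(optimizedPlan);
    }
}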

Aggregations

Plan (org.apache.flink.api.common.Plan): 27
ExecutionEnvironment (org.apache.flink.api.java.ExecutionEnvironment): 27
OptimizedPlan (org.apache.flink.optimizer.plan.OptimizedPlan): 27
IdentityMapper (org.apache.flink.optimizer.testfunctions.IdentityMapper): 27
Test (org.junit.Test): 27
SinkPlanNode (org.apache.flink.optimizer.plan.SinkPlanNode): 16
SingleInputPlanNode (org.apache.flink.optimizer.plan.SingleInputPlanNode): 15
IdentityGroupReducer (org.apache.flink.optimizer.testfunctions.IdentityGroupReducer): 9
Tuple2 (org.apache.flink.api.java.tuple.Tuple2): 7
JobGraphGenerator (org.apache.flink.optimizer.plantranslate.JobGraphGenerator): 7
ShipStrategyType (org.apache.flink.runtime.operators.shipping.ShipStrategyType): 4
DiscardingOutputFormat (org.apache.flink.api.java.io.DiscardingOutputFormat): 3
Tuple3 (org.apache.flink.api.java.tuple.Tuple3): 3
DualInputPlanNode (org.apache.flink.optimizer.plan.DualInputPlanNode): 3
NAryUnionPlanNode (org.apache.flink.optimizer.plan.NAryUnionPlanNode): 3
IdentityGroupReducerCombinable (org.apache.flink.optimizer.testfunctions.IdentityGroupReducerCombinable): 3
InvalidProgramException (org.apache.flink.api.common.InvalidProgramException): 2
DataSet (org.apache.flink.api.java.DataSet): 2
IdentityCrosser (org.apache.flink.optimizer.testfunctions.IdentityCrosser): 2
Assert (org.junit.Assert): 2