Example 66 with ExecutionEnvironment

Use of org.apache.flink.api.java.ExecutionEnvironment in project flink by apache.

Class ReduceITCase, method testAllReduceForCustomTypes:

@Test
public void testAllReduceForCustomTypes() throws Exception {
    /*
     * All-reduce for custom types
     */
    final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
    DataSet<CustomType> ds = CollectionDataSets.getCustomTypeDataSet(env);
    DataSet<CustomType> reduceDs = ds.reduce(new AllAddingCustomTypeReduce());
    List<CustomType> result = reduceDs.collect();
    String expected = "91,210,Hello!";
    compareResultAsText(result, expected);
}
Also used: CustomType (org.apache.flink.test.javaApiOperators.util.CollectionDataSets.CustomType), ExecutionEnvironment (org.apache.flink.api.java.ExecutionEnvironment), Test (org.junit.Test)
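
The reduce function referenced above, AllAddingCustomTypeReduce, is not reproduced on this page. Judging from the expected output "91,210,Hello!" (CustomType prints as "myInt,myLong,myString"), it presumably sums the numeric fields and emits a constant string. A minimal sketch along those lines, assuming CustomType exposes public myInt, myLong, and myString fields; this is a guess at the shape of the class, not the test's actual code:

import org.apache.flink.api.common.functions.ReduceFunction;
import org.apache.flink.test.javaApiOperators.util.CollectionDataSets.CustomType;

// Hypothetical sketch of AllAddingCustomTypeReduce (the real implementation lives in ReduceITCase).
public static class AllAddingCustomTypeReduce implements ReduceFunction<CustomType> {

    @Override
    public CustomType reduce(CustomType in1, CustomType in2) {
        CustomType out = new CustomType();
        out.myInt = in1.myInt + in2.myInt;    // sum of all myInt values across the data set
        out.myLong = in1.myLong + in2.myLong; // sum of all myLong values across the data set
        out.myString = "Hello!";              // constant string, matching the expected "91,210,Hello!"
        return out;
    }
}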

Example 67 with ExecutionEnvironment

Use of org.apache.flink.api.java.ExecutionEnvironment in project flink by apache.

Class SampleITCase, method verifySamplerWithFraction:

private void verifySamplerWithFraction(boolean withReplacement, double fraction, long seed) throws Exception {
    final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
    FlatMapOperator<Tuple3<Integer, Long, String>, String> ds = getSourceDataSet(env);
    MapPartitionOperator<String, String> sampled = DataSetUtils.sample(ds, withReplacement, fraction, seed);
    List<String> result = sampled.collect();
    containsResultAsText(result, getSourceStrings());
}
Also used: ExecutionEnvironment (org.apache.flink.api.java.ExecutionEnvironment), Tuple3 (org.apache.flink.api.java.tuple.Tuple3)
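
DataSetUtils.sample samples each element independently, so the number of returned elements is only approximately fraction * n; the test therefore only verifies that every sampled element came from the source. A minimal, self-contained sketch of the same call outside the test harness (the class name and input data below are made up for illustration):

import java.util.List;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.utils.DataSetUtils;

public class SampleFractionSketch {

    public static void main(String[] args) throws Exception {
        final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
        // Illustrative input; the test above builds its DataSet from a helper instead.
        DataSet<String> words = env.fromElements("a", "b", "c", "d", "e", "f", "g", "h");
        // Sample roughly 50% of the elements without replacement, with a fixed seed for reproducibility.
        DataSet<String> sampled = DataSetUtils.sample(words, false, 0.5, 42L);
        List<String> result = sampled.collect();
        System.out.println(result); // a subset of the input; size varies around fraction * input size
    }
}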

Example 68 with ExecutionEnvironment

Use of org.apache.flink.api.java.ExecutionEnvironment in project flink by apache.

Class SampleITCase, method verifySamplerWithFixedSize:

private void verifySamplerWithFixedSize(boolean withReplacement, int numSamples, long seed) throws Exception {
    final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
    FlatMapOperator<Tuple3<Integer, Long, String>, String> ds = getSourceDataSet(env);
    DataSet<String> sampled = DataSetUtils.sampleWithSize(ds, withReplacement, numSamples, seed);
    List<String> result = sampled.collect();
    assertEquals(numSamples, result.size());
    containsResultAsText(result, getSourceStrings());
}
Also used: ExecutionEnvironment (org.apache.flink.api.java.ExecutionEnvironment), Tuple3 (org.apache.flink.api.java.tuple.Tuple3)
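
Unlike the fraction-based variant, DataSetUtils.sampleWithSize is designed to return exactly numSamples elements (when the input is at least that large), which is why the test can assert the result size directly. A short standalone sketch, again with made-up class name and input:

import java.util.List;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.utils.DataSetUtils;

public class SampleWithSizeSketch {

    public static void main(String[] args) throws Exception {
        final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
        DataSet<Integer> numbers = env.fromElements(1, 2, 3, 4, 5, 6, 7, 8, 9, 10);
        // Draw exactly 3 elements without replacement, using a fixed seed.
        DataSet<Integer> sampled = DataSetUtils.sampleWithSize(numbers, false, 3, 42L);
        List<Integer> result = sampled.collect();
        System.out.println(result.size()); // 3, regardless of the random draw
    }
}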

Example 69 with ExecutionEnvironment

Use of org.apache.flink.api.java.ExecutionEnvironment in project flink by apache.

Class SortPartitionITCase, method testSortPartitionByFieldExpression:

@SuppressWarnings({ "rawtypes", "unchecked" })
@Test
public void testSortPartitionByFieldExpression() throws Exception {
    /*
     * Test sort partition on field expression
     */
    final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
    env.setParallelism(4);
    DataSet<Tuple3<Integer, Long, String>> ds = CollectionDataSets.get3TupleDataSet(env);
    List<Tuple1<Boolean>> result =
            ds.map(new IdMapper())
                    .setParallelism(4) // parallelize input
                    .sortPartition("f1", Order.DESCENDING)
                    .mapPartition(new OrderCheckMapper<>(new Tuple3Checker()))
                    .distinct()
                    .collect();
    String expected = "(true)\n";
    compareResultAsText(result, expected);
}
Also used: ExecutionEnvironment (org.apache.flink.api.java.ExecutionEnvironment), Tuple1 (org.apache.flink.api.java.tuple.Tuple1), Tuple3 (org.apache.flink.api.java.tuple.Tuple3), Test (org.junit.Test)
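
Tuple3Checker and OrderCheckMapper are helpers defined in SortPartitionITCase and not shown here. The idea is that the mapPartition step emits one Tuple1<Boolean> per partition saying whether that partition was locally sorted, and distinct() collapses those flags to a single "(true)" when every partition is in order. A hypothetical sketch of such helpers (the checker interface and field access are assumptions, not the test's actual code):

import org.apache.flink.api.common.functions.MapPartitionFunction;
import org.apache.flink.api.java.tuple.Tuple1;
import org.apache.flink.api.java.tuple.Tuple3;
import org.apache.flink.util.Collector;

// Assumed checker contract: decides whether two consecutive elements are in the expected order.
interface OrderChecker<T> extends java.io.Serializable {
    boolean inOrder(T previous, T current);
}

// For sortPartition("f1", Order.DESCENDING): f1 must be non-increasing within a partition.
class Tuple3Checker implements OrderChecker<Tuple3<Integer, Long, String>> {

    @Override
    public boolean inOrder(Tuple3<Integer, Long, String> previous, Tuple3<Integer, Long, String> current) {
        return previous.f1 >= current.f1;
    }
}

// Emits one Boolean per partition: true if all consecutive pairs satisfy the checker.
class OrderCheckMapper<T> implements MapPartitionFunction<T, Tuple1<Boolean>> {

    private final OrderChecker<T> checker;

    OrderCheckMapper(OrderChecker<T> checker) {
        this.checker = checker;
    }

    @Override
    public void mapPartition(Iterable<T> values, Collector<Tuple1<Boolean>> out) {
        T previous = null;
        boolean ordered = true;
        for (T value : values) {
            if (previous != null && !checker.inOrder(previous, value)) {
                ordered = false;
                break;
            }
            previous = value;
        }
        out.collect(Tuple1.of(ordered)); // distinct() later reduces these flags to "(true)" when all hold
    }
}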

Example 70 with ExecutionEnvironment

Use of org.apache.flink.api.java.ExecutionEnvironment in project flink by apache.

Class SortPartitionITCase, method testSortPartitionPojoByNestedFieldExpression:

@Test
public void testSortPartitionPojoByNestedFieldExpression() throws Exception {
    /*
     * Test sort partition on a nested POJO field expression
     */
    final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
    env.setParallelism(3);
    DataSet<POJO> ds = CollectionDataSets.getMixedPojoDataSet(env);
    List<Tuple1<Boolean>> result =
            ds.map(new IdMapper<POJO>())
                    .setParallelism(1) // parallelize input
                    .sortPartition("nestedTupleWithCustom.f1.myString", Order.ASCENDING)
                    .sortPartition("number", Order.DESCENDING)
                    .mapPartition(new OrderCheckMapper<>(new PojoChecker()))
                    .distinct()
                    .collect();
    String expected = "(true)\n";
    compareResultAsText(result, expected);
}
Also used: POJO (org.apache.flink.test.javaApiOperators.util.CollectionDataSets.POJO), ExecutionEnvironment (org.apache.flink.api.java.ExecutionEnvironment), Tuple1 (org.apache.flink.api.java.tuple.Tuple1), Test (org.junit.Test)
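
IdMapper, used by both sort-partition tests, is likewise not shown; its only purpose is to give the pipeline an operator on which setParallelism(...) can be applied to the input side. A plausible sketch (an identity MapFunction, assumed rather than copied from the test):

import org.apache.flink.api.common.functions.MapFunction;

// Assumed shape of IdMapper: passes every element through unchanged.
public static class IdMapper<T> implements MapFunction<T, T> {

    @Override
    public T map(T value) throws Exception {
        return value;
    }
}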

Aggregations

ExecutionEnvironment (org.apache.flink.api.java.ExecutionEnvironment): 1247
Test (org.junit.Test): 1090
Tuple2 (org.apache.flink.api.java.tuple.Tuple2): 374
Tuple3 (org.apache.flink.api.java.tuple.Tuple3): 264
Plan (org.apache.flink.api.common.Plan): 238
Tuple5 (org.apache.flink.api.java.tuple.Tuple5): 236
OptimizedPlan (org.apache.flink.optimizer.plan.OptimizedPlan): 199
SinkPlanNode (org.apache.flink.optimizer.plan.SinkPlanNode): 139
InvalidProgramException (org.apache.flink.api.common.InvalidProgramException): 138
Vertex (org.apache.flink.graph.Vertex): 93
SingleInputPlanNode (org.apache.flink.optimizer.plan.SingleInputPlanNode): 73
Edge (org.apache.flink.graph.Edge): 70
DualInputPlanNode (org.apache.flink.optimizer.plan.DualInputPlanNode): 66
ArrayList (java.util.ArrayList): 57
Tuple1 (org.apache.flink.api.java.tuple.Tuple1): 49
SourcePlanNode (org.apache.flink.optimizer.plan.SourcePlanNode): 44
DiscardingOutputFormat (org.apache.flink.api.java.io.DiscardingOutputFormat): 39
BatchTableEnvironment (org.apache.flink.table.api.java.BatchTableEnvironment): 38
FieldSet (org.apache.flink.api.common.operators.util.FieldSet): 37
JobGraphGenerator (org.apache.flink.optimizer.plantranslate.JobGraphGenerator): 35