
Example 26 with KeyedOneInputStreamOperatorTestHarness

use of org.apache.flink.streaming.util.KeyedOneInputStreamOperatorTestHarness in project flink by apache.

the class EvictingWindowOperatorTest method testDeltaEvictorEvictAfter.

/**
 * Tests the DeltaEvictor in evictAfter mode: eviction happens after the window function has fired.
 */
@Test
public void testDeltaEvictorEvictAfter() throws Exception {
    AtomicInteger closeCalled = new AtomicInteger(0);
    final int TRIGGER_COUNT = 2;
    final boolean EVICT_AFTER = true;
    final int THRESHOLD = 2;
    TypeInformation<Tuple2<String, Integer>> inputType = TypeInfoParser.parse("Tuple2<String, Integer>");
    @SuppressWarnings({ "unchecked", "rawtypes" }) TypeSerializer<StreamRecord<Tuple2<String, Integer>>> streamRecordSerializer = (TypeSerializer<StreamRecord<Tuple2<String, Integer>>>) new StreamElementSerializer(inputType.createSerializer(new ExecutionConfig()));
    ListStateDescriptor<StreamRecord<Tuple2<String, Integer>>> stateDesc = new ListStateDescriptor<>("window-contents", streamRecordSerializer);
    EvictingWindowOperator<String, Tuple2<String, Integer>, Tuple2<String, Integer>, GlobalWindow> operator = new EvictingWindowOperator<>(
            GlobalWindows.create(),
            new GlobalWindow.Serializer(),
            new TupleKeySelector(),
            BasicTypeInfo.STRING_TYPE_INFO.createSerializer(new ExecutionConfig()),
            stateDesc,
            new InternalIterableWindowFunction<>(new RichSumReducer<GlobalWindow>(closeCalled)),
            CountTrigger.of(TRIGGER_COUNT),
            DeltaEvictor.of(THRESHOLD, new DeltaFunction<Tuple2<String, Integer>>() {

        @Override
        public double getDelta(Tuple2<String, Integer> oldDataPoint, Tuple2<String, Integer> newDataPoint) {
            return newDataPoint.f1 - oldDataPoint.f1;
        }
    }, EVICT_AFTER), 0, null);
    OneInputStreamOperatorTestHarness<Tuple2<String, Integer>, Tuple2<String, Integer>> testHarness = new KeyedOneInputStreamOperatorTestHarness<>(operator, new TupleKeySelector(), BasicTypeInfo.STRING_TYPE_INFO);
    long initialTime = 0L;
    ConcurrentLinkedQueue<Object> expectedOutput = new ConcurrentLinkedQueue<>();
    testHarness.open();
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key2", 1), initialTime + 3000));
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key2", 4), initialTime + 3999));
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key1", 1), initialTime + 20));
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key1", 1), initialTime));
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key1", 5), initialTime + 999));
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key2", 5), initialTime + 1998));
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key2", 6), initialTime + 1999));
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key2", 1), initialTime + 1000));
    expectedOutput.add(new StreamRecord<>(new Tuple2<>("key2", 5), Long.MAX_VALUE));
    expectedOutput.add(new StreamRecord<>(new Tuple2<>("key2", 15), Long.MAX_VALUE));
    expectedOutput.add(new StreamRecord<>(new Tuple2<>("key1", 2), Long.MAX_VALUE));
    TestHarnessUtil.assertOutputEqualsSorted("Output was not correct.", expectedOutput, testHarness.getOutput(), new ResultSortComparator());
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key1", 9), initialTime + 10999));
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key2", 10), initialTime + 1000));
    expectedOutput.add(new StreamRecord<>(new Tuple2<>("key1", 16), Long.MAX_VALUE));
    expectedOutput.add(new StreamRecord<>(new Tuple2<>("key2", 22), Long.MAX_VALUE));
    TestHarnessUtil.assertOutputEqualsSorted("Output was not correct.", expectedOutput, testHarness.getOutput(), new ResultSortComparator());
    testHarness.close();
    Assert.assertEquals("Close was not called.", 1, closeCalled.get());
}
Also used : ListStateDescriptor(org.apache.flink.api.common.state.ListStateDescriptor) ExecutionConfig(org.apache.flink.api.common.ExecutionConfig) KeyedOneInputStreamOperatorTestHarness(org.apache.flink.streaming.util.KeyedOneInputStreamOperatorTestHarness) TypeSerializer(org.apache.flink.api.common.typeutils.TypeSerializer) StreamElementSerializer(org.apache.flink.streaming.runtime.streamrecord.StreamElementSerializer) StreamRecord(org.apache.flink.streaming.runtime.streamrecord.StreamRecord) DeltaFunction(org.apache.flink.streaming.api.functions.windowing.delta.DeltaFunction) AtomicInteger(java.util.concurrent.atomic.AtomicInteger) Tuple2(org.apache.flink.api.java.tuple.Tuple2) ConcurrentLinkedQueue(java.util.concurrent.ConcurrentLinkedQueue) GlobalWindow(org.apache.flink.streaming.api.windowing.windows.GlobalWindow) Test(org.junit.Test)
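
With evictAfter = true the fired result still contains every buffered element; eviction only trims the state kept for later firings, which is why the first key2 firing above emits 1 + 4 = 5 while the evictBefore variant in Example 28 emits 4. For reference, a minimal sketch of the eviction rule this test exercises, assuming DeltaEvictor's documented behavior of comparing each buffered element against the last one and evicting when the delta reaches the threshold (the class and method names below are illustrative, not Flink API):

import java.util.ArrayList;
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

import org.apache.flink.streaming.api.functions.windowing.delta.DeltaFunction;

public class DeltaEvictionSketch {

    // Illustrative re-statement of the rule: evict every element whose delta to the
    // last buffered element is >= threshold. Not the actual DeltaEvictor implementation.
    static <T> void evict(List<T> windowBuffer, DeltaFunction<T> deltaFunction, double threshold) {
        if (windowBuffer.isEmpty()) {
            return;
        }
        T lastElement = windowBuffer.get(windowBuffer.size() - 1);
        for (Iterator<T> it = windowBuffer.iterator(); it.hasNext(); ) {
            if (deltaFunction.getDelta(it.next(), lastElement) >= threshold) {
                it.remove();
            }
        }
    }

    public static void main(String[] args) {
        List<Integer> buffer = new ArrayList<>(Arrays.asList(1, 4));
        evict(buffer, new DeltaFunction<Integer>() {
            @Override
            public double getDelta(Integer oldDataPoint, Integer newDataPoint) {
                return newDataPoint - oldDataPoint;
            }
        }, 2);
        System.out.println(buffer); // prints [4]: delta(1, 4) = 3 >= 2, so 1 is evicted
    }
}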

Example 27 with KeyedOneInputStreamOperatorTestHarness

use of org.apache.flink.streaming.util.KeyedOneInputStreamOperatorTestHarness in project flink by apache.

the class EvictingWindowOperatorTest method testTimeEvictorNoTimestamp.

/**
 * Tests the TimeEvictor when the StreamRecords carry no timestamp information:
 * no element should be evicted from the window.
 */
@Test
public void testTimeEvictorNoTimestamp() throws Exception {
    AtomicInteger closeCalled = new AtomicInteger(0);
    final int TRIGGER_COUNT = 2;
    final boolean EVICT_AFTER = true;
    TypeInformation<Tuple2<String, Integer>> inputType = TypeInfoParser.parse("Tuple2<String, Integer>");
    @SuppressWarnings({ "unchecked", "rawtypes" }) TypeSerializer<StreamRecord<Tuple2<String, Integer>>> streamRecordSerializer = (TypeSerializer<StreamRecord<Tuple2<String, Integer>>>) new StreamElementSerializer(inputType.createSerializer(new ExecutionConfig()));
    ListStateDescriptor<StreamRecord<Tuple2<String, Integer>>> stateDesc = new ListStateDescriptor<>("window-contents", streamRecordSerializer);
    EvictingWindowOperator<String, Tuple2<String, Integer>, Tuple2<String, Integer>, GlobalWindow> operator = new EvictingWindowOperator<>(
            GlobalWindows.create(),
            new GlobalWindow.Serializer(),
            new TupleKeySelector(),
            BasicTypeInfo.STRING_TYPE_INFO.createSerializer(new ExecutionConfig()),
            stateDesc,
            new InternalIterableWindowFunction<>(new RichSumReducer<GlobalWindow>(closeCalled)),
            CountTrigger.of(TRIGGER_COUNT),
            TimeEvictor.of(Time.seconds(2), EVICT_AFTER),
            0,
            null);
    OneInputStreamOperatorTestHarness<Tuple2<String, Integer>, Tuple2<String, Integer>> testHarness = new KeyedOneInputStreamOperatorTestHarness<>(operator, new TupleKeySelector(), BasicTypeInfo.STRING_TYPE_INFO);
    ConcurrentLinkedQueue<Object> expectedOutput = new ConcurrentLinkedQueue<>();
    testHarness.open();
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key2", 1)));
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key2", 1)));
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key1", 1)));
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key1", 1)));
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key1", 1)));
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key2", 1)));
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key2", 1)));
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key2", 1)));
    expectedOutput.add(new StreamRecord<>(new Tuple2<>("key2", 2), Long.MAX_VALUE));
    expectedOutput.add(new StreamRecord<>(new Tuple2<>("key1", 2), Long.MAX_VALUE));
    expectedOutput.add(new StreamRecord<>(new Tuple2<>("key2", 4), Long.MAX_VALUE));
    TestHarnessUtil.assertOutputEqualsSorted("Output was not correct.", expectedOutput, testHarness.getOutput(), new ResultSortComparator());
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key1", 1)));
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key2", 1)));
    expectedOutput.add(new StreamRecord<>(new Tuple2<>("key1", 4), Long.MAX_VALUE));
    expectedOutput.add(new StreamRecord<>(new Tuple2<>("key2", 6), Long.MAX_VALUE));
    TestHarnessUtil.assertOutputEqualsSorted("Output was not correct.", expectedOutput, testHarness.getOutput(), new ResultSortComparator());
    testHarness.close();
    Assert.assertEquals("Close was not called.", 1, closeCalled.get());
}
Also used : ListStateDescriptor(org.apache.flink.api.common.state.ListStateDescriptor) ExecutionConfig(org.apache.flink.api.common.ExecutionConfig) KeyedOneInputStreamOperatorTestHarness(org.apache.flink.streaming.util.KeyedOneInputStreamOperatorTestHarness) TypeSerializer(org.apache.flink.api.common.typeutils.TypeSerializer) StreamElementSerializer(org.apache.flink.streaming.runtime.streamrecord.StreamElementSerializer) StreamRecord(org.apache.flink.streaming.runtime.streamrecord.StreamRecord) AtomicInteger(java.util.concurrent.atomic.AtomicInteger) Tuple2(org.apache.flink.api.java.tuple.Tuple2) ConcurrentLinkedQueue(java.util.concurrent.ConcurrentLinkedQueue) GlobalWindow(org.apache.flink.streaming.api.windowing.windows.GlobalWindow) Test(org.junit.Test)
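
Because the records above are created without timestamps, the TimeEvictor has no event-time information to compare against and evicts nothing, so every element contributes to the sums. Outside the test harness the same evictor is normally wired up through the DataStream API; the following is a sketch under that assumption, with a placeholder environment and element values that are not taken from the test:

import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.windowing.assigners.GlobalWindows;
import org.apache.flink.streaming.api.windowing.evictors.TimeEvictor;
import org.apache.flink.streaming.api.windowing.time.Time;
import org.apache.flink.streaming.api.windowing.triggers.CountTrigger;

public class TimeEvictorSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        DataStream<Tuple2<String, Integer>> input = env.fromElements(
                Tuple2.of("key1", 1), Tuple2.of("key1", 1), Tuple2.of("key2", 1), Tuple2.of("key2", 1));

        input.keyBy(0)                                          // key by the String field
                .window(GlobalWindows.create())
                .trigger(CountTrigger.of(2))                    // fire every two elements per key
                .evictor(TimeEvictor.of(Time.seconds(2), true)) // evict after the window function
                .sum(1)                                         // sum the Integer field
                .print();

        env.execute("time-evictor-sketch");
    }
}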

Example 28 with KeyedOneInputStreamOperatorTestHarness

use of org.apache.flink.streaming.util.KeyedOneInputStreamOperatorTestHarness in project flink by apache.

the class EvictingWindowOperatorTest method testDeltaEvictorEvictBefore.

/**
 * Tests the DeltaEvictor in evictBefore mode: eviction happens before the window function fires.
 */
@Test
public void testDeltaEvictorEvictBefore() throws Exception {
    AtomicInteger closeCalled = new AtomicInteger(0);
    final int TRIGGER_COUNT = 2;
    final boolean EVICT_AFTER = false;
    final int THRESHOLD = 2;
    TypeInformation<Tuple2<String, Integer>> inputType = TypeInfoParser.parse("Tuple2<String, Integer>");
    @SuppressWarnings({ "unchecked", "rawtypes" }) TypeSerializer<StreamRecord<Tuple2<String, Integer>>> streamRecordSerializer = (TypeSerializer<StreamRecord<Tuple2<String, Integer>>>) new StreamElementSerializer(inputType.createSerializer(new ExecutionConfig()));
    ListStateDescriptor<StreamRecord<Tuple2<String, Integer>>> stateDesc = new ListStateDescriptor<>("window-contents", streamRecordSerializer);
    EvictingWindowOperator<String, Tuple2<String, Integer>, Tuple2<String, Integer>, GlobalWindow> operator = new EvictingWindowOperator<>(
            GlobalWindows.create(),
            new GlobalWindow.Serializer(),
            new TupleKeySelector(),
            BasicTypeInfo.STRING_TYPE_INFO.createSerializer(new ExecutionConfig()),
            stateDesc,
            new InternalIterableWindowFunction<>(new RichSumReducer<GlobalWindow>(closeCalled)),
            CountTrigger.of(TRIGGER_COUNT),
            DeltaEvictor.of(THRESHOLD, new DeltaFunction<Tuple2<String, Integer>>() {

        @Override
        public double getDelta(Tuple2<String, Integer> oldDataPoint, Tuple2<String, Integer> newDataPoint) {
            return newDataPoint.f1 - oldDataPoint.f1;
        }
    }, EVICT_AFTER), 0, null);
    OneInputStreamOperatorTestHarness<Tuple2<String, Integer>, Tuple2<String, Integer>> testHarness = new KeyedOneInputStreamOperatorTestHarness<>(operator, new TupleKeySelector(), BasicTypeInfo.STRING_TYPE_INFO);
    long initialTime = 0L;
    ConcurrentLinkedQueue<Object> expectedOutput = new ConcurrentLinkedQueue<>();
    testHarness.open();
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key2", 1), initialTime + 3000));
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key2", 4), initialTime + 3999));
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key1", 1), initialTime + 20));
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key1", 1), initialTime));
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key1", 5), initialTime + 999));
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key2", 5), initialTime + 1998));
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key2", 6), initialTime + 1999));
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key2", 1), initialTime + 1000));
    expectedOutput.add(new StreamRecord<>(new Tuple2<>("key2", 4), Long.MAX_VALUE));
    expectedOutput.add(new StreamRecord<>(new Tuple2<>("key2", 11), Long.MAX_VALUE));
    expectedOutput.add(new StreamRecord<>(new Tuple2<>("key1", 2), Long.MAX_VALUE));
    TestHarnessUtil.assertOutputEqualsSorted("Output was not correct.", expectedOutput, testHarness.getOutput(), new ResultSortComparator());
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key1", 3), initialTime + 10999));
    testHarness.processElement(new StreamRecord<>(new Tuple2<>("key2", 10), initialTime + 1000));
    expectedOutput.add(new StreamRecord<>(new Tuple2<>("key1", 8), Long.MAX_VALUE));
    expectedOutput.add(new StreamRecord<>(new Tuple2<>("key2", 10), Long.MAX_VALUE));
    TestHarnessUtil.assertOutputEqualsSorted("Output was not correct.", expectedOutput, testHarness.getOutput(), new ResultSortComparator());
    testHarness.close();
    Assert.assertEquals("Close was not called.", 1, closeCalled.get());
}
Also used : ListStateDescriptor(org.apache.flink.api.common.state.ListStateDescriptor) ExecutionConfig(org.apache.flink.api.common.ExecutionConfig) KeyedOneInputStreamOperatorTestHarness(org.apache.flink.streaming.util.KeyedOneInputStreamOperatorTestHarness) TypeSerializer(org.apache.flink.api.common.typeutils.TypeSerializer) StreamElementSerializer(org.apache.flink.streaming.runtime.streamrecord.StreamElementSerializer) StreamRecord(org.apache.flink.streaming.runtime.streamrecord.StreamRecord) DeltaFunction(org.apache.flink.streaming.api.functions.windowing.delta.DeltaFunction) AtomicInteger(java.util.concurrent.atomic.AtomicInteger) Tuple2(org.apache.flink.api.java.tuple.Tuple2) ConcurrentLinkedQueue(java.util.concurrent.ConcurrentLinkedQueue) GlobalWindow(org.apache.flink.streaming.api.windowing.windows.GlobalWindow) Test(org.junit.Test)
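
Here eviction runs before the window function, so evicted elements never reach the sum: the first key2 firing drops the 1 (its delta to the last element 4 is 3, at or above the threshold of 2) and emits 4 rather than the 5 produced in Example 26. Below is a rough sketch of equivalent DataStream wiring, assuming the two-argument DeltaEvictor.of keeps its default evict-before behavior; the method name is hypothetical and the input stream is supplied by the caller:

import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.functions.windowing.delta.DeltaFunction;
import org.apache.flink.streaming.api.windowing.assigners.GlobalWindows;
import org.apache.flink.streaming.api.windowing.evictors.DeltaEvictor;
import org.apache.flink.streaming.api.windowing.triggers.CountTrigger;

// Hypothetical helper: sums the Integer field per key, firing every two elements
// and evicting (before the sum) elements whose delta to the newest element is >= 2.
static DataStream<Tuple2<String, Integer>> sumWithDeltaEviction(DataStream<Tuple2<String, Integer>> input) {
    return input
            .keyBy(0)
            .window(GlobalWindows.create())
            .trigger(CountTrigger.of(2))
            .evictor(DeltaEvictor.of(2.0, new DeltaFunction<Tuple2<String, Integer>>() {
                @Override
                public double getDelta(Tuple2<String, Integer> oldDataPoint, Tuple2<String, Integer> newDataPoint) {
                    return newDataPoint.f1 - oldDataPoint.f1;
                }
            }))
            .sum(1);
}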

Example 29 with KeyedOneInputStreamOperatorTestHarness

use of org.apache.flink.streaming.util.KeyedOneInputStreamOperatorTestHarness in project flink by apache.

the class CEPOperatorTest method testKeyedAdvancingTimeWithoutElements.

/**
 * Tests that the internal time of a CEP operator advances only when watermarks are received. See FLINK-5033.
 */
@Test
public void testKeyedAdvancingTimeWithoutElements() throws Exception {
    final KeySelector<Event, Integer> keySelector = new TestKeySelector();
    final Event startEvent = new Event(42, "start", 1.0);
    final long watermarkTimestamp1 = 5L;
    final long watermarkTimestamp2 = 13L;
    final Map<String, Event> expectedSequence = new HashMap<>(2);
    expectedSequence.put("start", startEvent);
    OneInputStreamOperatorTestHarness<Event, Either<Tuple2<Map<String, Event>, Long>, Map<String, Event>>> harness =
            new KeyedOneInputStreamOperatorTestHarness<>(
                    new TimeoutKeyedCEPPatternOperator<>(
                            Event.createTypeSerializer(),
                            false,
                            keySelector,
                            IntSerializer.INSTANCE,
                            new NFAFactory(true),
                            true),
                    keySelector,
                    BasicTypeInfo.INT_TYPE_INFO);
    try {
        harness.setup(new KryoSerializer<>((Class<Either<Tuple2<Map<String, Event>, Long>, Map<String, Event>>>) (Object) Either.class, new ExecutionConfig()));
        harness.open();
        harness.processElement(new StreamRecord<>(startEvent, 3L));
        harness.processWatermark(new Watermark(watermarkTimestamp1));
        harness.processWatermark(new Watermark(watermarkTimestamp2));
        Queue<Object> result = harness.getOutput();
        assertEquals(3L, result.size());
        Object watermark1 = result.poll();
        assertTrue(watermark1 instanceof Watermark);
        assertEquals(watermarkTimestamp1, ((Watermark) watermark1).getTimestamp());
        Object resultObject = result.poll();
        assertTrue(resultObject instanceof StreamRecord);
        StreamRecord<Either<Tuple2<Map<String, Event>, Long>, Map<String, Event>>> streamRecord = (StreamRecord<Either<Tuple2<Map<String, Event>, Long>, Map<String, Event>>>) resultObject;
        assertTrue(streamRecord.getValue() instanceof Either.Left);
        Either.Left<Tuple2<Map<String, Event>, Long>, Map<String, Event>> left = (Either.Left<Tuple2<Map<String, Event>, Long>, Map<String, Event>>) streamRecord.getValue();
        Tuple2<Map<String, Event>, Long> leftResult = left.left();
        assertEquals(watermarkTimestamp2, (long) leftResult.f1);
        assertEquals(expectedSequence, leftResult.f0);
        Object watermark2 = result.poll();
        assertTrue(watermark2 instanceof Watermark);
        assertEquals(watermarkTimestamp2, ((Watermark) watermark2).getTimestamp());
    } finally {
        harness.close();
    }
}
Also used : HashMap(java.util.HashMap) ExecutionConfig(org.apache.flink.api.common.ExecutionConfig) KeyedOneInputStreamOperatorTestHarness(org.apache.flink.streaming.util.KeyedOneInputStreamOperatorTestHarness) Either(org.apache.flink.types.Either) StreamRecord(org.apache.flink.streaming.runtime.streamrecord.StreamRecord) Tuple2(org.apache.flink.api.java.tuple.Tuple2) Event(org.apache.flink.cep.Event) SubEvent(org.apache.flink.cep.SubEvent) Map(java.util.Map) Watermark(org.apache.flink.streaming.api.watermark.Watermark) Test(org.junit.Test)
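
Each CEP result surfaces in the harness output wrapped in an Either: with the operator configured for timeouts as above, a timed-out partial match arrives on the Left together with the timestamp at which it timed out, and a completed match arrives on the Right. A small hypothetical helper, not part of the test, that unpacks that layout:

import java.util.Map;

import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.cep.Event;
import org.apache.flink.types.Either;

// Hypothetical helper: returns the timed-out partial match, or null when the
// result is a completed match (the Right side of the Either).
static Map<String, Event> timedOutMatchOrNull(
        Either<Tuple2<Map<String, Event>, Long>, Map<String, Event>> result) {
    return result.isLeft() ? result.left().f0 : null;
}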

Example 30 with KeyedOneInputStreamOperatorTestHarness

use of org.apache.flink.streaming.util.KeyedOneInputStreamOperatorTestHarness in project beam by apache.

the class DoFnOperatorTest method testLateDroppingForStatefulFn.

@Test
public void testLateDroppingForStatefulFn() throws Exception {
    WindowingStrategy<Object, IntervalWindow> windowingStrategy = WindowingStrategy.of(FixedWindows.of(new Duration(10)));
    DoFn<Integer, String> fn = new DoFn<Integer, String>() {

        @StateId("state")
        private final StateSpec<ValueState<String>> stateSpec = StateSpecs.value(StringUtf8Coder.of());

        @ProcessElement
        public void processElement(ProcessContext context) {
            context.output(context.element().toString());
        }
    };
    WindowedValue.FullWindowedValueCoder<Integer> windowedValueCoder = WindowedValue.getFullCoder(VarIntCoder.of(), windowingStrategy.getWindowFn().windowCoder());
    TupleTag<String> outputTag = new TupleTag<>("main-output");
    DoFnOperator<Integer, String, WindowedValue<String>> doFnOperator = new DoFnOperator<>(
            fn,
            "stepName",
            windowedValueCoder,
            outputTag,
            Collections.<TupleTag<?>>emptyList(),
            new DoFnOperator.DefaultOutputManagerFactory<WindowedValue<String>>(),
            windowingStrategy,
            new HashMap<Integer, PCollectionView<?>>(), /* side-input mapping */
            Collections.<PCollectionView<?>>emptyList(), /* side inputs */
            PipelineOptionsFactory.as(FlinkPipelineOptions.class),
            VarIntCoder.of());
    OneInputStreamOperatorTestHarness<WindowedValue<Integer>, WindowedValue<String>> testHarness =
            new KeyedOneInputStreamOperatorTestHarness<>(
                    doFnOperator,
                    new KeySelector<WindowedValue<Integer>, Integer>() {

        @Override
        public Integer getKey(WindowedValue<Integer> integerWindowedValue) throws Exception {
            return integerWindowedValue.getValue();
        }
    }, new CoderTypeInformation<>(VarIntCoder.of()));
    testHarness.open();
    testHarness.processWatermark(0);
    IntervalWindow window1 = new IntervalWindow(new Instant(0), Duration.millis(10));
    // this should not be late
    testHarness.processElement(new StreamRecord<>(WindowedValue.of(13, new Instant(0), window1, PaneInfo.NO_FIRING)));
    assertThat(this.<String>stripStreamRecordFromWindowedValue(testHarness.getOutput()), contains(WindowedValue.of("13", new Instant(0), window1, PaneInfo.NO_FIRING)));
    testHarness.getOutput().clear();
    testHarness.processWatermark(9);
    // this should still not be considered late
    testHarness.processElement(new StreamRecord<>(WindowedValue.of(17, new Instant(0), window1, PaneInfo.NO_FIRING)));
    assertThat(this.<String>stripStreamRecordFromWindowedValue(testHarness.getOutput()), contains(WindowedValue.of("17", new Instant(0), window1, PaneInfo.NO_FIRING)));
    testHarness.getOutput().clear();
    testHarness.processWatermark(10);
    // this should now be considered late
    testHarness.processElement(new StreamRecord<>(WindowedValue.of(17, new Instant(0), window1, PaneInfo.NO_FIRING)));
    assertThat(this.<String>stripStreamRecordFromWindowedValue(testHarness.getOutput()), emptyIterable());
    testHarness.close();
}
Also used : TupleTag(org.apache.beam.sdk.values.TupleTag) FlinkPipelineOptions(org.apache.beam.runners.flink.FlinkPipelineOptions) DoFnOperator(org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator) KeyedOneInputStreamOperatorTestHarness(org.apache.flink.streaming.util.KeyedOneInputStreamOperatorTestHarness) StateSpec(org.apache.beam.sdk.state.StateSpec) WindowedValue(org.apache.beam.sdk.util.WindowedValue) IntervalWindow(org.apache.beam.sdk.transforms.windowing.IntervalWindow) Instant(org.joda.time.Instant) Duration(org.joda.time.Duration) PCollectionView(org.apache.beam.sdk.values.PCollectionView) DoFn(org.apache.beam.sdk.transforms.DoFn) Test(org.junit.Test)
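
The watermark progression is what decides dropping here: the element timestamp 0 falls into the window [0, 10), so it is on time while the input watermark sits at 0 or 9 and becomes late once the watermark reaches 10. A minimal sketch of that rule, not Beam's actual implementation, and assuming the default allowed lateness of zero that this test leaves in place:

import org.apache.beam.sdk.transforms.windowing.IntervalWindow;
import org.joda.time.Instant;

// Sketch only: an element is treated as late once the input watermark has passed
// the maximum timestamp of its window (no allowed lateness configured).
static boolean isLate(IntervalWindow window, long inputWatermarkMillis) {
    return window.maxTimestamp().isBefore(new Instant(inputWatermarkMillis));
}

// For new IntervalWindow(new Instant(0), Duration.millis(10)), maxTimestamp() is 9:
// isLate(window, 9) is false (the element is kept), isLate(window, 10) is true (it is dropped).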

Aggregations

KeyedOneInputStreamOperatorTestHarness (org.apache.flink.streaming.util.KeyedOneInputStreamOperatorTestHarness): 55 uses
Test (org.junit.Test): 54 uses
ConcurrentLinkedQueue (java.util.concurrent.ConcurrentLinkedQueue): 43 uses
Tuple2 (org.apache.flink.api.java.tuple.Tuple2): 42 uses
ExecutionConfig (org.apache.flink.api.common.ExecutionConfig): 41 uses
Watermark (org.apache.flink.streaming.api.watermark.Watermark): 36 uses
AtomicInteger (java.util.concurrent.atomic.AtomicInteger): 31 uses
TimeWindow (org.apache.flink.streaming.api.windowing.windows.TimeWindow): 28 uses
ListStateDescriptor (org.apache.flink.api.common.state.ListStateDescriptor): 18 uses
ReducingStateDescriptor (org.apache.flink.api.common.state.ReducingStateDescriptor): 17 uses
StreamRecord (org.apache.flink.streaming.runtime.streamrecord.StreamRecord): 17 uses
Tuple3 (org.apache.flink.api.java.tuple.Tuple3): 14 uses
OperatorStateHandles (org.apache.flink.streaming.runtime.tasks.OperatorStateHandles): 14 uses
TypeSerializer (org.apache.flink.api.common.typeutils.TypeSerializer): 9 uses
GlobalWindow (org.apache.flink.streaming.api.windowing.windows.GlobalWindow): 9 uses
StreamElementSerializer (org.apache.flink.streaming.runtime.streamrecord.StreamElementSerializer): 9 uses
PassThroughWindowFunction (org.apache.flink.streaming.api.functions.windowing.PassThroughWindowFunction): 8 uses
KeySelector (org.apache.flink.api.java.functions.KeySelector): 4 uses
PrepareForTest (org.powermock.core.classloader.annotations.PrepareForTest): 4 uses
Map (java.util.Map): 3 uses