Example 76 with MockApiProcessorSupplier

Use of org.apache.kafka.test.MockApiProcessorSupplier in project kafka by apache, from the class KStreamKStreamLeftJoinTest, method testLeftExpiredNonJoinedRecordsAreEmittedByTheRightProcessor.

@Test
public void testLeftExpiredNonJoinedRecordsAreEmittedByTheRightProcessor() {
    final StreamsBuilder builder = new StreamsBuilder();
    final KStream<Integer, String> stream1;
    final KStream<Integer, String> stream2;
    final KStream<Integer, String> joined;
    final MockApiProcessorSupplier<Integer, String, Void, Void> supplier = new MockApiProcessorSupplier<>();
    stream1 = builder.stream(topic1, consumed);
    stream2 = builder.stream(topic2, consumed);
    joined = stream1.leftJoin(stream2, MockValueJoiner.TOSTRING_JOINER, JoinWindows.ofTimeDifferenceAndGrace(ofMillis(100L), ofMillis(0L)), StreamJoined.with(Serdes.Integer(), Serdes.String(), Serdes.String()));
    joined.process(supplier);
    try (final TopologyTestDriver driver = new TopologyTestDriver(builder.build(), props)) {
        final TestInputTopic<Integer, String> inputTopic1 = driver.createInputTopic(topic1, new IntegerSerializer(), new StringSerializer(), Instant.ofEpochMilli(0L), Duration.ZERO);
        final TestInputTopic<Integer, String> inputTopic2 = driver.createInputTopic(topic2, new IntegerSerializer(), new StringSerializer(), Instant.ofEpochMilli(0L), Duration.ZERO);
        final MockApiProcessor<Integer, String, Void, Void> processor = supplier.theCapturedProcessor();
        final long windowStart = 0L;
        // No joins detected; no null-joins emitted
        inputTopic1.pipeInput(0, "A0", windowStart + 1L);
        inputTopic1.pipeInput(1, "A1", windowStart + 2L);
        inputTopic1.pipeInput(0, "A0-0", windowStart + 3L);
        processor.checkAndClearProcessResult();
        // Join detected; no null-joins emitted
        inputTopic2.pipeInput(1, "a1", windowStart + 3L);
        processor.checkAndClearProcessResult(new KeyValueTimestamp<>(1, "A1+a1", windowStart + 3L));
        // Dummy record in right topic will emit expired non-joined records from the left topic
        inputTopic2.pipeInput(2, "dummy", windowStart + 401L);
        processor.checkAndClearProcessResult(new KeyValueTimestamp<>(0, "A0+null", windowStart + 1L), new KeyValueTimestamp<>(0, "A0-0+null", windowStart + 3L));
        // Flush internal non-joined state store by joining the dummy record
        inputTopic1.pipeInput(2, "dummy", windowStart + 402L);
        processor.checkAndClearProcessResult(new KeyValueTimestamp<>(2, "dummy+dummy", windowStart + 402L));
    }
}
Also used : MockApiProcessorSupplier(org.apache.kafka.test.MockApiProcessorSupplier) TopologyTestDriver(org.apache.kafka.streams.TopologyTestDriver) IntegerSerializer(org.apache.kafka.common.serialization.IntegerSerializer) StreamsBuilder(org.apache.kafka.streams.StreamsBuilder) StringSerializer(org.apache.kafka.common.serialization.StringSerializer) Test(org.junit.Test)
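
The test above references class-level fixtures (topic1, topic2, consumed, props) that are defined elsewhere in KStreamKStreamLeftJoinTest and are not shown on this page. Below is a minimal sketch of equivalent fixtures using only the public Kafka Streams API; the topic names, application id, and bootstrap address are illustrative stand-ins, not the values from the Kafka test class:

private final String topic1 = "topic1";
private final String topic2 = "topic2";
// Both input streams carry Integer keys and String values.
private final Consumed<Integer, String> consumed = Consumed.with(Serdes.Integer(), Serdes.String());
private final Properties props = buildTestConfig();

private static Properties buildTestConfig() {
    // TopologyTestDriver never contacts a broker, so a placeholder bootstrap address is enough.
    final Properties p = new Properties();
    p.put(StreamsConfig.APPLICATION_ID_CONFIG, "kstream-kstream-left-join-test");
    p.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9091");
    p.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.IntegerSerde.class);
    p.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.StringSerde.class);
    return p;
}
Assumed imports: java.util.Properties, org.apache.kafka.common.serialization.Serdes, org.apache.kafka.streams.StreamsConfig, org.apache.kafka.streams.kstream.Consumed.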

Example 77 with MockApiProcessorSupplier

Use of org.apache.kafka.test.MockApiProcessorSupplier in project kafka by apache, from the class KStreamKStreamLeftJoinTest, method testLeftJoinWithSpuriousResultFixDisabledOldApi.

@SuppressWarnings("deprecation")
@Test
public void testLeftJoinWithSpuriousResultFixDisabledOldApi() {
    final StreamsBuilder builder = new StreamsBuilder();
    final int[] expectedKeys = new int[] { 0, 1, 2, 3 };
    final KStream<Integer, String> stream1;
    final KStream<Integer, String> stream2;
    final KStream<Integer, String> joined;
    final MockApiProcessorSupplier<Integer, String, Void, Void> supplier = new MockApiProcessorSupplier<>();
    stream1 = builder.stream(topic1, consumed);
    stream2 = builder.stream(topic2, consumed);
    joined = stream1.leftJoin(stream2, MockValueJoiner.TOSTRING_JOINER, JoinWindows.of(ofMillis(100L)), StreamJoined.with(Serdes.Integer(), Serdes.String(), Serdes.String()));
    joined.process(supplier);
    try (final TopologyTestDriver driver = new TopologyTestDriver(builder.build(props), props)) {
        final TestInputTopic<Integer, String> inputTopic1 = driver.createInputTopic(topic1, new IntegerSerializer(), new StringSerializer(), Instant.ofEpochMilli(0L), Duration.ZERO);
        final TestInputTopic<Integer, String> inputTopic2 = driver.createInputTopic(topic2, new IntegerSerializer(), new StringSerializer(), Instant.ofEpochMilli(0L), Duration.ZERO);
        final MockApiProcessor<Integer, String, Void, Void> processor = supplier.theCapturedProcessor();
        // Only 2 window stores should be available
        assertEquals(2, driver.getAllStateStores().size());
        // push two items to the left stream while the right window is empty; with the fix
        // disabled, eager left-join results are emitted immediately (--> w2 = {})
        for (int i = 0; i < 2; i++) {
            inputTopic1.pipeInput(expectedKeys[i], "A" + expectedKeys[i]);
        }
        processor.checkAndClearProcessResult(new KeyValueTimestamp<>(0, "A0+null", 0L), new KeyValueTimestamp<>(1, "A1+null", 0L));
        // push two matching items to the right stream; both keys join (--> w2 = { 0:a0, 1:a1 })
        for (int i = 0; i < 2; i++) {
            inputTopic2.pipeInput(expectedKeys[i], "a" + expectedKeys[i]);
        }
        processor.checkAndClearProcessResult(new KeyValueTimestamp<>(0, "A0+a0", 0L), new KeyValueTimestamp<>(1, "A1+a1", 0L));
    }
}
Also used : MockApiProcessorSupplier(org.apache.kafka.test.MockApiProcessorSupplier) TopologyTestDriver(org.apache.kafka.streams.TopologyTestDriver) IntegerSerializer(org.apache.kafka.common.serialization.IntegerSerializer) StreamsBuilder(org.apache.kafka.streams.StreamsBuilder) StringSerializer(org.apache.kafka.common.serialization.StringSerializer) Test(org.junit.Test)
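
As the test name indicates, the spurious-result fix is disabled here (via the deprecated JoinWindows.of API and the configuration passed to builder.build(props)), so the old eager left-join behavior applies and A0+null and A1+null are emitted immediately rather than only after the window closes. For reference, here is a sketch of how the same two records could be observed with the public TestOutputTopic API instead of the internal MockApiProcessorSupplier; it assumes the topology additionally routes the joined stream to an output topic named "out", which the original test does not do:

// Sketch only: requires an extra joined.to("out") in the topology above.
final TestOutputTopic<Integer, String> outputTopic =
    driver.createOutputTopic("out", new IntegerDeserializer(), new StringDeserializer());
assertEquals(
    Arrays.asList(KeyValue.pair(0, "A0+null"), KeyValue.pair(1, "A1+null")),
    outputTopic.readKeyValuesToList());
Assumed imports: java.util.Arrays, org.apache.kafka.common.serialization.IntegerDeserializer, org.apache.kafka.common.serialization.StringDeserializer, org.apache.kafka.streams.KeyValue, org.apache.kafka.streams.TestOutputTopic.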

Example 78 with MockApiProcessorSupplier

Use of org.apache.kafka.test.MockApiProcessorSupplier in project kafka by apache, from the class KStreamKStreamLeftJoinTest, method testLeftJoinDuplicates.

@Test
public void testLeftJoinDuplicates() {
    final StreamsBuilder builder = new StreamsBuilder();
    final KStream<Integer, String> stream1;
    final KStream<Integer, String> stream2;
    final KStream<Integer, String> joined;
    final MockApiProcessorSupplier<Integer, String, Void, Void> supplier = new MockApiProcessorSupplier<>();
    stream1 = builder.stream(topic1, consumed);
    stream2 = builder.stream(topic2, consumed);
    joined = stream1.leftJoin(stream2, MockValueJoiner.TOSTRING_JOINER, JoinWindows.ofTimeDifferenceAndGrace(ofMillis(100L), ofMillis(10L)), StreamJoined.with(Serdes.Integer(), Serdes.String(), Serdes.String()));
    joined.process(supplier);
    try (final TopologyTestDriver driver = new TopologyTestDriver(builder.build(), props)) {
        final TestInputTopic<Integer, String> inputTopic1 = driver.createInputTopic(topic1, new IntegerSerializer(), new StringSerializer(), Instant.ofEpochMilli(0L), Duration.ZERO);
        final TestInputTopic<Integer, String> inputTopic2 = driver.createInputTopic(topic2, new IntegerSerializer(), new StringSerializer(), Instant.ofEpochMilli(0L), Duration.ZERO);
        final MockApiProcessor<Integer, String, Void, Void> processor = supplier.theCapturedProcessor();
        // verifies that non-joined duplicates are emitted once the window has closed
        inputTopic1.pipeInput(0, "A0", 0L);
        inputTopic1.pipeInput(0, "A0-0", 0L);
        inputTopic2.pipeInput(1, "a0", 111L);
        // bump stream-time to trigger left-join results
        inputTopic2.pipeInput(2, "dummy", 500L);
        processor.checkAndClearProcessResult(new KeyValueTimestamp<>(0, "A0+null", 0L), new KeyValueTimestamp<>(0, "A0-0+null", 0L));
        // verifies joined duplicates are emitted
        inputTopic1.pipeInput(2, "A2", 1000L);
        inputTopic1.pipeInput(2, "A2-0", 1000L);
        inputTopic2.pipeInput(2, "a2", 1001L);
        processor.checkAndClearProcessResult(new KeyValueTimestamp<>(2, "A2+a2", 1001L), new KeyValueTimestamp<>(2, "A2-0+a2", 1001L));
        // this record would expire non-joined records, but because A2 and A2-0 were already
        // joined and emitted, they are not emitted again
        inputTopic2.pipeInput(3, "a3", 315L);
        processor.checkAndClearProcessResult();
    }
}
Also used : MockApiProcessorSupplier(org.apache.kafka.test.MockApiProcessorSupplier) TopologyTestDriver(org.apache.kafka.streams.TopologyTestDriver) IntegerSerializer(org.apache.kafka.common.serialization.IntegerSerializer) StreamsBuilder(org.apache.kafka.streams.StreamsBuilder) StringSerializer(org.apache.kafka.common.serialization.StringSerializer) Test(org.junit.Test)
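
The timing in this test follows from the window configuration: with JoinWindows.ofTimeDifferenceAndGrace(ofMillis(100L), ofMillis(10L)), a non-joined left record with timestamp t can only be emitted as a null-join result once stream time has moved past t + 100 + 10. A back-of-the-envelope sketch of that check follows; the threshold formula is spelled out here for illustration and is not code from the Kafka join processor:

// Illustrative expiry arithmetic for the records above, not the actual processor logic.
final long timeDifferenceMs = 100L;
final long graceMs = 10L;
final long leftRecordTs = 0L;        // timestamps of A0 and A0-0
final long streamTime = 500L;        // advanced by the "dummy" record on the right side
final boolean pastWindowClose = streamTime > leftRecordTs + timeDifferenceMs + graceMs;
System.out.println(pastWindowClose); // true, so A0+null and A0-0+null are emitted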

Example 79 with MockApiProcessorSupplier

Use of org.apache.kafka.test.MockApiProcessorSupplier in project kafka by apache, from the class KStreamKStreamLeftJoinTest, method testWindowing.

@Test
public void testWindowing() {
    final StreamsBuilder builder = new StreamsBuilder();
    final int[] expectedKeys = new int[] { 0, 1, 2, 3 };
    final KStream<Integer, String> stream1;
    final KStream<Integer, String> stream2;
    final KStream<Integer, String> joined;
    final MockApiProcessorSupplier<Integer, String, Void, Void> supplier = new MockApiProcessorSupplier<>();
    stream1 = builder.stream(topic1, consumed);
    stream2 = builder.stream(topic2, consumed);
    joined = stream1.leftJoin(stream2, MockValueJoiner.TOSTRING_JOINER, JoinWindows.ofTimeDifferenceWithNoGrace(ofMillis(100)), StreamJoined.with(Serdes.Integer(), Serdes.String(), Serdes.String()));
    joined.process(supplier);
    final Collection<Set<String>> copartitionGroups = TopologyWrapper.getInternalTopologyBuilder(builder.build()).copartitionGroups();
    assertEquals(1, copartitionGroups.size());
    assertEquals(new HashSet<>(Arrays.asList(topic1, topic2)), copartitionGroups.iterator().next());
    try (final TopologyTestDriver driver = new TopologyTestDriver(builder.build(), props)) {
        final TestInputTopic<Integer, String> inputTopic1 = driver.createInputTopic(topic1, new IntegerSerializer(), new StringSerializer(), Instant.ofEpochMilli(0L), Duration.ZERO);
        final TestInputTopic<Integer, String> inputTopic2 = driver.createInputTopic(topic2, new IntegerSerializer(), new StringSerializer(), Instant.ofEpochMilli(0L), Duration.ZERO);
        final MockApiProcessor<Integer, String, Void, Void> processor = supplier.theCapturedProcessor();
        final long time = 0L;
        // push two items to the left stream; the right window is still empty, so nothing is
        // emitted yet (--> w2 = {})
        for (int i = 0; i < 2; i++) {
            inputTopic1.pipeInput(expectedKeys[i], "A" + expectedKeys[i], time);
        }
        processor.checkAndClearProcessResult();
        // push four items to the right stream; only the two keys already in the left window
        // join (--> w2 = { 0:a0 (ts: 0), 1:a1 (ts: 0), 2:a2 (ts: 0), 3:a3 (ts: 0) })
        for (final int expectedKey : expectedKeys) {
            inputTopic2.pipeInput(expectedKey, "a" + expectedKey, time);
        }
        processor.checkAndClearProcessResult(new KeyValueTimestamp<>(0, "A0+a0", 0L), new KeyValueTimestamp<>(1, "A1+a1", 0L));
        testUpperWindowBound(expectedKeys, driver, processor);
        testLowerWindowBound(expectedKeys, driver, processor);
    }
}
Also used : MockApiProcessorSupplier(org.apache.kafka.test.MockApiProcessorSupplier) HashSet(java.util.HashSet) Set(java.util.Set) TopologyTestDriver(org.apache.kafka.streams.TopologyTestDriver) IntegerSerializer(org.apache.kafka.common.serialization.IntegerSerializer) StreamsBuilder(org.apache.kafka.streams.StreamsBuilder) StringSerializer(org.apache.kafka.common.serialization.StringSerializer) Test(org.junit.Test)
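
testUpperWindowBound and testLowerWindowBound are helper methods of the same test class and are not shown on this page. The copartition-group assertions use the internal TopologyWrapper test helper; a rough public-API way to inspect the same structure is to print the topology description, which lists topic1 and topic2 as sources of the single join sub-topology (a sketch, not part of the Kafka test):

// Sketch: Topology.describe() shows both source topics feeding the same sub-topology,
// which is what the copartition-group assertion above verifies.
final Topology topology = builder.build();
System.out.println(topology.describe());
Assumed import: org.apache.kafka.streams.Topology.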

Example 80 with MockApiProcessorSupplier

Use of org.apache.kafka.test.MockApiProcessorSupplier in project kafka by apache, from the class KGroupedTableImplTest, method shouldReduceAndMaterializeResults.

@Test
public void shouldReduceAndMaterializeResults() {
    final KeyValueMapper<String, Number, KeyValue<String, Integer>> intProjection = (key, value) -> KeyValue.pair(key, value.intValue());
    final KTable<String, Integer> reduced = builder.table(topic, Consumed.with(Serdes.String(), Serdes.Double())).groupBy(intProjection).reduce(MockReducer.INTEGER_ADDER, MockReducer.INTEGER_SUBTRACTOR, Materialized.<String, Integer, KeyValueStore<Bytes, byte[]>>as("reduce").withKeySerde(Serdes.String()).withValueSerde(Serdes.Integer()));
    final MockApiProcessorSupplier<String, Integer, Void, Void> supplier = getReducedResults(reduced);
    try (final TopologyTestDriver driver = new TopologyTestDriver(builder.build(), props)) {
        assertReduced(supplier.theCapturedProcessor().lastValueAndTimestampPerKey(), topic, driver);
        {
            final KeyValueStore<String, Integer> reduce = driver.getKeyValueStore("reduce");
            assertThat(reduce.get("A"), equalTo(5));
            assertThat(reduce.get("B"), equalTo(6));
        }
        {
            final KeyValueStore<String, ValueAndTimestamp<Integer>> reduce = driver.getTimestampedKeyValueStore("reduce");
            assertThat(reduce.get("A"), equalTo(ValueAndTimestamp.make(5, 50L)));
            assertThat(reduce.get("B"), equalTo(ValueAndTimestamp.make(6, 30L)));
        }
    }
}
Also used : MockInitializer(org.apache.kafka.test.MockInitializer) CoreMatchers.equalTo(org.hamcrest.CoreMatchers.equalTo) Assert.assertThrows(org.junit.Assert.assertThrows) MockReducer(org.apache.kafka.test.MockReducer) TopologyException(org.apache.kafka.streams.errors.TopologyException) ValueAndTimestamp(org.apache.kafka.streams.state.ValueAndTimestamp) KGroupedTable(org.apache.kafka.streams.kstream.KGroupedTable) MockApiProcessorSupplier(org.apache.kafka.test.MockApiProcessorSupplier) KeyValueStore(org.apache.kafka.streams.state.KeyValueStore) Map(java.util.Map) Serdes(org.apache.kafka.common.serialization.Serdes) StringSerializer(org.apache.kafka.common.serialization.StringSerializer) MatcherAssert.assertThat(org.hamcrest.MatcherAssert.assertThat) Before(org.junit.Before) TopologyTestDriver(org.apache.kafka.streams.TopologyTestDriver) StreamsBuilder(org.apache.kafka.streams.StreamsBuilder) MockMapper(org.apache.kafka.test.MockMapper) KTable(org.apache.kafka.streams.kstream.KTable) KeyValueMapper(org.apache.kafka.streams.kstream.KeyValueMapper) Properties(java.util.Properties) DoubleSerializer(org.apache.kafka.common.serialization.DoubleSerializer) Consumed(org.apache.kafka.streams.kstream.Consumed) KeyValue(org.apache.kafka.streams.KeyValue) Test(org.junit.Test) Grouped(org.apache.kafka.streams.kstream.Grouped) MockAggregator(org.apache.kafka.test.MockAggregator) Bytes(org.apache.kafka.common.utils.Bytes) Assert.assertNull(org.junit.Assert.assertNull) Materialized(org.apache.kafka.streams.kstream.Materialized) TestInputTopic(org.apache.kafka.streams.TestInputTopic) StreamsTestUtils(org.apache.kafka.test.StreamsTestUtils) Assert.assertEquals(org.junit.Assert.assertEquals)
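
The getReducedResults helper and the assertReduced assertion are defined elsewhere in KGroupedTableImplTest and are not shown here. Given how the supplier is used above, such a helper presumably attaches a MockApiProcessorSupplier to the reduced table's changelog stream, roughly along these lines (a sketch, not the verbatim helper):

private MockApiProcessorSupplier<String, Integer, Void, Void> getReducedResults(final KTable<String, Integer> inputKTable) {
    // Capture every change emitted by the reduced KTable so the test can inspect
    // the last value and timestamp per key.
    final MockApiProcessorSupplier<String, Integer, Void, Void> supplier = new MockApiProcessorSupplier<>();
    inputKTable.toStream().process(supplier);
    return supplier;
}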

Aggregations

MockApiProcessorSupplier (org.apache.kafka.test.MockApiProcessorSupplier): 90 usages
TopologyTestDriver (org.apache.kafka.streams.TopologyTestDriver): 79 usages
Test (org.junit.Test): 78 usages
StreamsBuilder (org.apache.kafka.streams.StreamsBuilder): 68 usages
StringSerializer (org.apache.kafka.common.serialization.StringSerializer): 64 usages
IntegerSerializer (org.apache.kafka.common.serialization.IntegerSerializer): 41 usages
Windowed (org.apache.kafka.streams.kstream.Windowed): 19 usages
KeyValueTimestamp (org.apache.kafka.streams.KeyValueTimestamp): 14 usages
Properties (java.util.Properties): 13 usages
ValueAndTimestamp (org.apache.kafka.streams.state.ValueAndTimestamp): 12 usages
HashSet (java.util.HashSet): 11 usages
Set (java.util.Set): 11 usages
Serdes (org.apache.kafka.common.serialization.Serdes): 10 usages
TestInputTopic (org.apache.kafka.streams.TestInputTopic): 10 usages
Consumed (org.apache.kafka.streams.kstream.Consumed): 10 usages
MockApiProcessor (org.apache.kafka.test.MockApiProcessor): 10 usages
StreamsTestUtils (org.apache.kafka.test.StreamsTestUtils): 9 usages
Topology (org.apache.kafka.streams.Topology): 8 usages
KeyValueStore (org.apache.kafka.streams.state.KeyValueStore): 8 usages
Assert.assertEquals (org.junit.Assert.assertEquals): 8 usages