
Example 1 with MockKeyValueMapper

Use of org.apache.kafka.test.MockKeyValueMapper in project kafka by apache.

From the class InternalTopicIntegrationTest, method shouldCompactTopicsForStateChangelogs:

@Test
public void shouldCompactTopicsForStateChangelogs() throws Exception {
    //
    // Step 1: Configure and start a simple word count topology
    //
    final Serde<String> stringSerde = Serdes.String();
    final Serde<Long> longSerde = Serdes.Long();
    final Properties streamsConfiguration = new Properties();
    streamsConfiguration.put(StreamsConfig.APPLICATION_ID_CONFIG, "compact-topics-integration-test");
    streamsConfiguration.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, CLUSTER.bootstrapServers());
    streamsConfiguration.put(StreamsConfig.KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
    streamsConfiguration.put(StreamsConfig.VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
    streamsConfiguration.put(StreamsConfig.STATE_DIR_CONFIG, TestUtils.tempDirectory().getPath());
    streamsConfiguration.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    final KStreamBuilder builder = new KStreamBuilder();
    final KStream<String, String> textLines = builder.stream(DEFAULT_INPUT_TOPIC);
    final KStream<String, Long> wordCounts = textLines.flatMapValues(new ValueMapper<String, Iterable<String>>() {

        @Override
        public Iterable<String> apply(final String value) {
            return Arrays.asList(value.toLowerCase(Locale.getDefault()).split("\\W+"));
        }
    }).groupBy(MockKeyValueMapper.<String, String>SelectValueMapper()).count("Counts").toStream();
    wordCounts.to(stringSerde, longSerde, DEFAULT_OUTPUT_TOPIC);
    // Remove any state from previous test runs
    IntegrationTestUtils.purgeLocalStreamsState(streamsConfiguration);
    final KafkaStreams streams = new KafkaStreams(builder, streamsConfiguration);
    streams.start();
    //
    // Step 2: Produce some input data to the input topic.
    //
    produceData(Arrays.asList("hello", "world", "world", "hello world"));
    //
    // Step 3: Verify the state changelog topics are compact
    //
    streams.close();
    final Properties properties = getTopicConfigProperties(ProcessorStateManager.storeChangelogTopic(applicationId, "Counts"));
    assertEquals(LogConfig.Compact(), properties.getProperty(LogConfig.CleanupPolicyProp()));
}
Also used: KStreamBuilder (org.apache.kafka.streams.kstream.KStreamBuilder), KafkaStreams (org.apache.kafka.streams.KafkaStreams), MockKeyValueMapper (org.apache.kafka.test.MockKeyValueMapper), ValueMapper (org.apache.kafka.streams.kstream.ValueMapper), Properties (java.util.Properties), Test (org.junit.Test)
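
For readers unfamiliar with the test utility: MockKeyValueMapper lives in Kafka's internal org.apache.kafka.test package, and in this topology the groupBy call simply re-keys each record by its value so that counting happens per word. A minimal stand-in with the same behavior in this example (a sketch assuming only the public KeyValueMapper interface; the class name below is invented) would be:

import org.apache.kafka.streams.kstream.KeyValueMapper;

// Sketch: a mapper that ignores the original key and promotes the record value
// (the word) to the new grouping key, mirroring what SelectValueMapper() supplies here.
final class SelectValueKeyMapper<K, V> implements KeyValueMapper<K, V, V> {
    @Override
    public V apply(final K key, final V value) {
        return value;
    }
}

// Equivalent grouping in the topology above:
// textLines.flatMapValues(...).groupBy(new SelectValueKeyMapper<String, String>()).count("Counts");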

Example 2 with MockKeyValueMapper

Use of org.apache.kafka.test.MockKeyValueMapper in project kafka by apache.

From the class QueryableStateIntegrationTest, method createCountStream:

/**
     * Creates a typical word count topology.
     *
     * @param inputTopic the topic to read text lines from
     * @param outputTopic the topic the all-time word counts are written to
     * @param streamsConfiguration the Streams configuration to use
     * @return an unstarted KafkaStreams instance for the word count topology
     */
private KafkaStreams createCountStream(final String inputTopic, final String outputTopic, final Properties streamsConfiguration) {
    final KStreamBuilder builder = new KStreamBuilder();
    final Serde<String> stringSerde = Serdes.String();
    final KStream<String, String> textLines = builder.stream(stringSerde, stringSerde, inputTopic);
    final KGroupedStream<String, String> groupedByWord = textLines.flatMapValues(new ValueMapper<String, Iterable<String>>() {

        @Override
        public Iterable<String> apply(final String value) {
            return Arrays.asList(value.toLowerCase(Locale.getDefault()).split("\\W+"));
        }
    }).groupBy(MockKeyValueMapper.<String, String>SelectValueMapper());
    // Create a state store for the all-time word count
    groupedByWord.count("word-count-store-" + inputTopic).to(Serdes.String(), Serdes.Long(), outputTopic);
    // Create a windowed state store that holds the word count for each 1-minute window
    groupedByWord.count(TimeWindows.of(WINDOW_SIZE), "windowed-word-count-store-" + inputTopic);
    return new KafkaStreams(builder, streamsConfiguration);
}
Also used: KStreamBuilder (org.apache.kafka.streams.kstream.KStreamBuilder), KafkaStreams (org.apache.kafka.streams.KafkaStreams), MockKeyValueMapper (org.apache.kafka.test.MockKeyValueMapper), ValueMapper (org.apache.kafka.streams.kstream.ValueMapper)
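
QueryableStateIntegrationTest exercises these stores through interactive queries. As a sketch (the helper below is an assumption for illustration, not part of the original test), the all-time store created above could be read once the returned KafkaStreams instance has been started and has finished rebalancing:

import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.state.QueryableStoreTypes;
import org.apache.kafka.streams.state.ReadOnlyKeyValueStore;

// Hypothetical helper: read one word's count from the all-time store that
// createCountStream registers as "word-count-store-" + inputTopic.
// Note: store() throws InvalidStateStoreException while the store is still
// migrating or restoring, so tests typically retry until it is available.
private Long readWordCount(final KafkaStreams streams, final String inputTopic, final String word) {
    final ReadOnlyKeyValueStore<String, Long> store =
        streams.store("word-count-store-" + inputTopic, QueryableStoreTypes.<String, Long>keyValueStore());
    return store.get(word);
}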

Example 3 with MockKeyValueMapper

Use of org.apache.kafka.test.MockKeyValueMapper in project kafka by apache.

From the class InternalTopicIntegrationTest, method shouldUseCompactAndDeleteForWindowStoreChangelogs:

@Test
public void shouldUseCompactAndDeleteForWindowStoreChangelogs() throws Exception {
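    //
    // Step 1: Build and start a topology with a windowed word count
    //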
    KStreamBuilder builder = new KStreamBuilder();
    KStream<String, String> textLines = builder.stream(DEFAULT_INPUT_TOPIC);
    final int durationMs = 2000;
    textLines.flatMapValues(new ValueMapper<String, Iterable<String>>() {

        @Override
        public Iterable<String> apply(String value) {
            return Arrays.asList(value.toLowerCase(Locale.getDefault()).split("\\W+"));
        }
    }).groupBy(MockKeyValueMapper.<String, String>SelectValueMapper()).count(TimeWindows.of(1000).until(durationMs), "CountWindows").toStream();
    // Remove any state from previous test runs
    IntegrationTestUtils.purgeLocalStreamsState(streamsConfiguration);
    KafkaStreams streams = new KafkaStreams(builder, streamsConfiguration);
    streams.start();
    //
    // Step 2: Produce some input data to the input topic.
    //
    produceData(Arrays.asList("hello", "world", "world", "hello world"));
    //
    // Step 3: Verify the state changelog topic uses both the compact and delete cleanup policies
    //
    streams.close();
    final Properties properties = getTopicConfigProperties(ProcessorStateManager.storeChangelogTopic(applicationId, "CountWindows"));
    final List<String> policies = Arrays.asList(properties.getProperty(LogConfig.CleanupPolicyProp()).split(","));
    assertEquals(2, policies.size());
    assertTrue(policies.contains(LogConfig.Compact()));
    assertTrue(policies.contains(LogConfig.Delete()));
    // retention should be 1 day + the window duration
    final long retention = TimeUnit.MILLISECONDS.convert(1, TimeUnit.DAYS) + durationMs;
    assertEquals(retention, Long.parseLong(properties.getProperty(LogConfig.RetentionMsProp())));
}
Also used: KStreamBuilder (org.apache.kafka.streams.kstream.KStreamBuilder), KafkaStreams (org.apache.kafka.streams.KafkaStreams), MockKeyValueMapper (org.apache.kafka.test.MockKeyValueMapper), ValueMapper (org.apache.kafka.streams.kstream.ValueMapper), Properties (java.util.Properties), Test (org.junit.Test)
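
The changelog topic that getTopicConfigProperties looks up follows the standard Kafka Streams naming convention <application.id>-<store name>-changelog, which is what ProcessorStateManager.storeChangelogTopic resolves. A self-contained illustration of the name and retention arithmetic the assertions check (the application id below is hypothetical; the real test reads it from its own configuration):

import java.util.concurrent.TimeUnit;

// Illustration only: reproduces the changelog naming and retention arithmetic above.
public final class ChangelogNamingSketch {
    public static void main(final String[] args) {
        // Hypothetical application id; the test derives the real one from its config.
        final String applicationId = "my-streams-app";
        // Kafka Streams names changelog topics "<application.id>-<store name>-changelog".
        final String changelogTopic = applicationId + "-CountWindows-changelog";
        // Expected retention: the 1-day default plus the window retention passed to until().
        final long durationMs = 2000L;
        final long expectedRetentionMs = TimeUnit.MILLISECONDS.convert(1, TimeUnit.DAYS) + durationMs;
        System.out.println(changelogTopic + " -> retention.ms=" + expectedRetentionMs);
    }
}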

Aggregations

KafkaStreams (org.apache.kafka.streams.KafkaStreams): 3 usages
KStreamBuilder (org.apache.kafka.streams.kstream.KStreamBuilder): 3 usages
ValueMapper (org.apache.kafka.streams.kstream.ValueMapper): 3 usages
MockKeyValueMapper (org.apache.kafka.test.MockKeyValueMapper): 3 usages
Properties (java.util.Properties): 2 usages
Test (org.junit.Test): 2 usages