Example 26 with StringDeserializer

Use of org.apache.kafka.common.serialization.StringDeserializer in the kafka-streams-examples project by confluentinc.

From the class WikipediaFeedAvroExampleTest, method shouldRunTheWikipediaFeedExample:

@Test
public void shouldRunTheWikipediaFeedExample() throws Exception {
    // Produce a handful of WikiFeed records, keyed by user, into the input topic.
    final Properties props = new Properties();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, CLUSTER.bootstrapServers());
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, io.confluent.kafka.serializers.KafkaAvroSerializer.class);
    props.put(AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, CLUSTER.schemaRegistryUrl());
    final KafkaProducer<String, WikiFeed> producer = new KafkaProducer<>(props);
    producer.send(new ProducerRecord<>(WikipediaFeedAvroExample.WIKIPEDIA_FEED, new WikiFeed("donna", true, "first post")));
    producer.send(new ProducerRecord<>(WikipediaFeedAvroExample.WIKIPEDIA_FEED, new WikiFeed("donna", true, "second post")));
    producer.send(new ProducerRecord<>(WikipediaFeedAvroExample.WIKIPEDIA_FEED, new WikiFeed("donna", true, "third post")));
    producer.send(new ProducerRecord<>(WikipediaFeedAvroExample.WIKIPEDIA_FEED, new WikiFeed("becca", true, "first post")));
    producer.send(new ProducerRecord<>(WikipediaFeedAvroExample.WIKIPEDIA_FEED, new WikiFeed("becca", true, "second post")));
    producer.send(new ProducerRecord<>(WikipediaFeedAvroExample.WIKIPEDIA_FEED, new WikiFeed("john", true, "first post")));
    producer.flush();
    // Run the Streams topology that counts posts per user.
    streams.start();
    // Consume the per-user counts from the output topic with String/Long deserializers.
    final Properties consumerProperties = new Properties();
    consumerProperties.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, CLUSTER.bootstrapServers());
    consumerProperties.put(ConsumerConfig.GROUP_ID_CONFIG, "wikipedia-feed-consumer");
    consumerProperties.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    final KafkaConsumer<String, Long> consumer = new KafkaConsumer<>(consumerProperties, new StringDeserializer(), new LongDeserializer());
    final Map<String, Long> expected = new HashMap<>();
    expected.put("donna", 3L);
    expected.put("becca", 2L);
    expected.put("john", 1L);
    final Map<String, Long> actual = new HashMap<>();
    consumer.subscribe(Collections.singleton(WikipediaFeedAvroExample.WIKIPEDIA_STATS));
    // Poll until the expected counts arrive or the 30-second timeout elapses.
    final long timeout = System.currentTimeMillis() + 30000L;
    while (!actual.equals(expected) && System.currentTimeMillis() < timeout) {
        final ConsumerRecords<String, Long> records = consumer.poll(1000);
        records.forEach(record -> actual.put(record.key(), record.value()));
    }
    assertThat(expected, equalTo(actual));
}
Also used : KafkaProducer (org.apache.kafka.clients.producer.KafkaProducer) HashMap (java.util.HashMap) StringDeserializer (org.apache.kafka.common.serialization.StringDeserializer) KafkaConsumer (org.apache.kafka.clients.consumer.KafkaConsumer) Properties (java.util.Properties) LongDeserializer (org.apache.kafka.common.serialization.LongDeserializer) WikiFeed (io.confluent.examples.streams.avro.WikiFeed) Test (org.junit.Test)
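
The String/Long deserializer pair above is passed to the KafkaConsumer constructor; the same wiring can also be done purely through configuration. A minimal sketch, assuming a placeholder bootstrap address in place of the test's CLUSTER.bootstrapServers():

// Sketch: configuration-based deserializer wiring for the stats topic.
final Properties consumerProps = new Properties();
consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder address
consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, "wikipedia-feed-consumer");
consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, LongDeserializer.class);
try (final KafkaConsumer<String, Long> statsConsumer = new KafkaConsumer<>(consumerProps)) {
    statsConsumer.subscribe(Collections.singleton(WikipediaFeedAvroExample.WIKIPEDIA_STATS));
    // poll and collect records exactly as in the test above
}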

Example 27 with StringDeserializer

Use of org.apache.kafka.common.serialization.StringDeserializer in the tutorials project by eugenp.

From the class KafkaConsumerConfig, method greetingConsumerFactory:

public ConsumerFactory<String, Greeting> greetingConsumerFactory() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "greeting");
    return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(), new JsonDeserializer<>(Greeting.class));
}
Also used : HashMap(java.util.HashMap) StringDeserializer(org.apache.kafka.common.serialization.StringDeserializer) DefaultKafkaConsumerFactory(org.springframework.kafka.core.DefaultKafkaConsumerFactory)
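
A ConsumerFactory like this is typically registered as a bean and paired with a listener container factory so that @KafkaListener methods can receive Greeting payloads. A minimal sketch, assuming the surrounding @Configuration class and the Greeting POJO (both outside this excerpt); the topic name and bean method name are illustrative:

// Sketch: wiring the factory into a container factory for @KafkaListener use.
@Bean
public ConcurrentKafkaListenerContainerFactory<String, Greeting> greetingKafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, Greeting> factory = new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(greetingConsumerFactory());
    return factory;
}

// Elsewhere, a listener bound to that container factory:
@KafkaListener(topics = "greetings", containerFactory = "greetingKafkaListenerContainerFactory")
public void listenGreeting(Greeting greeting) {
    System.out.println("Received: " + greeting);
}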

Example 28 with StringDeserializer

Use of org.apache.kafka.common.serialization.StringDeserializer in the eventapis project by kloiasoft.

From the class EventListenConfiguration, method startOperations:

private void startOperations() {
    // Consume Operation records as JSON with String keys, using the event bus consumer settings.
    Map<String, Object> consumerProperties = eventApisConfiguration.getEventBus().buildConsumerProperties();
    DefaultKafkaConsumerFactory<String, Operation> operationConsumerFactory = new DefaultKafkaConsumerFactory<>(consumerProperties, new StringDeserializer(), new JsonDeserializer<>(Operation.class));
    // Listen on the operation-events topic and hand each record to the configured message listener.
    ContainerProperties operationContainerProperties = new ContainerProperties(Operation.OPERATION_EVENTS);
    operationContainerProperties.setMessageListener(new MultipleEventMessageListener(eventMessageListeners));
    operationListenerContainer = new ConcurrentMessageListenerContainer<>(operationConsumerFactory, operationContainerProperties);
    operationListenerContainer.setBeanName("emon-operations");
    operationListenerContainer.start();
}
Also used : StringDeserializer(org.apache.kafka.common.serialization.StringDeserializer) MultipleEventMessageListener(com.kloia.eventapis.api.emon.service.MultipleEventMessageListener) ContainerProperties(org.springframework.kafka.listener.config.ContainerProperties) Operation(com.kloia.eventapis.pojos.Operation) DefaultKafkaConsumerFactory(org.springframework.kafka.core.DefaultKafkaConsumerFactory)
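
MultipleEventMessageListener is project-specific; the container only needs an implementation of spring-kafka's MessageListener contract. A minimal sketch of such a listener, purely for illustration (the class name is made up):

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.listener.MessageListener;

// Sketch: a bare-bones listener the container could invoke for each Operation record.
public class LoggingOperationListener implements MessageListener<String, Operation> {
    @Override
    public void onMessage(ConsumerRecord<String, Operation> record) {
        System.out.println("operation " + record.key() + " -> " + record.value());
    }
}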

Example 29 with StringDeserializer

Use of org.apache.kafka.common.serialization.StringDeserializer in the eventapis project by kloiasoft.

From the class KafkaOperationRepositoryFactory, method createOperationConsumer:

public Consumer<String, Operation> createOperationConsumer(ObjectMapper objectMapper) {
    KafkaProperties properties = kafkaProperties.clone();
    properties.getConsumer().setEnableAutoCommit(false);
    return new KafkaConsumer<>(properties.buildConsumerProperties(), new StringDeserializer(), new JsonDeserializer<>(Operation.class, objectMapper));
}
Also used : StringDeserializer(org.apache.kafka.common.serialization.StringDeserializer) KafkaConsumer(org.apache.kafka.clients.consumer.KafkaConsumer) Operation(com.kloia.eventapis.pojos.Operation)
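
Because auto-commit is disabled here, a caller of createOperationConsumer is expected to commit offsets explicitly after processing. A minimal usage sketch; the method name pollOperations is illustrative, and the topic constant is the one seen in the listener configuration above:

// Sketch: driving the manually committed consumer returned by createOperationConsumer.
void pollOperations(KafkaOperationRepositoryFactory repositoryFactory, ObjectMapper objectMapper) {
    Consumer<String, Operation> consumer = repositoryFactory.createOperationConsumer(objectMapper);
    consumer.subscribe(Collections.singleton(Operation.OPERATION_EVENTS));
    while (true) {
        ConsumerRecords<String, Operation> records = consumer.poll(1000);
        records.forEach(record -> System.out.println("operation " + record.key() + ": " + record.value()));
        consumer.commitSync(); // required: enable.auto.commit is false for this consumer
    }
}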

Example 30 with StringDeserializer

Use of org.apache.kafka.common.serialization.StringDeserializer in the eventapis project by kloiasoft.

From the class KafkaOperationRepositoryFactory, method createEventConsumer:

public Consumer<String, PublishedEventWrapper> createEventConsumer(ObjectMapper objectMapper) {
    KafkaProperties properties = kafkaProperties.clone();
    properties.getConsumer().setEnableAutoCommit(true);
    return new KafkaConsumer<>(properties.buildConsumerProperties(), new StringDeserializer(), new JsonDeserializer<>(PublishedEventWrapper.class, objectMapper));
}
Also used : StringDeserializer(org.apache.kafka.common.serialization.StringDeserializer) KafkaConsumer(org.apache.kafka.clients.consumer.KafkaConsumer)
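
With auto-commit enabled, this consumer can run in a plain poll loop with no explicit commits, in contrast to the operation consumer above. A minimal sketch; the method name and the topic name "published-events" are illustrative:

// Sketch: reading published events with the auto-committing consumer.
void pollEvents(KafkaOperationRepositoryFactory repositoryFactory, ObjectMapper objectMapper) {
    Consumer<String, PublishedEventWrapper> consumer = repositoryFactory.createEventConsumer(objectMapper);
    consumer.subscribe(Collections.singleton("published-events"));
    while (true) {
        ConsumerRecords<String, PublishedEventWrapper> records = consumer.poll(1000);
        records.forEach(record -> System.out.println("event " + record.key() + ": " + record.value()));
        // no commitSync() needed: offsets are committed automatically
    }
}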

Aggregations

StringDeserializer (org.apache.kafka.common.serialization.StringDeserializer): 152
Test (org.junit.Test): 91
StringSerializer (org.apache.kafka.common.serialization.StringSerializer): 59
TopologyTestDriver (org.apache.kafka.streams.TopologyTestDriver): 46
StreamsBuilder (org.apache.kafka.streams.StreamsBuilder): 35
HashMap (java.util.HashMap): 33
Properties (java.util.Properties): 32
IntegerDeserializer (org.apache.kafka.common.serialization.IntegerDeserializer): 31
Windowed (org.apache.kafka.streams.kstream.Windowed): 31
List (java.util.List): 29
KeyValue (org.apache.kafka.streams.KeyValue): 29
IntegrationTest (org.apache.kafka.test.IntegrationTest): 27
ArrayList (java.util.ArrayList): 26
LongDeserializer (org.apache.kafka.common.serialization.LongDeserializer): 25
Map (java.util.Map): 20
KafkaConsumer (org.apache.kafka.clients.consumer.KafkaConsumer): 20
IntegerSerializer (org.apache.kafka.common.serialization.IntegerSerializer): 17
Serdes (org.apache.kafka.common.serialization.Serdes): 17
KeyValueTimestamp (org.apache.kafka.streams.KeyValueTimestamp): 17
KStream (org.apache.kafka.streams.kstream.KStream): 17