Example 1 with Customer

Use of io.confluent.examples.streams.avro.Customer in the project kafka-streams-examples by confluentinc.

From the class GlobalKTablesExample, method createStreams:

public static KafkaStreams createStreams(final String bootstrapServers, final String schemaRegistryUrl, final String stateDir) {
    final Properties streamsConfiguration = new Properties();
    // Give the Streams application a unique name.  The name must be unique in the Kafka cluster
    // against which the application is run.
    streamsConfiguration.put(StreamsConfig.APPLICATION_ID_CONFIG, "global-tables-example");
    streamsConfiguration.put(StreamsConfig.CLIENT_ID_CONFIG, "global-tables-example-client");
    // Where to find Kafka broker(s).
    streamsConfiguration.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
    streamsConfiguration.put(StreamsConfig.STATE_DIR_CONFIG, stateDir);
    // Set to earliest so that we don't miss any data that arrived in the topics
    // before the process started.
    streamsConfiguration.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    // create and configure the SpecificAvroSerdes required in this example
    final SpecificAvroSerde<Order> orderSerde = new SpecificAvroSerde<>();
    final Map<String, String> serdeConfig = Collections.singletonMap(AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, schemaRegistryUrl);
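    // The second argument to configure() is `isKey`; it is false here because
    // these serdes are used for record values, not record keys.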
    orderSerde.configure(serdeConfig, false);
    final SpecificAvroSerde<Customer> customerSerde = new SpecificAvroSerde<>();
    customerSerde.configure(serdeConfig, false);
    final SpecificAvroSerde<Product> productSerde = new SpecificAvroSerde<>();
    productSerde.configure(serdeConfig, false);
    final SpecificAvroSerde<EnrichedOrder> enrichedOrdersSerde = new SpecificAvroSerde<>();
    enrichedOrdersSerde.configure(serdeConfig, false);
    final StreamsBuilder builder = new StreamsBuilder();
    // Get the stream of orders
    final KStream<Long, Order> ordersStream = builder.stream(ORDER_TOPIC, Consumed.with(Serdes.Long(), orderSerde));
    // Create a global table for customers. The data from this global table
    // will be fully replicated on each instance of this application.
    final GlobalKTable<Long, Customer> customers = builder.globalTable(CUSTOMER_TOPIC, Materialized.<Long, Customer, KeyValueStore<Bytes, byte[]>>as(CUSTOMER_STORE).withKeySerde(Serdes.Long()).withValueSerde(customerSerde));
    // Create a global table for products. The data from this global table
    // will be fully replicated on each instance of this application.
    final GlobalKTable<Long, Product> products = builder.globalTable(PRODUCT_TOPIC, Materialized.<Long, Product, KeyValueStore<Bytes, byte[]>>as(PRODUCT_STORE).withKeySerde(Serdes.Long()).withValueSerde(productSerde));
    // Join the orders stream to the customer global table. As this is a global table,
    // we can use a non-key join without needing to repartition the input stream.
    final KStream<Long, CustomerOrder> customerOrdersStream = ordersStream.join(customers, (orderId, order) -> order.getCustomerId(), (order, customer) -> new CustomerOrder(customer, order));
    // Join the customer-enriched order stream with the product global table. As this is a global table,
    // we can use a non-key join without needing to repartition the input stream.
    final KStream<Long, EnrichedOrder> enrichedOrdersStream = customerOrdersStream.join(products, (orderId, customerOrder) -> customerOrder.productId(), (customerOrder, product) -> new EnrichedOrder(product, customerOrder.customer, customerOrder.order));
    // write the enriched order to the enriched-order topic
    enrichedOrdersStream.to(ENRICHED_ORDER_TOPIC, Produced.with(Serdes.Long(), enrichedOrdersSerde));
    return new KafkaStreams(builder.build(), new StreamsConfig(streamsConfiguration));
}
Also used : EnrichedOrder(io.confluent.examples.streams.avro.EnrichedOrder) Order(io.confluent.examples.streams.avro.Order) KafkaStreams(org.apache.kafka.streams.KafkaStreams) Customer(io.confluent.examples.streams.avro.Customer) Product(io.confluent.examples.streams.avro.Product) Properties(java.util.Properties) StreamsBuilder(org.apache.kafka.streams.StreamsBuilder) Bytes(org.apache.kafka.common.utils.Bytes) SpecificAvroSerde(io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde) StreamsConfig(org.apache.kafka.streams.StreamsConfig)
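
The joins above rely on a small CustomerOrder value type that is not included in this search result. Below is a minimal sketch reconstructed from how it is used in createStreams (the constructor arguments, the customer and order field accesses, and the productId() call); the getProductId() accessor on the Avro-generated Order class is an assumption:

public class CustomerOrder {

    final Customer customer;
    final Order order;

    CustomerOrder(final Customer customer, final Order order) {
        this.customer = customer;
        this.order = order;
    }

    // Expose the product id so the second join can use it as the lookup key
    // into the products global table. Assumes the Avro-generated Order class
    // has a getProductId() accessor.
    long productId() {
        return order.getProductId();
    }
}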

Example 2 with Customer

Use of io.confluent.examples.streams.avro.Customer in the project kafka-streams-examples by confluentinc.

From the class GlobalKTablesExampleTest, method shouldDemonstrateGlobalKTableJoins:

@Test
public void shouldDemonstrateGlobalKTableJoins() throws Exception {
    final List<Customer> customers = GlobalKTablesExampleDriver.generateCustomers(CLUSTER.bootstrapServers(), CLUSTER.schemaRegistryUrl(), 100);
    final List<Product> products = GlobalKTablesExampleDriver.generateProducts(CLUSTER.bootstrapServers(), CLUSTER.schemaRegistryUrl(), 100);
    final List<Order> orders = GlobalKTablesExampleDriver.generateOrders(CLUSTER.bootstrapServers(), CLUSTER.schemaRegistryUrl(), 100, 100, 50);
    // start up the streams instances
    streamInstanceOne.start();
    streamInstanceTwo.start();
    final Properties consumerProps = new Properties();
    consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, CLUSTER.bootstrapServers());
    consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, "global-tables-consumer");
    consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, Serdes.Long().deserializer().getClass());
    consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class);
    consumerProps.put(AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, CLUSTER.schemaRegistryUrl());
    consumerProps.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, true);
    // receive the enriched orders
    final List<EnrichedOrder> enrichedOrders = IntegrationTestUtils.waitUntilMinValuesRecordsReceived(consumerProps, ENRICHED_ORDER_TOPIC, 50, 60000);
    // verify that all the data comes from the generated set
    for (final EnrichedOrder enrichedOrder : enrichedOrders) {
        assertThat(customers, hasItem(enrichedOrder.getCustomer()));
        assertThat(products, hasItem(enrichedOrder.getProduct()));
        assertThat(orders, hasItem(enrichedOrder.getOrder()));
    }
    // demonstrate that global table data is available on all instances
    verifyAllCustomersInStore(customers, streamInstanceOne.store(CUSTOMER_STORE, QueryableStoreTypes.keyValueStore()));
    verifyAllCustomersInStore(customers, streamInstanceTwo.store(CUSTOMER_STORE, QueryableStoreTypes.keyValueStore()));
    verifyAllProductsInStore(products, streamInstanceOne.store(PRODUCT_STORE, QueryableStoreTypes.keyValueStore()));
    verifyAllProductsInStore(products, streamInstanceTwo.store(PRODUCT_STORE, QueryableStoreTypes.keyValueStore()));
}
Also used : Order(io.confluent.examples.streams.avro.Order) EnrichedOrder(io.confluent.examples.streams.avro.EnrichedOrder) Customer(io.confluent.examples.streams.avro.Customer) Product(io.confluent.examples.streams.avro.Product) Properties(java.util.Properties) Test(org.junit.Test)
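
The verifyAllCustomersInStore and verifyAllProductsInStore helpers are not included in this search result. A plausible sketch of the customer variant, assuming records were keyed 0..n-1 as generateCustomers in Example 3 does, and using ReadOnlyKeyValueStore from org.apache.kafka.streams.state plus Hamcrest's equalTo:

private static void verifyAllCustomersInStore(final List<Customer> customers, final ReadOnlyKeyValueStore<Long, Customer> store) {
    // A GlobalKTable is fully replicated, so every generated customer should
    // be readable from the local state store of each application instance.
    for (long id = 0; id < customers.size(); id++) {
        assertThat(store.get(id), equalTo(customers.get((int) id)));
    }
}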

Example 3 with Customer

Use of io.confluent.examples.streams.avro.Customer in the project kafka-streams-examples by confluentinc.

From the class GlobalKTablesExampleDriver, method generateCustomers:

static List<Customer> generateCustomers(final String bootstrapServers, final String schemaRegistryUrl, final int count) {
    final SpecificAvroSerde<Customer> customerSerde = createSerde(schemaRegistryUrl);
    final Properties producerProperties = new Properties();
    producerProperties.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
    final KafkaProducer<Long, Customer> customerProducer = new KafkaProducer<>(producerProperties, Serdes.Long().serializer(), customerSerde.serializer());
    final List<Customer> allCustomers = new ArrayList<>();
    final String[] genders = { "male", "female", "unknown" };
    final Random random = new Random();
    for (long i = 0; i < count; i++) {
        final Customer customer = new Customer(randomString(10), genders[random.nextInt(genders.length)], randomString(20));
        allCustomers.add(customer);
        customerProducer.send(new ProducerRecord<>(CUSTOMER_TOPIC, i, customer));
    }
    customerProducer.close();
    return allCustomers;
}
Also used : KafkaProducer(org.apache.kafka.clients.producer.KafkaProducer) Customer(io.confluent.examples.streams.avro.Customer) ArrayList(java.util.ArrayList) Properties(java.util.Properties) Random(java.util.Random)
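
generateCustomers calls two helpers that are not included in this search result: createSerde and randomString. A minimal sketch of createSerde, mirroring the serde setup from Example 1 (SpecificRecord is org.apache.avro.specific.SpecificRecord); randomString(n) can be assumed to return a random alphanumeric String of length n:

private static <T extends SpecificRecord> SpecificAvroSerde<T> createSerde(final String schemaRegistryUrl) {
    final SpecificAvroSerde<T> serde = new SpecificAvroSerde<>();
    final Map<String, String> serdeConfig = Collections.singletonMap(AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, schemaRegistryUrl);
    // `false` because the serde is configured for record values, not keys.
    serde.configure(serdeConfig, false);
    return serde;
}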

Aggregations

Customer (io.confluent.examples.streams.avro.Customer): 3 uses
Properties (java.util.Properties): 3 uses
EnrichedOrder (io.confluent.examples.streams.avro.EnrichedOrder): 2 uses
Order (io.confluent.examples.streams.avro.Order): 2 uses
Product (io.confluent.examples.streams.avro.Product): 2 uses
SpecificAvroSerde (io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde): 1 use
ArrayList (java.util.ArrayList): 1 use
Random (java.util.Random): 1 use
KafkaProducer (org.apache.kafka.clients.producer.KafkaProducer): 1 use
Bytes (org.apache.kafka.common.utils.Bytes): 1 use
KafkaStreams (org.apache.kafka.streams.KafkaStreams): 1 use
StreamsBuilder (org.apache.kafka.streams.StreamsBuilder): 1 use
StreamsConfig (org.apache.kafka.streams.StreamsConfig): 1 use
Test (org.junit.Test): 1 use