
Example 71 with Topology

Use of org.apache.kafka.streams.Topology in project brave by openzipkin.

In class ITKafkaStreamsTracing, method should_create_spans_from_stream_with_tracing_valueTransformerWithKey.

@Test
public void should_create_spans_from_stream_with_tracing_valueTransformerWithKey() {
    ValueTransformerWithKeySupplier<String, String, String> transformerSupplier = kafkaStreamsTracing.valueTransformerWithKey("transformer-1", () -> new ValueTransformerWithKey<String, String, String>() {

        ProcessorContext context;

        @Override
        public void init(ProcessorContext context) {
            this.context = context;
        }

        @Override
        public String transform(String key, String value) {
            try {
                Thread.sleep(100L);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
            return value;
        }

        @Override
        public void close() {
        }
    });
    String inputTopic = testName.getMethodName() + "-input";
    String outputTopic = testName.getMethodName() + "-output";
    StreamsBuilder builder = new StreamsBuilder();
    builder.stream(inputTopic, Consumed.with(Serdes.String(), Serdes.String())).transformValues(transformerSupplier).to(outputTopic, Produced.with(Serdes.String(), Serdes.String()));
    Topology topology = builder.build();
    KafkaStreams streams = buildKafkaStreams(topology);
    send(new ProducerRecord<>(inputTopic, TEST_KEY, TEST_VALUE));
    waitForStreamToRun(streams);
    MutableSpan spanInput = testSpanHandler.takeRemoteSpan(CONSUMER);
    assertThat(spanInput.tags()).containsEntry("kafka.topic", inputTopic);
    MutableSpan spanProcessor = testSpanHandler.takeLocalSpan();
    assertChildOf(spanProcessor, spanInput);
    MutableSpan spanOutput = testSpanHandler.takeRemoteSpan(PRODUCER);
    assertThat(spanOutput.tags()).containsEntry("kafka.topic", outputTopic);
    assertChildOf(spanOutput, spanProcessor);
    streams.close();
    streams.cleanUp();
}
Also used : StreamsBuilder(org.apache.kafka.streams.StreamsBuilder) KafkaStreams(org.apache.kafka.streams.KafkaStreams) MutableSpan(brave.handler.MutableSpan) Topology(org.apache.kafka.streams.Topology) ProcessorContext(org.apache.kafka.streams.processor.ProcessorContext) Test(org.junit.Test)
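
Every example here calls a buildKafkaStreams(topology) helper that the snippet does not show. A minimal sketch of what such a helper could look like, assuming Brave's KafkaStreamsTracing.kafkaStreams(Topology, Properties) factory, which wraps the topology with a tracing client supplier; the application id and bootstrap address below are placeholders, not the project's actual configuration:

// Hypothetical helper, for illustration only: builds a traced KafkaStreams instance.
KafkaStreams buildKafkaStreams(Topology topology) {
    Properties properties = new Properties();
    // placeholder settings; the real test wires in the embedded broker's address
    properties.put(StreamsConfig.APPLICATION_ID_CONFIG, testName.getMethodName());
    properties.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    // kafkaStreamsTracing.kafkaStreams(...) installs Brave's tracing KafkaClientSupplier
    return kafkaStreamsTracing.kafkaStreams(topology, properties);
}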

Example 72 with Topology

Use of org.apache.kafka.streams.Topology in project brave by openzipkin.

In class ITKafkaStreamsTracing, method should_create_spans_from_stream_with_tracing_map.

@Test
public void should_create_spans_from_stream_with_tracing_map() {
    String inputTopic = testName.getMethodName() + "-input";
    String outputTopic = testName.getMethodName() + "-output";
    StreamsBuilder builder = new StreamsBuilder();
    builder.stream(inputTopic, Consumed.with(Serdes.String(), Serdes.String())).transform(kafkaStreamsTracing.map("map-1", (key, value) -> {
        try {
            Thread.sleep(100L);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        return KeyValue.pair(key, value);
    })).to(outputTopic, Produced.with(Serdes.String(), Serdes.String()));
    Topology topology = builder.build();
    KafkaStreams streams = buildKafkaStreams(topology);
    send(new ProducerRecord<>(inputTopic, TEST_KEY, TEST_VALUE));
    waitForStreamToRun(streams);
    MutableSpan spanInput = testSpanHandler.takeRemoteSpan(CONSUMER);
    assertThat(spanInput.tags()).containsEntry("kafka.topic", inputTopic);
    MutableSpan spanProcessor = testSpanHandler.takeLocalSpan();
    assertChildOf(spanProcessor, spanInput);
    MutableSpan spanOutput = testSpanHandler.takeRemoteSpan(PRODUCER);
    assertThat(spanOutput.tags()).containsEntry("kafka.topic", outputTopic);
    assertChildOf(spanOutput, spanProcessor);
    streams.close();
    streams.cleanUp();
}
Also used : StreamsBuilder(org.apache.kafka.streams.StreamsBuilder) KafkaStreams(org.apache.kafka.streams.KafkaStreams) MutableSpan(brave.handler.MutableSpan) Topology(org.apache.kafka.streams.Topology) Test(org.junit.Test)
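
send(...) and waitForStreamToRun(...) are likewise helpers of the test class. A rough sketch of the behaviour they need, assuming a KafkaProducer wired to the embedded broker; the producer field and the polling loop are illustrative, not the project's code:

// Illustrative sketches only.
void send(ProducerRecord<String, String> record) {
    producer.send(record); // 'producer' assumed to be a KafkaProducer<String, String>
    producer.flush();
}

void waitForStreamToRun(KafkaStreams streams) {
    streams.start();
    // poll until the Streams instance reports RUNNING; crude but enough for a test
    while (streams.state() != KafkaStreams.State.RUNNING) {
        try {
            Thread.sleep(100L);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return;
        }
    }
}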

Example 73 with Topology

Use of org.apache.kafka.streams.Topology in project brave by openzipkin.

In class ITKafkaStreamsTracing, method should_create_spans_and_propagate_extra_from_stream_with_multi_processor.

@Test
public void should_create_spans_and_propagate_extra_from_stream_with_multi_processor() {
    String inputTopic = testName.getMethodName() + "-input";
    String outputTopic = testName.getMethodName() + "-output";
    StreamsBuilder builder = new StreamsBuilder();
    builder.stream(inputTopic, Consumed.with(Serdes.String(), Serdes.String())).transformValues(kafkaStreamsTracing.peek("transform1", (o, o2) -> {
        TraceContext context = currentTraceContext.get();
        assertThat(BAGGAGE_FIELD.getValue(context)).isEqualTo("user1");
        BAGGAGE_FIELD.updateValue(context, "user2");
    })).transformValues(kafkaStreamsTracing.peek("transform2", (s, s2) -> {
        TraceContext context = currentTraceContext.get();
        assertThat(BAGGAGE_FIELD.getValue(context)).isEqualTo("user2");
    })).to(outputTopic, Produced.with(Serdes.String(), Serdes.String()));
    Topology topology = builder.build();
    KafkaStreams streams = buildKafkaStreams(topology);
    ProducerRecord<String, String> record = new ProducerRecord<>(inputTopic, TEST_KEY, TEST_VALUE);
    record.headers().add(BAGGAGE_FIELD_KEY, "user1".getBytes());
    send(record);
    waitForStreamToRun(streams);
    MutableSpan spanInput = testSpanHandler.takeRemoteSpan(CONSUMER);
    assertThat(spanInput.tags()).containsEntry("kafka.topic", inputTopic);
    MutableSpan spanTransform1 = testSpanHandler.takeLocalSpan();
    assertChildOf(spanTransform1, spanInput);
    MutableSpan spanTransform2 = testSpanHandler.takeLocalSpan();
    assertChildOf(spanTransform2, spanTransform1);
    MutableSpan spanOutput = testSpanHandler.takeRemoteSpan(PRODUCER);
    assertThat(spanOutput.tags()).containsEntry("kafka.topic", outputTopic);
    assertChildOf(spanOutput, spanTransform2);
    streams.close();
    streams.cleanUp();
}
Also used : StreamsBuilder(org.apache.kafka.streams.StreamsBuilder) KafkaStreams(org.apache.kafka.streams.KafkaStreams) MutableSpan(brave.handler.MutableSpan) ProducerRecord(org.apache.kafka.clients.producer.ProducerRecord) TraceContext(brave.propagation.TraceContext) Topology(org.apache.kafka.streams.Topology) Test(org.junit.Test)
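
BAGGAGE_FIELD and BAGGAGE_FIELD_KEY are constants of the test class and are not shown above. A minimal sketch of how a baggage field is typically declared and registered with Brave so that it propagates over Kafka record headers; the field name "user-id" and the builder wiring are assumptions, not the test's exact setup:

// Assumed declarations; the real names and values live in ITKafkaStreamsTracing.
static final BaggageField BAGGAGE_FIELD = BaggageField.create("user-id");
static final String BAGGAGE_FIELD_KEY = "user-id"; // header key set on the ProducerRecord

// The field must be registered with the propagation factory to ride on record headers:
Tracing tracing = Tracing.newBuilder()
    .propagationFactory(BaggagePropagation.newFactoryBuilder(B3Propagation.FACTORY)
        .add(BaggagePropagationConfig.SingleBaggageField.remote(BAGGAGE_FIELD))
        .build())
    .build();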

Example 74 with Topology

Use of org.apache.kafka.streams.Topology in project brave by openzipkin.

In class ITKafkaStreamsTracing, method should_throw_sneaky_exception_upwards.

@Test
public void should_throw_sneaky_exception_upwards() {
    TransformerSupplier<String, String, KeyValue<String, String>> transformerSupplier = kafkaStreamsTracing.transformer("sneaky-exception-transformer", () -> new Transformer<String, String, KeyValue<String, String>>() {

        ProcessorContext context;

        @Override
        public void init(ProcessorContext context) {
            this.context = context;
        }

        @Override
        public KeyValue<String, String> transform(String key, String value) {
            doThrowUnsafely(new FileNotFoundException("file-not-found"));
            return KeyValue.pair(key, value);
        }

        @Override
        public void close() {
        }
    });
    String inputTopic = testName.getMethodName() + "-input";
    String outputTopic = testName.getMethodName() + "-output";
    StreamsBuilder builder = new StreamsBuilder();
    builder.stream(inputTopic, Consumed.with(Serdes.String(), Serdes.String())).transform(transformerSupplier).to(outputTopic, Produced.with(Serdes.String(), Serdes.String()));
    Topology topology = builder.build();
    KafkaStreams streams = buildKafkaStreams(topology);
    send(new ProducerRecord<>(inputTopic, TEST_KEY, TEST_VALUE));
    waitForStreamToRun(streams);
    MutableSpan spanInput = testSpanHandler.takeRemoteSpan(CONSUMER);
    assertThat(spanInput.tags()).containsEntry("kafka.topic", inputTopic);
    MutableSpan spanProcessor = testSpanHandler.takeLocalSpan();
    assertThat(spanProcessor.error()).hasMessage("file-not-found");
    assertChildOf(spanProcessor, spanInput);
    assertThat(streams.state().isRunningOrRebalancing()).isFalse();
    streams.close();
    streams.cleanUp();
}
Also used : StreamsBuilder(org.apache.kafka.streams.StreamsBuilder) KafkaStreams(org.apache.kafka.streams.KafkaStreams) KeyValue(org.apache.kafka.streams.KeyValue) MutableSpan(brave.handler.MutableSpan) FileNotFoundException(java.io.FileNotFoundException) Topology(org.apache.kafka.streams.Topology) ProcessorContext(org.apache.kafka.streams.processor.ProcessorContext) Test(org.junit.Test)
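
doThrowUnsafely is a small test utility for throwing a checked exception without declaring it (a "sneaky throw"). A common way to write such a helper, shown here as a sketch rather than the project's exact code:

// Sneaky throw: the unchecked cast erases the checked-exception type at compile time,
// so the FileNotFoundException above can escape transform() without being declared.
@SuppressWarnings("unchecked")
static <E extends Throwable> void doThrowUnsafely(Throwable throwable) throws E {
    throw (E) throwable;
}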

Example 75 with Topology

Use of org.apache.kafka.streams.Topology in project brave by openzipkin.

In class ITKafkaStreamsTracing, method should_create_spans_from_stream_with_tracing_mark_as_not_filtered_predicate_false.

@Test
public void should_create_spans_from_stream_with_tracing_mark_as_not_filtered_predicate_false() {
    String inputTopic = testName.getMethodName() + "-input";
    String outputTopic = testName.getMethodName() + "-output";
    StreamsBuilder builder = new StreamsBuilder();
    builder.stream(inputTopic, Consumed.with(Serdes.String(), Serdes.String())).transformValues(kafkaStreamsTracing.markAsNotFiltered("filterNot-2", (key, value) -> false)).filterNot((k, v) -> Objects.isNull(v)).to(outputTopic, Produced.with(Serdes.String(), Serdes.String()));
    Topology topology = builder.build();
    KafkaStreams streams = buildKafkaStreams(topology);
    send(new ProducerRecord<>(inputTopic, TEST_KEY, TEST_VALUE));
    waitForStreamToRun(streams);
    MutableSpan spanInput = testSpanHandler.takeRemoteSpan(CONSUMER);
    assertThat(spanInput.tags()).containsEntry("kafka.topic", inputTopic);
    MutableSpan spanProcessor = testSpanHandler.takeLocalSpan();
    assertChildOf(spanProcessor, spanInput);
    assertThat(spanProcessor.tags()).containsEntry(KAFKA_STREAMS_FILTERED_TAG, "false");
    // with markAsNotFiltered, a false predicate leaves the record unfiltered (tag "false"),
    // so the value stays non-null, filterNot keeps it, and it reaches the output topic
    MutableSpan spanOutput = testSpanHandler.takeRemoteSpan(PRODUCER);
    assertThat(spanOutput.tags()).containsEntry("kafka.topic", outputTopic);
    assertChildOf(spanOutput, spanProcessor);
    streams.close();
    streams.cleanUp();
}
Also used : StreamsBuilder(org.apache.kafka.streams.StreamsBuilder) KafkaStreams(org.apache.kafka.streams.KafkaStreams) MutableSpan(brave.handler.MutableSpan) Topology(org.apache.kafka.streams.Topology) Test(org.junit.Test)

Aggregations

Topology (org.apache.kafka.streams.Topology): 127
Test (org.junit.Test): 106
StreamsBuilder (org.apache.kafka.streams.StreamsBuilder): 93
KafkaStreams (org.apache.kafka.streams.KafkaStreams): 53
TopologyTestDriver (org.apache.kafka.streams.TopologyTestDriver): 53
Properties (java.util.Properties): 47
StringSerializer (org.apache.kafka.common.serialization.StringSerializer): 46
KeyValue (org.apache.kafka.streams.KeyValue): 40
Serdes (org.apache.kafka.common.serialization.Serdes): 39
StreamsConfig (org.apache.kafka.streams.StreamsConfig): 33
List (java.util.List): 29
MutableSpan (brave.handler.MutableSpan): 28
Consumed (org.apache.kafka.streams.kstream.Consumed): 28
Produced (org.apache.kafka.streams.kstream.Produced): 26
Arrays (java.util.Arrays): 25
StringDeserializer (org.apache.kafka.common.serialization.StringDeserializer): 25
ArrayList (java.util.ArrayList): 23
ProcessorContext (org.apache.kafka.streams.processor.ProcessorContext): 23
Duration (java.time.Duration): 22
KStream (org.apache.kafka.streams.kstream.KStream): 22
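
The aggregation also shows Topology pairing frequently with TopologyTestDriver, which exercises a topology without any broker. A minimal sketch of that pattern, assuming Kafka Streams 2.4+ and an illustrative upper-casing topology (topic names and values are made up):

// Build a trivial topology and drive it with TopologyTestDriver (no broker needed).
Properties props = new Properties();
props.put(StreamsConfig.APPLICATION_ID_CONFIG, "topology-test");
props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "dummy:1234"); // unused by the test driver

StreamsBuilder builder = new StreamsBuilder();
builder.stream("input", Consumed.with(Serdes.String(), Serdes.String()))
    .mapValues(value -> value.toUpperCase())
    .to("output", Produced.with(Serdes.String(), Serdes.String()));
Topology topology = builder.build();

try (TopologyTestDriver driver = new TopologyTestDriver(topology, props)) {
    TestInputTopic<String, String> input =
        driver.createInputTopic("input", new StringSerializer(), new StringSerializer());
    TestOutputTopic<String, String> output =
        driver.createOutputTopic("output", new StringDeserializer(), new StringDeserializer());
    input.pipeInput("key", "value");
    assertThat(output.readValue()).isEqualTo("VALUE");
}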