
Example 1 with LogAndContinueExceptionHandler

Use of org.apache.kafka.streams.errors.LogAndContinueExceptionHandler in project apache-kafka-on-k8s by banzaicloud.

From class GlobalStateTaskTest, method shouldNotThrowStreamsExceptionWhenValueDeserializationFails.

@Test
public void shouldNotThrowStreamsExceptionWhenValueDeserializationFails() throws Exception {
    final GlobalStateUpdateTask globalStateTask2 = new GlobalStateUpdateTask(topology, context, stateMgr, new LogAndContinueExceptionHandler(), logContext);
    final byte[] key = new IntegerSerializer().serialize(topic2, 1);
    final byte[] recordValue = new LongSerializer().serialize(topic2, 10L);
    maybeDeserialize(globalStateTask2, key, recordValue, false);
}
Also used : LongSerializer(org.apache.kafka.common.serialization.LongSerializer) LogAndContinueExceptionHandler(org.apache.kafka.streams.errors.LogAndContinueExceptionHandler) IntegerSerializer(org.apache.kafka.common.serialization.IntegerSerializer) Test(org.junit.Test)
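
The maybeDeserialize helper is not shown on this page. A minimal sketch of what such a helper might look like, assuming it feeds one record into the task and asserts whether a StreamsException escapes (the partition/offset values and the use of org.junit.Assert.fail are assumptions, not part of the original source):

private void maybeDeserialize(final GlobalStateUpdateTask globalStateTask,
                              final byte[] key,
                              final byte[] recordValue,
                              final boolean failExpected) {
    // partition and offset are illustrative; the task is assumed to be initialized in the test setup
    final ConsumerRecord<byte[], byte[]> record = new ConsumerRecord<>(topic2, 1, 1, key, recordValue);
    try {
        globalStateTask.update(record);
        if (failExpected) {
            fail("Should have thrown a StreamsException");
        }
    } catch (final StreamsException e) {
        if (!failExpected) {
            fail("Should not have thrown a StreamsException");
        }
    }
}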

Example 2 with LogAndContinueExceptionHandler

Use of org.apache.kafka.streams.errors.LogAndContinueExceptionHandler in project apache-kafka-on-k8s by banzaicloud.

From class RecordQueueTest, method shouldDropOnNegativeTimestamp.

@Test
public void shouldDropOnNegativeTimestamp() {
    final List<ConsumerRecord<byte[], byte[]>> records = Collections.singletonList(
        new ConsumerRecord<>("topic", 1, 1, -1L, TimestampType.CREATE_TIME, 0L, 0, 0, recordKey, recordValue));
    final RecordQueue queue = new RecordQueue(
        new TopicPartition(topics[0], 1),
        new MockSourceNode<>(topics, intDeserializer, intDeserializer),
        new LogAndSkipOnInvalidTimestamp(),
        new LogAndContinueExceptionHandler(),
        null,
        new LogContext());
    queue.addRawRecords(records);
    assertEquals(0, queue.size());
}
Also used : TopicPartition(org.apache.kafka.common.TopicPartition) LogAndSkipOnInvalidTimestamp(org.apache.kafka.streams.processor.LogAndSkipOnInvalidTimestamp) LogContext(org.apache.kafka.common.utils.LogContext) LogAndContinueExceptionHandler(org.apache.kafka.streams.errors.LogAndContinueExceptionHandler) ConsumerRecord(org.apache.kafka.clients.consumer.ConsumerRecord) Test(org.junit.Test)
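
Outside of this test, the same drop-instead-of-fail behavior is normally selected through configuration rather than by constructing a RecordQueue by hand. A minimal sketch, assuming a standard org.apache.kafka.streams.StreamsConfig setup with java.util.Properties (application id and bootstrap servers are placeholders):

final Properties props = new Properties();
props.put(StreamsConfig.APPLICATION_ID_CONFIG, "example-app");        // placeholder
props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");  // placeholder
// skip records whose timestamp is invalid (for example negative) instead of failing
props.put(StreamsConfig.DEFAULT_TIMESTAMP_EXTRACTOR_CLASS_CONFIG, LogAndSkipOnInvalidTimestamp.class);
// log and skip records that cannot be deserialized instead of failing
props.put(StreamsConfig.DEFAULT_DESERIALIZATION_EXCEPTION_HANDLER_CLASS_CONFIG, LogAndContinueExceptionHandler.class);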

Example 3 with LogAndContinueExceptionHandler

Use of org.apache.kafka.streams.errors.LogAndContinueExceptionHandler in project kafka by apache.

From class GlobalStateTaskTest, method shouldNotThrowStreamsExceptionWhenKeyDeserializationFailsWithSkipHandler.

@Test
public void shouldNotThrowStreamsExceptionWhenKeyDeserializationFailsWithSkipHandler() {
    final GlobalStateUpdateTask globalStateTask2 = new GlobalStateUpdateTask(logContext, topology, context, stateMgr, new LogAndContinueExceptionHandler());
    final byte[] key = new LongSerializer().serialize(topic2, 1L);
    final byte[] recordValue = new IntegerSerializer().serialize(topic2, 10);
    maybeDeserialize(globalStateTask2, key, recordValue, false);
}
Also used : LongSerializer(org.apache.kafka.common.serialization.LongSerializer) LogAndContinueExceptionHandler(org.apache.kafka.streams.errors.LogAndContinueExceptionHandler) IntegerSerializer(org.apache.kafka.common.serialization.IntegerSerializer) Test(org.junit.Test)
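
The deserialization failure here comes from a size mismatch: the key is serialized as an 8-byte long, but the source node's IntegerDeserializer expects exactly 4 bytes and throws a SerializationException, which LogAndContinueExceptionHandler turns into a skipped record rather than a StreamsException. A standalone illustration of the mismatch (the topic name is arbitrary):

final byte[] key = new LongSerializer().serialize("any-topic", 1L);  // 8 bytes
// throws org.apache.kafka.common.errors.SerializationException because the payload is not 4 bytes
new IntegerDeserializer().deserialize("any-topic", key);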

Example 4 with LogAndContinueExceptionHandler

Use of org.apache.kafka.streams.errors.LogAndContinueExceptionHandler in project apache-kafka-on-k8s by banzaicloud.

From class GlobalStateTaskTest, method shouldNotThrowStreamsExceptionWhenKeyDeserializationFailsWithSkipHandler.

@Test
public void shouldNotThrowStreamsExceptionWhenKeyDeserializationFailsWithSkipHandler() throws Exception {
    final GlobalStateUpdateTask globalStateTask2 = new GlobalStateUpdateTask(topology, context, stateMgr, new LogAndContinueExceptionHandler(), logContext);
    final byte[] key = new LongSerializer().serialize(topic2, 1L);
    final byte[] recordValue = new IntegerSerializer().serialize(topic2, 10);
    maybeDeserialize(globalStateTask2, key, recordValue, false);
}
Also used : LongSerializer(org.apache.kafka.common.serialization.LongSerializer) LogAndContinueExceptionHandler(org.apache.kafka.streams.errors.LogAndContinueExceptionHandler) IntegerSerializer(org.apache.kafka.common.serialization.IntegerSerializer) Test(org.junit.Test)

Example 5 with LogAndContinueExceptionHandler

Use of org.apache.kafka.streams.errors.LogAndContinueExceptionHandler in project apache-kafka-on-k8s by banzaicloud.

From class RecordQueueTest, method shouldThrowOnNegativeTimestamp.

@Test(expected = StreamsException.class)
public void shouldThrowOnNegativeTimestamp() {
    final List<ConsumerRecord<byte[], byte[]>> records = Collections.singletonList(
        new ConsumerRecord<>("topic", 1, 1, -1L, TimestampType.CREATE_TIME, 0L, 0, 0, recordKey, recordValue));
    final RecordQueue queue = new RecordQueue(
        new TopicPartition(topics[0], 1),
        new MockSourceNode<>(topics, intDeserializer, intDeserializer),
        new FailOnInvalidTimestamp(),
        new LogAndContinueExceptionHandler(),
        null,
        new LogContext());
    queue.addRawRecords(records);
}
Also used : TopicPartition(org.apache.kafka.common.TopicPartition) FailOnInvalidTimestamp(org.apache.kafka.streams.processor.FailOnInvalidTimestamp) LogContext(org.apache.kafka.common.utils.LogContext) LogAndContinueExceptionHandler(org.apache.kafka.streams.errors.LogAndContinueExceptionHandler) ConsumerRecord(org.apache.kafka.clients.consumer.ConsumerRecord) Test(org.junit.Test)
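
The only change from Example 2 is the timestamp extractor: FailOnInvalidTimestamp (the default extractor) raises a StreamsException for the negative timestamp instead of dropping the record. In application code the same choice is made through configuration; a one-line sketch, reusing the props object assumed in the sketch under Example 2:

// fail fast on invalid (for example negative) record timestamps instead of skipping them
props.put(StreamsConfig.DEFAULT_TIMESTAMP_EXTRACTOR_CLASS_CONFIG, FailOnInvalidTimestamp.class);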

Aggregations

LogAndContinueExceptionHandler (org.apache.kafka.streams.errors.LogAndContinueExceptionHandler) 9
Test (org.junit.Test) 8
TopicPartition (org.apache.kafka.common.TopicPartition) 5
ConsumerRecord (org.apache.kafka.clients.consumer.ConsumerRecord) 4
IntegerSerializer (org.apache.kafka.common.serialization.IntegerSerializer) 4
LongSerializer (org.apache.kafka.common.serialization.LongSerializer) 4
LogContext (org.apache.kafka.common.utils.LogContext) 4
RecordHeaders (org.apache.kafka.common.header.internals.RecordHeaders) 2
FailOnInvalidTimestamp (org.apache.kafka.streams.processor.FailOnInvalidTimestamp) 2
LogAndSkipOnInvalidTimestamp (org.apache.kafka.streams.processor.LogAndSkipOnInvalidTimestamp) 2
InternalMockProcessorContext (org.apache.kafka.test.InternalMockProcessorContext) 2
AtomicLong (java.util.concurrent.atomic.AtomicLong) 1
MockConsumer (org.apache.kafka.clients.consumer.MockConsumer) 1
PartitionInfo (org.apache.kafka.common.PartitionInfo) 1
StreamsException (org.apache.kafka.streams.errors.StreamsException) 1
GlobalProcessorContextImpl (org.apache.kafka.streams.processor.internals.GlobalProcessorContextImpl) 1
GlobalStateManagerImpl (org.apache.kafka.streams.processor.internals.GlobalStateManagerImpl) 1
GlobalStateUpdateTask (org.apache.kafka.streams.processor.internals.GlobalStateUpdateTask) 1
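
Both built-in handlers, LogAndContinueExceptionHandler and LogAndFailExceptionHandler, implement the DeserializationExceptionHandler interface, so a custom variant can be dropped in through the same DEFAULT_DESERIALIZATION_EXCEPTION_HANDLER_CLASS_CONFIG setting. A minimal sketch that mirrors the log-and-skip behavior, assuming the pre-3.3 handle(ProcessorContext, ConsumerRecord, Exception) signature used by the Kafka versions shown on this page (the class name is hypothetical):

import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.streams.errors.DeserializationExceptionHandler;
import org.apache.kafka.streams.processor.ProcessorContext;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

// Hypothetical class name; mirrors the skip-on-error behavior of LogAndContinueExceptionHandler.
public class CustomLogAndContinueHandler implements DeserializationExceptionHandler {

    private static final Logger log = LoggerFactory.getLogger(CustomLogAndContinueHandler.class);

    @Override
    public DeserializationHandlerResponse handle(final ProcessorContext context,
                                                 final ConsumerRecord<byte[], byte[]> record,
                                                 final Exception exception) {
        // log the bad record and keep processing, as LogAndContinueExceptionHandler does
        log.warn("Skipping record due to deserialization error: topic={} partition={} offset={}",
                 record.topic(), record.partition(), record.offset(), exception);
        return DeserializationHandlerResponse.CONTINUE;
    }

    @Override
    public void configure(final Map<String, ?> configs) {
        // no extra configuration in this sketch
    }
}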