
Example 46 with KafkaException

Use of org.apache.kafka.common.KafkaException in project storm by apache.

From the class KafkaBoltTest, method testCustomCallbackIsWrappedByDefaultCallbackBehavior.

@Test
public void testCustomCallbackIsWrappedByDefaultCallbackBehavior() {
    MockProducer<String, String> producer = new MockProducer<>(Cluster.empty(), false, null, null, null);
    KafkaBolt<String, String> bolt = makeBolt(producer);
    PreparableCallback customCallback = mock(PreparableCallback.class);
    bolt.withProducerCallback(customCallback);
    OutputCollector collector = mock(OutputCollector.class);
    TopologyContext context = mock(TopologyContext.class);
    Map<String, Object> topoConfig = new HashMap<>();
    bolt.prepare(topoConfig, context, collector);
    verify(customCallback).prepare(topoConfig, context);
    String key = "KEY";
    String value = "VALUE";
    Tuple testTuple = createTestTuple(key, value);
    bolt.execute(testTuple);
    assertThat(producer.history().size(), is(1));
    ProducerRecord<String, String> arg = producer.history().get(0);
    LOG.info("GOT {} ->", arg);
    LOG.info("{}, {}, {}", arg.topic(), arg.key(), arg.value());
    assertThat(arg.topic(), is("MY_TOPIC"));
    assertThat(arg.key(), is(key));
    assertThat(arg.value(), is(value));
    // Force a send error
    KafkaException ex = new KafkaException();
    producer.errorNext(ex);
    verify(customCallback).onCompletion(any(), eq(ex));
    verify(collector).reportError(ex);
    verify(collector).fail(testTuple);
}
Also used: OutputCollector (org.apache.storm.task.OutputCollector), MockProducer (org.apache.kafka.clients.producer.MockProducer), HashMap (java.util.HashMap), KafkaException (org.apache.kafka.common.KafkaException), TopologyContext (org.apache.storm.task.TopologyContext), Tuple (org.apache.storm.tuple.Tuple), Test (org.junit.Test)

Example 47 with KafkaException

Use of org.apache.kafka.common.KafkaException in project storm by apache.

From the class KafkaBoltTest, method testSimpleWithError.

@Test
public void testSimpleWithError() {
    MockProducer<String, String> producer = new MockProducer<>(Cluster.empty(), false, null, null, null);
    KafkaBolt<String, String> bolt = makeBolt(producer);
    OutputCollector collector = mock(OutputCollector.class);
    TopologyContext context = mock(TopologyContext.class);
    Map<String, Object> conf = new HashMap<>();
    bolt.prepare(conf, context, collector);
    String key = "KEY";
    String value = "VALUE";
    Tuple testTuple = createTestTuple(key, value);
    bolt.execute(testTuple);
    assertThat(producer.history().size(), is(1));
    ProducerRecord<String, String> arg = producer.history().get(0);
    LOG.info("GOT {} ->", arg);
    LOG.info("{}, {}, {}", arg.topic(), arg.key(), arg.value());
    assertThat(arg.topic(), is("MY_TOPIC"));
    assertThat(arg.key(), is(key));
    assertThat(arg.value(), is(value));
    // Force a send error
    KafkaException ex = new KafkaException();
    producer.errorNext(ex);
    verify(collector).reportError(ex);
    verify(collector).fail(testTuple);
}
Also used: OutputCollector (org.apache.storm.task.OutputCollector), MockProducer (org.apache.kafka.clients.producer.MockProducer), HashMap (java.util.HashMap), KafkaException (org.apache.kafka.common.KafkaException), TopologyContext (org.apache.storm.task.TopologyContext), Tuple (org.apache.storm.tuple.Tuple), Test (org.junit.Test)
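
Both Storm tests above drive the failure path the same way: the MockProducer is constructed with autoComplete set to false, so every send stays pending until the test explicitly completes or fails it with errorNext. A minimal standalone sketch of just that mechanism, using only the public MockProducer API (the topic, key, and value strings are illustrative, and non-null StringSerializers are passed where the tests above pass null):

import java.util.concurrent.Future;
import org.apache.kafka.clients.producer.MockProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;
import org.apache.kafka.common.Cluster;
import org.apache.kafka.common.KafkaException;
import org.apache.kafka.common.serialization.StringSerializer;

public class MockProducerErrorSketch {
    public static void main(String[] args) {
        // autoComplete = false keeps each send pending until completeNext() or errorNext() is called
        MockProducer<String, String> producer = new MockProducer<>(
                Cluster.empty(), false, null, new StringSerializer(), new StringSerializer());
        Future<RecordMetadata> future = producer.send(
                new ProducerRecord<>("MY_TOPIC", "KEY", "VALUE"),
                (metadata, exception) -> System.out.println("callback saw: " + exception));
        // Fail the oldest pending send: the callback fires with the exception
        // and the returned future completes exceptionally
        boolean failed = producer.errorNext(new KafkaException("simulated send failure"));
        System.out.println("failed a pending send: " + failed + ", future done: " + future.isDone());
    }
}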

Example 48 with KafkaException

Use of org.apache.kafka.common.KafkaException in project kafka by apache.

From the class Sanitizer, method sanitize.

/**
 * Sanitize `name` for safe use as JMX metric name as well as ZooKeeper node name
 * using URL-encoding.
 */
public static String sanitize(String name) {
    String encoded = "";
    try {
        encoded = URLEncoder.encode(name, StandardCharsets.UTF_8.name());
        StringBuilder builder = new StringBuilder();
        for (int i = 0; i < encoded.length(); i++) {
            char c = encoded.charAt(i);
            if (c == '*') {
                // Metric ObjectName treats * as pattern
                builder.append("%2A");
            } else if (c == '+') {
                // Space URL-encoded as +, replace with percent encoding
                builder.append("%20");
            } else {
                builder.append(c);
            }
        }
        return builder.toString();
    } catch (UnsupportedEncodingException e) {
        throw new KafkaException(e);
    }
}
Also used: UnsupportedEncodingException (java.io.UnsupportedEncodingException), KafkaException (org.apache.kafka.common.KafkaException)
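
A brief usage sketch of the method above; Sanitizer here is assumed to be org.apache.kafka.common.utils.Sanitizer, and the input string is made up to exercise both special cases (the '*' and the spaces):

import org.apache.kafka.common.utils.Sanitizer;

public class SanitizerSketch {
    public static void main(String[] args) {
        // URLEncoder leaves '*' unchanged and turns spaces into '+', so both
        // end up percent-encoded by the loop above
        String raw = "client*id with spaces";
        // Prints: client%2Aid%20with%20spaces
        System.out.println(Sanitizer.sanitize(raw));
    }
}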

Example 49 with KafkaException

Use of org.apache.kafka.common.KafkaException in project kafka by apache.

From the class FileRecordsTest, method testTruncateNotCalledIfSizeIsBiggerThanTargetSize.

/**
 * Expect a KafkaException if targetSize is bigger than the size of
 * the FileRecords.
 */
@Test
public void testTruncateNotCalledIfSizeIsBiggerThanTargetSize() throws IOException {
    FileChannel channelMock = mock(FileChannel.class);
    when(channelMock.size()).thenReturn(42L);
    FileRecords fileRecords = new FileRecords(tempFile(), channelMock, 0, Integer.MAX_VALUE, false);
    try {
        fileRecords.truncateTo(43);
        fail("Should throw KafkaException");
    } catch (KafkaException e) {
        // expected: truncateTo must reject a targetSize larger than the current file size
    }
    verify(channelMock, atLeastOnce()).size();
}
Also used: FileChannel (java.nio.channels.FileChannel), KafkaException (org.apache.kafka.common.KafkaException), Test (org.junit.jupiter.api.Test)
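
Because this test already uses org.junit.jupiter.api.Test (JUnit 5), the same expectation can be written more compactly with Assertions.assertThrows instead of the try/fail/catch pattern. A sketch of that variant, assumed to live in the same test class so it can reuse tempFile() and the Mockito helpers shown above; the method name is illustrative:

@Test
public void testTruncateToLargerThanSizeThrows() throws IOException {
    FileChannel channelMock = mock(FileChannel.class);
    when(channelMock.size()).thenReturn(42L);
    FileRecords fileRecords = new FileRecords(tempFile(), channelMock, 0, Integer.MAX_VALUE, false);
    // assertThrows fails the test if no KafkaException is thrown
    assertThrows(KafkaException.class, () -> fileRecords.truncateTo(43));
    verify(channelMock, atLeastOnce()).size();
}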

Example 50 with KafkaException

Use of org.apache.kafka.common.KafkaException in project kafka by apache.

From the class WorkerGroupMember, method stop.

private void stop(boolean swallowException) {
    log.trace("Stopping the Connect group member.");
    AtomicReference<Throwable> firstException = new AtomicReference<>();
    this.stopped = true;
    Utils.closeQuietly(coordinator, "coordinator", firstException);
    Utils.closeQuietly(metrics, "consumer metrics", firstException);
    Utils.closeQuietly(client, "consumer network client", firstException);
    AppInfoParser.unregisterAppInfo(JMX_PREFIX, clientId, metrics);
    if (firstException.get() != null && !swallowException)
        throw new KafkaException("Failed to stop the Connect group member", firstException.get());
    else
        log.debug("The Connect group member has stopped.");
}
Also used: AtomicReference (java.util.concurrent.atomic.AtomicReference), KafkaException (org.apache.kafka.common.KafkaException)
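
The stop method above follows a shutdown pattern that appears throughout the Kafka codebase: close every resource quietly, remember only the first failure, and decide at the end whether to rethrow it wrapped in a KafkaException. A self-contained sketch of that pattern; the closeQuietly helper below is a simplified stand-in for Utils.closeQuietly so the example compiles without the Connect classes:

import java.util.concurrent.atomic.AtomicReference;
import org.apache.kafka.common.KafkaException;

public class CloseQuietlySketch {

    // Simplified stand-in for Utils.closeQuietly: swallow the failure but record
    // the first one so later failures do not mask it
    static void closeQuietly(AutoCloseable closeable, String name, AtomicReference<Throwable> firstException) {
        if (closeable == null)
            return;
        try {
            closeable.close();
        } catch (Throwable t) {
            firstException.compareAndSet(null, t);
        }
    }

    static void stopAll(boolean swallowException, AutoCloseable... resources) {
        AtomicReference<Throwable> firstException = new AtomicReference<>();
        for (int i = 0; i < resources.length; i++)
            closeQuietly(resources[i], "resource-" + i, firstException);
        // Only after everything has been closed is the first failure rethrown, if requested
        if (firstException.get() != null && !swallowException)
            throw new KafkaException("Failed to stop cleanly", firstException.get());
    }

    public static void main(String[] args) {
        AutoCloseable ok = () -> System.out.println("closed ok");
        AutoCloseable broken = () -> {
            throw new IllegalStateException("close failed");
        };
        // swallowException = true: both resources are closed, the failure is recorded but not rethrown
        stopAll(true, ok, broken);
        // swallowException = false: the recorded failure is wrapped in a KafkaException and rethrown
        try {
            stopAll(false, ok, broken);
        } catch (KafkaException e) {
            System.out.println("rethrown: " + e.getCause());
        }
    }
}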

Aggregations

KafkaException (org.apache.kafka.common.KafkaException): 262 usages
Test (org.junit.Test): 69 usages
TopicPartition (org.apache.kafka.common.TopicPartition): 56 usages
Test (org.junit.jupiter.api.Test): 47 usages
HashMap (java.util.HashMap): 40 usages
IOException (java.io.IOException): 39 usages
StreamsException (org.apache.kafka.streams.errors.StreamsException): 34 usages
Map (java.util.Map): 32 usages
TimeoutException (org.apache.kafka.common.errors.TimeoutException): 28 usages
ArrayList (java.util.ArrayList): 27 usages
List (java.util.List): 21 usages
ByteBuffer (java.nio.ByteBuffer): 19 usages
ExecutionException (java.util.concurrent.ExecutionException): 19 usages
ConfigException (org.apache.kafka.common.config.ConfigException): 16 usages
TopicAuthorizationException (org.apache.kafka.common.errors.TopicAuthorizationException): 14 usages
HashSet (java.util.HashSet): 13 usages
Properties (java.util.Properties): 13 usages
Set (java.util.Set): 11 usages
Collectors (java.util.stream.Collectors): 11 usages
RecordMetadata (org.apache.kafka.clients.producer.RecordMetadata): 11 usages