
Example 11 with MessageAndOffset

use of kafka.message.MessageAndOffset in project metron by apache.

the class KafkaComponent method readMessages.

public List<byte[]> readMessages(String topic) {
    // Connect directly to a broker: host, port, socket timeout (ms), buffer size, client id.
    SimpleConsumer consumer = new SimpleConsumer("localhost", 6667, 100000, 64 * 1024, "consumer");
    // Fetch from partition 0, starting at offset 0, up to 100000 bytes.
    FetchRequest req = new FetchRequestBuilder().clientId("consumer").addFetch(topic, 0, 0, 100000).build();
    FetchResponse fetchResponse = consumer.fetch(req);
    Iterator<MessageAndOffset> results = fetchResponse.messageSet(topic, 0).iterator();
    List<byte[]> messages = new ArrayList<>();
    while (results.hasNext()) {
        // Copy each payload out of its ByteBuffer into a standalone byte array.
        ByteBuffer payload = results.next().message().payload();
        byte[] bytes = new byte[payload.limit()];
        payload.get(bytes);
        messages.add(bytes);
    }
    consumer.close();
    return messages;
}
Also used: FetchRequestBuilder(kafka.api.FetchRequestBuilder) FetchRequest(kafka.api.FetchRequest) ArrayList(java.util.ArrayList) FetchResponse(kafka.javaapi.FetchResponse) MessageAndOffset(kafka.message.MessageAndOffset) ByteBuffer(java.nio.ByteBuffer) SimpleConsumer(kafka.javaapi.consumer.SimpleConsumer)
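
For context, a minimal sketch of how readMessages might be consumed in a test, assuming a KafkaComponent instance is in scope and java.nio.charset.StandardCharsets is imported; the helper name and topic are illustrative, not part of the Metron source:

private List<String> readMessagesAsStrings(KafkaComponent kafka, String topic) {
    // Decode each payload as UTF-8; safe because readMessages returns standalone arrays.
    List<String> decoded = new ArrayList<>();
    for (byte[] bytes : kafka.readMessages(topic)) {
        decoded.add(new String(bytes, StandardCharsets.UTF_8));
    }
    return decoded;
}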

Example 12 with MessageAndOffset

use of kafka.message.MessageAndOffset in project apex-malhar by apache.

the class KafkaSimpleConsumer method run.

@Override
public void run() {
    long offset = 0;
    while (isAlive) {
        // Create a fetch request for topic "topic1", partition 1, the current offset, and a fetch size of 1MB.
        FetchRequest fetchRequest = new FetchRequestBuilder().clientId("default_client").addFetch("topic1", 1, offset, 1000000).build();
        // Get the message set from the consumer and print the messages out.
        ByteBufferMessageSet messages = consumer.fetch(fetchRequest).messageSet("topic1", 1);
        Iterator<MessageAndOffset> itr = messages.iterator();
        while (itr.hasNext() && isAlive) {
            MessageAndOffset msg = itr.next();
            // Advance past the consumed message; using msg.offset() here would refetch the same message on the next pass.
            offset = msg.nextOffset();
            logger.debug("consumed: {} offset: {}", byteBufferToString(msg.message().payload()), offset);
            receiveCount++;
        }
    }
}
Also used: FetchRequestBuilder(kafka.api.FetchRequestBuilder) FetchRequest(kafka.api.FetchRequest) MessageAndOffset(kafka.message.MessageAndOffset) ByteBufferMessageSet(kafka.javaapi.message.ByteBufferMessageSet)
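
One gap in the loop above: the old SimpleConsumer API reports broker-side problems through the response rather than by throwing, so production code checks for an error before iterating. A minimal sketch, assuming the same consumer, topic, and partition as above; ErrorMapping comes from kafka.common:

FetchResponse fetchResponse = consumer.fetch(fetchRequest);
if (fetchResponse.hasError()) {
    // Per-partition error code; broker-side failures do not surface as exceptions.
    short code = fetchResponse.errorCode("topic1", 1);
    if (code == ErrorMapping.OffsetOutOfRangeCode()) {
        // The requested offset no longer exists on the broker; a real consumer
        // would reset to the broker's earliest or latest offset here.
        offset = 0;
    }
} else {
    ByteBufferMessageSet messages = fetchResponse.messageSet("topic1", 1);
    // ... iterate over messages as in the loop above ...
}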

Example 13 with MessageAndOffset

use of kafka.message.MessageAndOffset in project incubator-gobblin by apache.

the class KafkaDeserializerExtractorTest method getMockMessageAndOffset.

private ByteArrayBasedKafkaRecord getMockMessageAndOffset(ByteBuffer payload) {
    // Stub the Kafka 0.8 wire objects so the test can inject an arbitrary payload.
    MessageAndOffset mockMessageAndOffset = mock(MessageAndOffset.class);
    Message mockMessage = mock(Message.class);
    when(mockMessage.payload()).thenReturn(payload);
    when(mockMessageAndOffset.message()).thenReturn(mockMessage);
    return new Kafka08ConsumerRecord(mockMessageAndOffset);
}
Also used: Message(kafka.message.Message) Kafka08ConsumerRecord(org.apache.gobblin.kafka.client.Kafka08ConsumerClient.Kafka08ConsumerRecord) MessageAndOffset(kafka.message.MessageAndOffset)
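
For context, a minimal sketch of how such a helper might be exercised in a TestNG-style test; the accessor getMessageBytes() is assumed from Gobblin's ByteArrayBasedKafkaRecord, and the test name is illustrative:

@Test
public void testMockMessagePayload() {
    byte[] expected = "hello".getBytes(StandardCharsets.UTF_8);
    ByteArrayBasedKafkaRecord record = getMockMessageAndOffset(ByteBuffer.wrap(expected));
    // Assumes the record exposes the raw payload as a byte array.
    Assert.assertEquals(record.getMessageBytes(), expected);
}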

Example 14 with MessageAndOffset

use of kafka.message.MessageAndOffset in project apache-kafka-on-k8s by banzaicloud.

the class SimpleConsumerDemo method printMessages.

private static void printMessages(ByteBufferMessageSet messageSet) throws UnsupportedEncodingException {
    for (MessageAndOffset messageAndOffset : messageSet) {
        // Copy the payload into a standalone array before decoding it as UTF-8.
        ByteBuffer payload = messageAndOffset.message().payload();
        byte[] bytes = new byte[payload.limit()];
        payload.get(bytes);
        System.out.println(new String(bytes, "UTF-8"));
    }
}
Also used: MessageAndOffset(kafka.message.MessageAndOffset) ByteBuffer(java.nio.ByteBuffer)
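
The checked UnsupportedEncodingException exists only because of the string-based charset name; a minimal equivalent sketch using java.nio.charset.StandardCharsets (available since Java 7) drops it:

private static void printMessages(ByteBufferMessageSet messageSet) {
    for (MessageAndOffset messageAndOffset : messageSet) {
        ByteBuffer payload = messageAndOffset.message().payload();
        // remaining() rather than limit() also works if the buffer's position is non-zero.
        byte[] bytes = new byte[payload.remaining()];
        payload.get(bytes);
        System.out.println(new String(bytes, StandardCharsets.UTF_8));
    }
}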

Example 15 with MessageAndOffset

use of kafka.message.MessageAndOffset in project cdap by caskdata.

the class KafkaLogProcessorPipeline method run.

@Override
protected void run() {
    runThread = Thread.currentThread();
    try {
        initializeOffsets();
        LOG.info("Kafka offsets initialized for pipeline {} as {}", name, offsets);
        Map<Integer, Future<Iterable<MessageAndOffset>>> futures = new HashMap<>();
        String topic = config.getTopic();
        lastCheckpointTime = System.currentTimeMillis();
        while (!stopped) {
            boolean hasMessageProcessed = false;
            for (Map.Entry<Integer, Future<Iterable<MessageAndOffset>>> entry : fetchAll(offsets, futures).entrySet()) {
                int partition = entry.getKey();
                try {
                    if (processMessages(topic, partition, entry.getValue())) {
                        hasMessageProcessed = true;
                    }
                } catch (IOException | KafkaException e) {
                    OUTAGE_LOG.warn("Failed to fetch or process messages from {}:{}. Will be retried in the next iteration.", topic, partition, e);
                }
            }
            long now = System.currentTimeMillis();
            unSyncedEvents += appendEvents(now, false);
            long nextCheckpointDelay = trySyncAndPersistCheckpoints(now);
            // Sleep until the earliest event in the buffer is due to be written out.
            if (!hasMessageProcessed) {
                long sleepMillis = config.getEventDelayMillis();
                if (!eventQueue.isEmpty()) {
                    sleepMillis += eventQueue.first().getTimeStamp() - now;
                }
                sleepMillis = Math.min(sleepMillis, nextCheckpointDelay);
                if (sleepMillis > 0) {
                    TimeUnit.MILLISECONDS.sleep(sleepMillis);
                }
            }
        }
    } catch (InterruptedException e) {
        // Interruption means stopping the service.
    }
}
Also used: HashMap(java.util.HashMap) Int2LongOpenHashMap(it.unimi.dsi.fastutil.ints.Int2LongOpenHashMap) Int2ObjectOpenHashMap(it.unimi.dsi.fastutil.ints.Int2ObjectOpenHashMap) MessageAndOffset(kafka.message.MessageAndOffset) IOException(java.io.IOException) Checkpoint(co.cask.cdap.logging.meta.Checkpoint) Future(java.util.concurrent.Future) KafkaException(org.apache.kafka.common.KafkaException) Int2LongMap(it.unimi.dsi.fastutil.ints.Int2LongMap) Map(java.util.Map) Int2ObjectMap(it.unimi.dsi.fastutil.ints.Int2ObjectMap)
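
The run loop exits when stopped becomes true or the thread is interrupted. A minimal sketch of the matching shutdown trigger, assuming the Guava AbstractExecutionThreadService convention that CDAP services typically follow; the exact body in CDAP may differ:

@Override
protected void triggerShutdown() {
    // Flip the flag checked by the main loop, then interrupt any blocking
    // sleep so run() observes the flag and exits promptly.
    stopped = true;
    if (runThread != null) {
        runThread.interrupt();
    }
}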

Aggregations

MessageAndOffset (kafka.message.MessageAndOffset): 42
ByteBufferMessageSet (kafka.javaapi.message.ByteBufferMessageSet): 25
ArrayList (java.util.ArrayList): 14
List (java.util.List): 13
IOException (java.io.IOException): 9
ByteBuffer (java.nio.ByteBuffer): 9
Test (org.junit.Test): 8
Message (kafka.message.Message): 7
FetchRequest (kafka.api.FetchRequest): 6
FetchRequestBuilder (kafka.api.FetchRequestBuilder): 6
FetchResponse (kafka.javaapi.FetchResponse): 6
SimpleConsumer (kafka.javaapi.consumer.SimpleConsumer): 6
Checkpoint (co.cask.cdap.logging.meta.Checkpoint): 3
HashMap (java.util.HashMap): 3
LinkedList (java.util.LinkedList): 3
Map (java.util.Map): 3
PartitionMetadata (kafka.javaapi.PartitionMetadata): 2
SchemeAsMultiScheme (org.apache.storm.spout.SchemeAsMultiScheme): 2
ILoggingEvent (ch.qos.logback.classic.spi.ILoggingEvent): 1
NotFoundException (co.cask.cdap.common.NotFoundException): 1