Search in sources:

Example 1 with ConsumerClient

Use of com.ibm.streamsx.kafka.clients.consumer.ConsumerClient in project streamsx.kafka by IBMStreams.

The class AbstractKafkaConsumerOperator, method drain().

@Override
public void drain() throws Exception {
    logger.log(DEBUG_LEVEL, ">>> DRAIN"); //$NON-NLS-1$
    long before = System.currentTimeMillis();
    final ConsumerClient consumer = consumerRef.get();
    if (consumer.isProcessing()) {
        consumer.onDrain();
    }
    // When a checkpoint is to be created, the operator must stop submitting tuples, i.e. stop pulling messages out of the messageQueue.
    // This is achieved by acquiring a permit. In the background, a receive thread keeps pushing more messages into the queue,
    // incrementing the read offset.
    // For every tuple that is submitted, its next offset is stored in a data structure (the offset manager).
    // On checkpoint, the offset manager is saved. On reset of the consistent region (CR), the consumer starts reading at these
    // previously saved offsets, re-reading the messages received since the last checkpoint.
    long after = System.currentTimeMillis();
    final long duration = after - before;
    getOperatorContext().getMetrics().getCustomMetric(ConsumerClient.DRAIN_TIME_MILLIS_METRIC_NAME).setValue(duration);
    if (duration > maxDrainMillis) {
        getOperatorContext().getMetrics().getCustomMetric(ConsumerClient.DRAIN_TIME_MILLIS_MAX_METRIC_NAME).setValue(duration);
        maxDrainMillis = duration;
    }
    logger.log(DEBUG_LEVEL, ">>> DRAIN took " + duration + " ms");
}
Also used: DummyConsumerClient(com.ibm.streamsx.kafka.clients.consumer.DummyConsumerClient) NonCrKafkaConsumerClient(com.ibm.streamsx.kafka.clients.consumer.NonCrKafkaConsumerClient) CrKafkaStaticAssignConsumerClient(com.ibm.streamsx.kafka.clients.consumer.CrKafkaStaticAssignConsumerClient) ConsumerClient(com.ibm.streamsx.kafka.clients.consumer.ConsumerClient)
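
The comments inside drain() outline the offset-manager mechanism: every submitted tuple records the next offset to read per topic partition, and that map is what gets checkpointed so a reset can resume exactly where the last checkpoint left off. The following is a minimal, self-contained sketch of that idea; OffsetSketch and its methods are illustrative names, not the operator's actual OffsetManager API.

import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of the "offset manager" idea from the drain() comments:
// for every submitted tuple, remember the next offset to fetch per topic partition,
// so a checkpoint can persist the map and a reset can resume from it.
public class OffsetSketch implements Serializable {

    private static final long serialVersionUID = 1L;

    // key: "topic-partition", value: next offset to read after the last submitted tuple
    private final Map<String, Long> nextOffsets = new HashMap<>();

    // called after a tuple for (topic, partition, offset) has been submitted
    public synchronized void savePosition(String topic, int partition, long offset) {
        nextOffsets.put(topic + "-" + partition, offset + 1);
    }

    // on reset, the consumer would seek() each partition to this value
    public synchronized long positionToResume(String topic, int partition) {
        return nextOffsets.getOrDefault(topic + "-" + partition, 0L);
    }

    public static void main(String[] args) {
        OffsetSketch mgr = new OffsetSketch();
        mgr.savePosition("orders", 0, 41L);                     // tuple with offset 41 submitted
        System.out.println(mgr.positionToResume("orders", 0));  // 42: re-read from here after a reset
    }
}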

Example 2 with ConsumerClient

Use of com.ibm.streamsx.kafka.clients.consumer.ConsumerClient in project streamsx.kafka by IBMStreams.

The class AbstractKafkaConsumerOperator, method reset().

@Override
public void reset(Checkpoint checkpoint) throws Exception {
    final int attempt = crContext == null ? -1 : crContext.getResetAttempt();
    final long sequenceId = checkpoint.getSequenceId();
    logger.log(DEBUG_LEVEL, MsgFormatter.format(">>> RESET (ckpt id/attempt={0,number,#}/{1})", sequenceId, (crContext == null ? "-" : "" + attempt)));
    final long before = System.currentTimeMillis();
    try {
        final ObjectInputStream inputStream = checkpoint.getInputStream();
        final int chkptMagic = inputStream.readInt();
        logger.info("magic read from checkpoint: " + chkptMagic);
        ConsumerClient consumer = consumerRef.get();
        if (chkptMagic == consumer.getImplementationMagic()) {
            logger.info("checkpoint fits current ConsumerClient implementation.");
        } else {
            logger.info("checkpoint does not fit current ConsumerClient implementation. Building matching client ...");
            if (consumer.isProcessing()) {
                consumer.onShutdown(SHUTDOWN_TIMEOUT, SHUTDOWN_TIMEOUT_TIMEUNIT);
            }
            final ConsumerClientBuilder builder = magics.get(chkptMagic);
            final ConsumerClient newClient = builder.build();
            if (consumerRef.compareAndSet(consumer, newClient)) {
                try {
                    newClient.startConsumer();
                    logger.info(MsgFormatter.format("consumer client implementation {0} replaced by {1}", consumer.getClass().getName(), newClient.getClass().getName()));
                } catch (KafkaClientInitializationException e) {
                    logger.error(e.getLocalizedMessage(), e);
                    logger.error("root cause: " + e.getRootCause());
                    throw new KafkaOperatorResetFailedException("consumer client replacement failed", e);
                }
            } else {
                if (consumerRef.get().getImplementationMagic() != chkptMagic) {
                    logger.warn(MsgFormatter.format("consumer client replacement failed"));
                    throw new KafkaOperatorResetFailedException("consumer client replacement failed");
                }
            }
        }
        consumer = consumerRef.get();
        if (consumer.isProcessing()) {
            // it is up to the consumer client implementation to stop polling.
            consumer.onReset(checkpoint);
        }
    } catch (InterruptedException e) {
        logger.log(DEBUG_LEVEL, ">>> RESET interrupted");
        return;
    } finally {
        // resettingLatch can be null when the reset was triggered by another PE while this PE was not relaunched,
        // i.e. when relaunch count == 0 in initialize(context)
        if (resettingLatch != null)
            resettingLatch.countDown();
        final long after = System.currentTimeMillis();
        final long duration = after - before;
        logger.log(DEBUG_LEVEL, MsgFormatter.format(">>> RESET took {0,number,#} ms (ckpt id/attempt={1,number,#}/{2,number,#})", duration, sequenceId, attempt));
    }
}
Also used: KafkaClientInitializationException(com.ibm.streamsx.kafka.KafkaClientInitializationException) DummyConsumerClient(com.ibm.streamsx.kafka.clients.consumer.DummyConsumerClient) NonCrKafkaConsumerClient(com.ibm.streamsx.kafka.clients.consumer.NonCrKafkaConsumerClient) CrKafkaStaticAssignConsumerClient(com.ibm.streamsx.kafka.clients.consumer.CrKafkaStaticAssignConsumerClient) ConsumerClient(com.ibm.streamsx.kafka.clients.consumer.ConsumerClient) KafkaOperatorResetFailedException(com.ibm.streamsx.kafka.KafkaOperatorResetFailedException) Checkpoint(com.ibm.streams.operator.state.Checkpoint) ObjectInputStream(java.io.ObjectInputStream) ConsumerClientBuilder(com.ibm.streamsx.kafka.clients.consumer.ConsumerClientBuilder)
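
reset() reads an "implementation magic" value from the checkpoint and, when it does not match the running client, looks up a matching ConsumerClientBuilder in the magics map and publishes the rebuilt client with compareAndSet. Below is a compact sketch of that registry-plus-CAS pattern, assuming simplified, hypothetical Client and ClientBuilder interfaces rather than the real ConsumerClient API.

import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.atomic.AtomicReference;

// Hypothetical sketch of the magic-to-builder registry used in reset():
// each client implementation carries an "implementation magic"; the checkpoint stores
// the magic of the client that wrote it, and on reset a matching client is rebuilt.
public class ClientRegistrySketch {

    interface Client { int implementationMagic(); }
    interface ClientBuilder { Client build(); }

    private final Map<Integer, ClientBuilder> magics = new HashMap<>();
    private final AtomicReference<Client> clientRef = new AtomicReference<>();

    void register(int magic, ClientBuilder builder) { magics.put(magic, builder); }

    // rebuild and publish a client matching the checkpoint's magic if the current one does not fit
    void resetTo(int checkpointMagic) {
        Client current = clientRef.get();
        if (current != null && current.implementationMagic() == checkpointMagic) {
            return; // the checkpoint fits the current implementation
        }
        Client replacement = magics.get(checkpointMagic).build();
        if (!clientRef.compareAndSet(current, replacement)
                && clientRef.get().implementationMagic() != checkpointMagic) {
            // another thread installed a client, and it still does not match: give up
            throw new IllegalStateException("client replacement failed");
        }
    }

    public static void main(String[] args) {
        ClientRegistrySketch registry = new ClientRegistrySketch();
        registry.register(1, () -> () -> 1); // builder for a client with magic 1
        registry.register(2, () -> () -> 2); // builder for a client with magic 2
        registry.resetTo(2);
        System.out.println(registry.clientRef.get().implementationMagic()); // prints 2
    }
}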

Example 3 with ConsumerClient

Use of com.ibm.streamsx.kafka.clients.consumer.ConsumerClient in project streamsx.kafka by IBMStreams.

The class AbstractKafkaConsumerOperator, method shutdown().

/**
 * Shutdown this operator, which will interrupt the thread executing the
 * <code>produceTuples()</code> method.
 *
 * @throws Exception
 *             Operator failure, will cause the enclosing PE to terminate.
 */
public void shutdown() throws Exception {
    synchronized (monitor) {
        final OperatorContext context = getOperatorContext();
        logger.info("Operator " + context.getName() + " shutting down in PE: " //$NON-NLS-1$ //$NON-NLS-2$
                + context.getPE().getPEId() + " in Job: " + context.getPE().getJobId()); //$NON-NLS-1$
        shutdown.set(true);
        if (processThreadEndedLatch != null) {
            processThreadEndedLatch.await(SHUTDOWN_TIMEOUT, SHUTDOWN_TIMEOUT_TIMEUNIT);
            processThreadEndedLatch = null;
        }
        final ConsumerClient consumer = consumerRef.get();
        if (consumer.isProcessing()) {
            consumer.onShutdown(SHUTDOWN_TIMEOUT, SHUTDOWN_TIMEOUT_TIMEUNIT);
        }
        logger.info("Operator " + context.getName() + ": shutdown done");
        // Must call super.shutdown()
        super.shutdown();
    }
}
Also used: DummyConsumerClient(com.ibm.streamsx.kafka.clients.consumer.DummyConsumerClient) NonCrKafkaConsumerClient(com.ibm.streamsx.kafka.clients.consumer.NonCrKafkaConsumerClient) CrKafkaStaticAssignConsumerClient(com.ibm.streamsx.kafka.clients.consumer.CrKafkaStaticAssignConsumerClient) ConsumerClient(com.ibm.streamsx.kafka.clients.consumer.ConsumerClient) OperatorContext(com.ibm.streams.operator.OperatorContext)
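
shutdown() follows a common graceful-shutdown handshake: an AtomicBoolean tells the processing thread to stop, and a CountDownLatch, awaited with a timeout, confirms that the thread has actually terminated before the consumer client itself is shut down. A generic sketch of that handshake, independent of the Streams operator API, might look like this:

import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicBoolean;

// Generic sketch of the shutdown handshake used in shutdown():
// a flag tells the processing thread to stop, a latch lets the caller
// wait (bounded by a timeout) until that thread has really finished.
public class ShutdownSketch {

    private final AtomicBoolean shutdown = new AtomicBoolean(false);
    private final CountDownLatch processThreadEnded = new CountDownLatch(1);

    void processLoop() {
        try {
            while (!shutdown.get()) {
                // ... poll the message queue and submit tuples ...
                try {
                    Thread.sleep(50);
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return;
                }
            }
        } finally {
            processThreadEnded.countDown(); // signal "I have terminated"
        }
    }

    void shutdown(long timeout, TimeUnit unit) throws InterruptedException {
        shutdown.set(true);
        // bounded wait, mirroring processThreadEndedLatch.await(SHUTDOWN_TIMEOUT, SHUTDOWN_TIMEOUT_TIMEUNIT)
        processThreadEnded.await(timeout, unit);
    }

    public static void main(String[] args) throws Exception {
        ShutdownSketch s = new ShutdownSketch();
        Thread t = new Thread(s::processLoop);
        t.start();
        Thread.sleep(200);
        s.shutdown(5, TimeUnit.SECONDS);
        t.join();
        System.out.println("processing thread terminated: " + !t.isAlive());
    }
}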

Example 4 with ConsumerClient

Use of com.ibm.streamsx.kafka.clients.consumer.ConsumerClient in project streamsx.kafka by IBMStreams.

The class AbstractKafkaConsumerOperator, method processPunctuation().

/**
 * @see com.ibm.streams.operator.AbstractOperator#processPunctuation(com.ibm.streams.operator.StreamingInput, com.ibm.streams.operator.StreamingData.Punctuation)
 */
@Override
public void processPunctuation(StreamingInput<Tuple> stream, Punctuation mark) throws Exception {
    if (mark == Punctuation.FINAL_MARKER) {
        synchronized (monitor) {
            logger.fatal("Final Marker received at input port. Tuple submission is stopped. Stop fetching records.");
            // make the processThread - the thread that submits tuples and initiates offset commit - terminate
            shutdown.set(true);
            if (processThreadEndedLatch != null) {
                processThreadEndedLatch.await(SHUTDOWN_TIMEOUT, SHUTDOWN_TIMEOUT_TIMEUNIT);
                processThreadEndedLatch = null;
            }
            final ConsumerClient consumer = consumerRef.get();
            consumer.sendStopPollingEvent();
            consumer.onShutdown(SHUTDOWN_TIMEOUT, SHUTDOWN_TIMEOUT_TIMEUNIT);
        }
    }
}
Also used: DummyConsumerClient(com.ibm.streamsx.kafka.clients.consumer.DummyConsumerClient) NonCrKafkaConsumerClient(com.ibm.streamsx.kafka.clients.consumer.NonCrKafkaConsumerClient) CrKafkaStaticAssignConsumerClient(com.ibm.streamsx.kafka.clients.consumer.CrKafkaStaticAssignConsumerClient) ConsumerClient(com.ibm.streamsx.kafka.clients.consumer.ConsumerClient)
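
The call to sendStopPollingEvent() points at the event-driven design of the consumer clients: since a KafkaConsumer is not safe for multi-threaded access, other threads post control events that the consumer's own thread handles between polls. The sketch below illustrates that pattern with made-up event names; it is not the project's actual event API.

import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Hypothetical sketch of the event-based control suggested by sendStopPollingEvent():
// other threads never touch the consumer directly; they post events into a queue
// that the consumer thread drains and acts upon between poll() calls.
public class ConsumerEventSketch {

    enum Event { START_POLLING, STOP_POLLING, SHUTDOWN }

    private final BlockingQueue<Event> events = new LinkedBlockingQueue<>();
    private volatile boolean polling = false;

    void sendStopPollingEvent() { events.offer(Event.STOP_POLLING); }

    // runs on the consumer thread; a real implementation would call poll() while polling is true
    void runEventLoop() throws InterruptedException {
        while (true) {
            Event e = events.take();
            switch (e) {
                case START_POLLING: polling = true; break;
                case STOP_POLLING:  polling = false; break;
                case SHUTDOWN:      return;
            }
            System.out.println("handled event " + e + ", polling=" + polling);
        }
    }

    public static void main(String[] args) throws Exception {
        ConsumerEventSketch c = new ConsumerEventSketch();
        Thread loop = new Thread(() -> {
            try { c.runEventLoop(); } catch (InterruptedException ie) { Thread.currentThread().interrupt(); }
        });
        loop.start();
        c.events.offer(Event.START_POLLING);
        c.sendStopPollingEvent();
        c.events.offer(Event.SHUTDOWN);
        loop.join();
    }
}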

Example 5 with ConsumerClient

Use of com.ibm.streamsx.kafka.clients.consumer.ConsumerClient in project streamsx.kafka by IBMStreams.

The class AbstractKafkaConsumerOperator, method process().

@Override
public void process(StreamingInput<Tuple> stream, Tuple tuple) throws Exception {
    synchronized (monitor) {
        logger.info("process >>> ENTRY");
        boolean interrupted = false;
        try {
            final ConsumerClient consumer = consumerRef.get();
            logger.info("current consumer implementation: " + consumer);
            ControlPortAction actn = ControlPortAction.fromJSON(tuple.getString(0));
            final ControlPortActionType action = actn.getActionType();
            if (consumer.supports(actn)) {
                logger.info("consumer implementation supports " + action);
                consumer.onControlPortAction(actn);
            } else {
                if ((consumer instanceof DummyConsumerClient) && (action == ControlPortActionType.ADD_ASSIGNMENT || action == ControlPortActionType.ADD_SUBSCRIPTION)) {
                    logger.info("replacing ConsumerClient by a version that supports " + action);
                    // we can change the client implementation
                    if (consumer.isProcessing()) {
                        consumer.onShutdown(SHUTDOWN_TIMEOUT, SHUTDOWN_TIMEOUT_TIMEUNIT);
                    }
                    final ConsumerClientBuilder builder;
                    if (action == ControlPortActionType.ADD_SUBSCRIPTION) {
                        if (crContext != null) {
                            logger.error("topic subscription via control port is not supported when the operator is used in a consistent region. Ignoring " + actn.getJson());
                            nFailedControlTuples.increment();
                            logger.info("process <<< EXIT");
                            return;
                        }
                        builder = this.groupEnabledClientBuilder;
                    } else {
                        if (this.groupIdSpecified) {
                            logger.warn(MsgFormatter.format("A group.id is specified. The ''{0}'' operator " + "will NOT participate in a consumer group because the operator assigns partitions.", getOperatorContext().getName()));
                        }
                        builder = this.staticAssignClientBuilder;
                    }
                    logger.info("Using client builder: " + builder);
                    final ConsumerClient newClient = builder.build();
                    logger.info(MsgFormatter.format("consumer client {0} created", newClient.getClass().getName()));
                    try {
                        newClient.startConsumer();
                        if (consumerRef.compareAndSet(consumer, newClient)) {
                            logger.info(MsgFormatter.format("consumer client implementation {0} replaced by {1}", consumer.getClass().getName(), newClient.getClass().getName()));
                            newClient.onControlPortAction(actn);
                        } else {
                            logger.warn(MsgFormatter.format("consumer client replacement failed"));
                            newClient.onShutdown(SHUTDOWN_TIMEOUT, SHUTDOWN_TIMEOUT_TIMEUNIT);
                            nFailedControlTuples.increment();
                        }
                    } catch (KafkaClientInitializationException e) {
                        logger.error(e.getLocalizedMessage(), e);
                        logger.error("root cause: " + e.getRootCause());
                        nFailedControlTuples.increment();
                        throw e;
                    }
                } else {
                    // unsupported action
                    logger.error("Could not process control tuple. Action " + action + " is not supported by the '" + consumer.getClass().getName() + "' ConsumerClient implementation. Tuple: '" + tuple + "'");
                    nFailedControlTuples.increment();
                }
            }
        } catch (ControlPortJsonParseException e) {
            logger.error("Could not process control tuple. Parsing JSON '" + e.getJson() + "' failed.");
            logger.error(e.getLocalizedMessage(), e);
            nFailedControlTuples.increment();
        } catch (InterruptedException e) {
            // interrupted during shutdown
            interrupted = true;
            nFailedControlTuples.increment();
        } catch (Exception e) {
            e.printStackTrace();
            logger.error("Could not process control tuple: '" + tuple + "':" + e);
            logger.error(e.getLocalizedMessage(), e);
            nFailedControlTuples.increment();
        } finally {
            final ConsumerClient consumer = consumerRef.get();
            if (!interrupted && consumer.isSubscribedOrAssigned()) {
                logger.info("sendStartPollingEvent ...");
                consumer.sendStartPollingEvent();
            }
            logger.info("process <<< EXIT");
        }
    }
}
Also used: DummyConsumerClient(com.ibm.streamsx.kafka.clients.consumer.DummyConsumerClient) KafkaClientInitializationException(com.ibm.streamsx.kafka.KafkaClientInitializationException) ControlPortJsonParseException(com.ibm.streamsx.kafka.ControlPortJsonParseException) NonCrKafkaConsumerClient(com.ibm.streamsx.kafka.clients.consumer.NonCrKafkaConsumerClient) CrKafkaStaticAssignConsumerClient(com.ibm.streamsx.kafka.clients.consumer.CrKafkaStaticAssignConsumerClient) ConsumerClient(com.ibm.streamsx.kafka.clients.consumer.ConsumerClient) ControlPortActionType(com.ibm.streamsx.kafka.clients.consumer.ControlPortActionType) ControlPortAction(com.ibm.streamsx.kafka.clients.consumer.ControlPortAction) KafkaOperatorResetFailedException(com.ibm.streamsx.kafka.KafkaOperatorResetFailedException) KafkaConfigurationException(com.ibm.streamsx.kafka.KafkaConfigurationException) ConsumerClientBuilder(com.ibm.streamsx.kafka.clients.consumer.ConsumerClientBuilder)
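
When process() upgrades a DummyConsumerClient to a real implementation, it starts the new client first, tries to publish it with compareAndSet, and shuts the new client down again if the swap loses the race. Here is a compact sketch of that start, swap-or-roll-back sequence, using hypothetical client types rather than the real ConsumerClient interface.

import java.util.concurrent.atomic.AtomicReference;

// Hypothetical sketch of the start / compare-and-set / rollback pattern from process():
// build and start the replacement first, then try to publish it atomically; if another
// thread won the race, shut the freshly started replacement down again.
public class ClientSwapSketch {

    interface Client {
        void start();
        void shutdown();
    }

    private final AtomicReference<Client> clientRef = new AtomicReference<>();

    boolean replace(Client expectedCurrent, Client replacement) {
        replacement.start();                       // start before publishing
        if (clientRef.compareAndSet(expectedCurrent, replacement)) {
            return true;                           // replacement is now the active client
        }
        replacement.shutdown();                    // lost the race: roll back the new client
        return false;
    }

    public static void main(String[] args) {
        ClientSwapSketch swap = new ClientSwapSketch();
        Client dummy = new Client() {
            public void start() { }
            public void shutdown() { }
        };
        swap.clientRef.set(dummy);
        Client real = new Client() {
            public void start() { System.out.println("real client started"); }
            public void shutdown() { System.out.println("real client rolled back"); }
        };
        System.out.println("swapped: " + swap.replace(dummy, real));
    }
}

Starting the replacement before the compareAndSet keeps the window in which no usable client is published as small as possible, at the cost of having to roll back if the swap fails.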

Aggregations

ConsumerClient (com.ibm.streamsx.kafka.clients.consumer.ConsumerClient) 10
CrKafkaStaticAssignConsumerClient (com.ibm.streamsx.kafka.clients.consumer.CrKafkaStaticAssignConsumerClient) 10
DummyConsumerClient (com.ibm.streamsx.kafka.clients.consumer.DummyConsumerClient) 10
NonCrKafkaConsumerClient (com.ibm.streamsx.kafka.clients.consumer.NonCrKafkaConsumerClient) 10
Checkpoint (com.ibm.streams.operator.state.Checkpoint) 3
KafkaClientInitializationException (com.ibm.streamsx.kafka.KafkaClientInitializationException) 3
KafkaOperatorResetFailedException (com.ibm.streamsx.kafka.KafkaOperatorResetFailedException) 3
ConsumerClientBuilder (com.ibm.streamsx.kafka.clients.consumer.ConsumerClientBuilder) 3
ControlPortJsonParseException (com.ibm.streamsx.kafka.ControlPortJsonParseException) 2
KafkaConfigurationException (com.ibm.streamsx.kafka.KafkaConfigurationException) 2
OperatorContext (com.ibm.streams.operator.OperatorContext) 1
ProcessingElement (com.ibm.streams.operator.ProcessingElement) 1
StreamSchema (com.ibm.streams.operator.StreamSchema) 1
RString (com.ibm.streams.operator.types.RString) 1
ControlPortAction (com.ibm.streamsx.kafka.clients.consumer.ControlPortAction) 1
ControlPortActionType (com.ibm.streamsx.kafka.clients.consumer.ControlPortActionType) 1
NonCrKafkaConsumerGroupClient (com.ibm.streamsx.kafka.clients.consumer.NonCrKafkaConsumerGroupClient) 1
KafkaOperatorProperties (com.ibm.streamsx.kafka.properties.KafkaOperatorProperties) 1
ObjectInputStream (java.io.ObjectInputStream) 1
CountDownLatch (java.util.concurrent.CountDownLatch) 1