
Example 41 with ConnectException

Use of org.apache.kafka.connect.errors.ConnectException in project ignite by apache.

The class IgniteSinkConnector, method start.

/**
 * A sink lifecycle method. Validates grid-specific sink properties.
 *
 * @param props Sink properties.
 */
@Override
public void start(Map<String, String> props) {
    configProps = props;
    try {
        A.notNullOrEmpty(configProps.get(SinkConnector.TOPICS_CONFIG), "topics");
        A.notNullOrEmpty(configProps.get(IgniteSinkConstants.CACHE_NAME), "cache name");
        A.notNullOrEmpty(configProps.get(IgniteSinkConstants.CACHE_CFG_PATH), "path to cache config file");
    } catch (IllegalArgumentException e) {
        throw new ConnectException("Cannot start IgniteSinkConnector due to configuration error", e);
    }
}
Also used : ConnectException(org.apache.kafka.connect.errors.ConnectException)
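The connector above converts configuration-validation failures into a `ConnectException`, which the Connect framework treats as a fatal startup error. The pattern can be sketched without the Ignite or Kafka dependencies; in this sketch `ConnectException` is a local stand-in for `org.apache.kafka.connect.errors.ConnectException`, and `notNullOrEmpty` mimics Ignite's `A.notNullOrEmpty` assertion helper:

```java
import java.util.Map;

// Minimal sketch of the validate-and-wrap pattern: required properties are
// checked eagerly in start(), and any IllegalArgumentException is rethrown
// as a single ConnectException carrying the original cause.
public class SinkConfigCheck {
    // Local stand-in for org.apache.kafka.connect.errors.ConnectException.
    static class ConnectException extends RuntimeException {
        ConnectException(String msg, Throwable cause) { super(msg, cause); }
    }

    // Stand-in for Ignite's A.notNullOrEmpty(value, name) assertion.
    static void notNullOrEmpty(String val, String name) {
        if (val == null || val.isEmpty())
            throw new IllegalArgumentException("Argument is invalid: " + name);
    }

    static void start(Map<String, String> props) {
        try {
            notNullOrEmpty(props.get("topics"), "topics");
            notNullOrEmpty(props.get("cacheName"), "cache name");
        } catch (IllegalArgumentException e) {
            // One wrapping exception makes every config error fatal and uniform.
            throw new ConnectException("Cannot start connector due to configuration error", e);
        }
    }

    public static void main(String[] args) {
        try {
            start(Map.of("topics", "t1")); // cacheName is missing
            System.out.println("no exception");
        } catch (ConnectException e) {
            System.out.println("wrapped: " + e.getCause().getClass().getSimpleName());
        }
    }
}
```

Wrapping rather than rethrowing the `IllegalArgumentException` directly keeps the error channel uniform: callers only need to handle one exception type, and the cause chain preserves the specific validation failure.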

Example 42 with ConnectException

Use of org.apache.kafka.connect.errors.ConnectException in project ignite by apache.

The class IgniteSinkTask, method start.

/**
 * Initializes grid client from configPath.
 *
 * @param props Task properties.
 */
@Override
public void start(Map<String, String> props) {
    // Each task has the same parameters -- avoid setting more than once.
    if (cacheName != null)
        return;
    cacheName = props.get(IgniteSinkConstants.CACHE_NAME);
    igniteConfigFile = props.get(IgniteSinkConstants.CACHE_CFG_PATH);
    if (props.containsKey(IgniteSinkConstants.CACHE_ALLOW_OVERWRITE))
        StreamerContext.getStreamer().allowOverwrite(Boolean.parseBoolean(props.get(IgniteSinkConstants.CACHE_ALLOW_OVERWRITE)));
    if (props.containsKey(IgniteSinkConstants.CACHE_PER_NODE_DATA_SIZE))
        StreamerContext.getStreamer().perNodeBufferSize(Integer.parseInt(props.get(IgniteSinkConstants.CACHE_PER_NODE_DATA_SIZE)));
    if (props.containsKey(IgniteSinkConstants.CACHE_PER_NODE_PAR_OPS))
        StreamerContext.getStreamer().perNodeParallelOperations(Integer.parseInt(props.get(IgniteSinkConstants.CACHE_PER_NODE_PAR_OPS)));
    if (props.containsKey(IgniteSinkConstants.SINGLE_TUPLE_EXTRACTOR_CLASS)) {
        String transformerCls = props.get(IgniteSinkConstants.SINGLE_TUPLE_EXTRACTOR_CLASS);
        if (transformerCls != null && !transformerCls.isEmpty()) {
            try {
                Class<? extends StreamSingleTupleExtractor> clazz = (Class<? extends StreamSingleTupleExtractor<SinkRecord, Object, Object>>) Class.forName(transformerCls);
                extractor = clazz.newInstance();
            } catch (Exception e) {
                throw new ConnectException("Failed to instantiate the provided transformer!", e);
            }
        }
    }
    stopped = false;
}
Also used : StreamSingleTupleExtractor(org.apache.ignite.stream.StreamSingleTupleExtractor) ConnectException(org.apache.kafka.connect.errors.ConnectException)
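The task's extractor setup is the classic reflective-instantiation pattern: resolve a class by name, instantiate it, and surface any failure as one `ConnectException`. A standalone sketch of that step, with the `Class.newInstance()` call (deprecated since Java 9) replaced by `getDeclaredConstructor().newInstance()`; `Extractor` and `UpperCaseExtractor` are local stand-ins invented for this sketch, playing the role of `StreamSingleTupleExtractor` and the user-supplied transformer class:

```java
// Minimal sketch of reflective extractor loading with exception wrapping.
public class ExtractorLoader {
    // Stand-in for org.apache.ignite.stream.StreamSingleTupleExtractor.
    public interface Extractor { Object extract(Object msg); }

    // Stand-in for a user-supplied transformer class named in the config.
    public static class UpperCaseExtractor implements Extractor {
        @Override public Object extract(Object msg) { return msg.toString().toUpperCase(); }
    }

    static Extractor load(String clsName) {
        try {
            // asSubclass() gives a checked cast instead of an unchecked one.
            Class<? extends Extractor> clazz =
                Class.forName(clsName).asSubclass(Extractor.class);
            return clazz.getDeclaredConstructor().newInstance();
        } catch (Exception e) {
            // Mirror the task's behavior: any reflection failure (missing class,
            // wrong type, no no-arg constructor) becomes one runtime exception,
            // a ConnectException in the original code.
            throw new RuntimeException("Failed to instantiate the provided transformer!", e);
        }
    }

    public static void main(String[] args) {
        Extractor ex = load("ExtractorLoader$UpperCaseExtractor");
        System.out.println(ex.extract("hello")); // HELLO
    }
}
```

Catching the broad `Exception` and rewrapping is deliberate here: `Class.forName`, `asSubclass`, and constructor invocation each throw different checked and unchecked exceptions, and the task only needs to report that the configured transformer could not be used.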

Aggregations

ConnectException (org.apache.kafka.connect.errors.ConnectException): 42
HashMap (java.util.HashMap): 7
Map (java.util.Map): 7
ArrayList (java.util.ArrayList): 6
TimeoutException (java.util.concurrent.TimeoutException): 6
IOException (java.io.IOException): 5
Connector (org.apache.kafka.connect.connector.Connector): 5
ExecutionException (java.util.concurrent.ExecutionException): 4
NotFoundException (org.apache.kafka.connect.errors.NotFoundException): 4
ConnectorTaskId (org.apache.kafka.connect.util.ConnectorTaskId): 4
Test (org.junit.Test): 4
PrepareForTest (org.powermock.core.classloader.annotations.PrepareForTest): 4
ByteBuffer (java.nio.ByteBuffer): 3
AlreadyExistsException (org.apache.kafka.connect.errors.AlreadyExistsException): 3
BadRequestException (org.apache.kafka.connect.runtime.rest.errors.BadRequestException): 3
SinkRecord (org.apache.kafka.connect.sink.SinkRecord): 3
SourceRecord (org.apache.kafka.connect.source.SourceRecord): 3
ThreadedTest (org.apache.kafka.connect.util.ThreadedTest): 3
BufferedReader (java.io.BufferedReader): 2
FileInputStream (java.io.FileInputStream): 2