
Example 1 with S3Storage

Use of io.confluent.connect.s3.storage.S3Storage in project kafka-connect-storage-cloud by confluentinc.

In class DataWriterJsonTest, the setUp method:

// @Before should be omitted in order to be able to add properties per test.
public void setUp() throws Exception {
    super.setUp();
    converter = new JsonConverter();
    converter.configure(Collections.singletonMap("schemas.enable", "false"), false);
    s3 = newS3Client(connectorConfig);
    storage = new S3Storage(connectorConfig, url, S3_TEST_BUCKET_NAME, s3);
    partitioner = new DefaultPartitioner<>();
    partitioner.configure(parsedConfig);
    format = new JsonFormat(storage);
    s3.createBucket(S3_TEST_BUCKET_NAME);
    assertTrue(s3.doesBucketExist(S3_TEST_BUCKET_NAME));
}
Also used: JsonFormat (io.confluent.connect.s3.format.json.JsonFormat), JsonConverter (org.apache.kafka.connect.json.JsonConverter), S3Storage (io.confluent.connect.s3.storage.S3Storage)

Example 2 with S3Storage

Use of io.confluent.connect.s3.storage.S3Storage in project kafka-connect-storage-cloud by confluentinc.

In class S3SinkTask, the start method:

public void start(Map<String, String> props) {
    try {
        connectorConfig = new S3SinkConnectorConfig(props);
        url = connectorConfig.getString(StorageCommonConfig.STORE_URL_CONFIG);
        @SuppressWarnings("unchecked") Class<? extends S3Storage> storageClass = (Class<? extends S3Storage>) connectorConfig.getClass(StorageCommonConfig.STORAGE_CLASS_CONFIG);
        storage = StorageFactory.createStorage(storageClass, S3SinkConnectorConfig.class, connectorConfig, url);
        if (!storage.bucketExists()) {
            throw new DataException("Non-existent S3 bucket: " + connectorConfig.getBucketName());
        }
        writerProvider = newFormat().getRecordWriterProvider();
        partitioner = newPartitioner(connectorConfig);
        open(context.assignment());
        log.info("Started S3 connector task with assigned partitions: {}", assignment);
    } catch (ClassNotFoundException | IllegalAccessException | InstantiationException | InvocationTargetException | NoSuchMethodException e) {
        throw new ConnectException("Reflection exception: ", e);
    } catch (AmazonClientException e) {
        throw new ConnectException(e);
    }
}
Also used: AmazonClientException (com.amazonaws.AmazonClientException), InvocationTargetException (java.lang.reflect.InvocationTargetException), S3Storage (io.confluent.connect.s3.storage.S3Storage), DataException (org.apache.kafka.connect.errors.DataException), ConnectException (org.apache.kafka.connect.errors.ConnectException)
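The reflection path taken by StorageFactory.createStorage in the start method above can be sketched in plain Java. StorageFactorySketch, SimpleStorage, and this createStorage are hypothetical stand-ins for illustration, not the Confluent API; the sketch only mirrors the shape of the call — look up a constructor reflectively, invoke it, and wrap the checked reflection exceptions the way S3SinkTask.start does:

```java
import java.lang.reflect.Constructor;
import java.lang.reflect.InvocationTargetException;

public class StorageFactorySketch {

    // Hypothetical stand-in for a storage implementation such as S3Storage.
    public static class SimpleStorage {
        private final String url;

        public SimpleStorage(String url) {
            this.url = url;
        }

        public String url() {
            return url;
        }
    }

    // Look up the single-String constructor of the given class and invoke it,
    // converting the checked reflection exceptions into one unchecked wrapper,
    // analogous to the ConnectException("Reflection exception: ", e) above.
    public static <T> T createStorage(Class<T> storageClass, String url) {
        try {
            Constructor<T> ctor = storageClass.getConstructor(String.class);
            return ctor.newInstance(url);
        } catch (NoSuchMethodException | InstantiationException
                | IllegalAccessException | InvocationTargetException e) {
            throw new RuntimeException("Reflection exception: ", e);
        }
    }

    public static void main(String[] args) {
        SimpleStorage storage = createStorage(SimpleStorage.class, "s3://my-bucket");
        System.out.println(storage.url());
    }
}
```

Resolving the storage class through configuration plus reflection is what lets the connector swap storage backends without compile-time dependencies on any one of them.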

Example 3 with S3Storage

Use of io.confluent.connect.s3.storage.S3Storage in project kafka-connect-storage-cloud by confluentinc.

In class DataWriterByteArrayTest, the setUp method:

// @Before should be omitted in order to be able to add properties per test.
public void setUp() throws Exception {
    super.setUp();
    converter = new ByteArrayConverter();
    s3 = newS3Client(connectorConfig);
    storage = new S3Storage(connectorConfig, url, S3_TEST_BUCKET_NAME, s3);
    partitioner = new DefaultPartitioner<>();
    partitioner.configure(parsedConfig);
    format = new ByteArrayFormat(storage);
    s3.createBucket(S3_TEST_BUCKET_NAME);
    assertTrue(s3.doesBucketExist(S3_TEST_BUCKET_NAME));
}
Also used: ByteArrayConverter (org.apache.kafka.connect.converters.ByteArrayConverter), ByteArrayFormat (io.confluent.connect.s3.format.bytearray.ByteArrayFormat), S3Storage (io.confluent.connect.s3.storage.S3Storage)

Example 4 with S3Storage

Use of io.confluent.connect.s3.storage.S3Storage in project kafka-connect-storage-cloud by confluentinc.

In class S3ProxyTest, the setUp method:

@Before
public void setUp() throws Exception {
    super.setUp();
    storage = new S3Storage(connectorConfig, url, S3_TEST_BUCKET_NAME, null);
}
Also used: S3Storage (io.confluent.connect.s3.storage.S3Storage), Before (org.junit.Before)
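Note that S3ProxyTest passes null for the client, which suggests the storage constructs its own S3 client (here, one honoring proxy settings) when none is supplied. That lazy-default pattern can be sketched generically; LazyClientStorage and its Client interface are illustrative stand-ins, not the actual S3Storage API:

```java
public class LazyClientStorage {

    // Minimal stand-in for an S3-style client.
    public interface Client {
        String endpoint();
    }

    private final String url;
    private final Client client;

    // If no client is supplied, fall back to one built from the URL,
    // analogous to a storage layer constructing its own S3 client.
    public LazyClientStorage(String url, Client client) {
        this.url = url;
        this.client = (client != null) ? client : defaultClient(url);
    }

    private static Client defaultClient(String url) {
        return () -> url;
    }

    public Client client() {
        return client;
    }

    public static void main(String[] args) {
        LazyClientStorage storage = new LazyClientStorage("http://localhost:8080", null);
        System.out.println(storage.client().endpoint());
    }
}
```

Accepting a nullable client keeps production code on the default path while tests can inject a pre-built or mocked client, as the other setUp methods in this listing do.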

Example 5 with S3Storage

Use of io.confluent.connect.s3.storage.S3Storage in project kafka-connect-storage-cloud by confluentinc.

In class DataWriterAvroTest, the setUp method:

// @Before should be omitted in order to be able to add properties per test.
public void setUp() throws Exception {
    super.setUp();
    s3 = PowerMockito.spy(newS3Client(connectorConfig));
    storage = new S3Storage(connectorConfig, url, S3_TEST_BUCKET_NAME, s3);
    partitioner = new DefaultPartitioner<>();
    partitioner.configure(parsedConfig);
    format = new AvroFormat(storage);
    s3.createBucket(S3_TEST_BUCKET_NAME);
    assertTrue(s3.doesBucketExist(S3_TEST_BUCKET_NAME));
}
Also used: AvroFormat (io.confluent.connect.s3.format.avro.AvroFormat), S3Storage (io.confluent.connect.s3.storage.S3Storage)
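The three DataWriter test fixtures above all end with the same create-bucket-then-verify sequence before any records are written. That shared shape can be shown with a self-contained in-memory sketch; FakeS3Client is an illustrative stand-in for the AWS client, not the real SDK:

```java
import java.util.HashSet;
import java.util.Set;

public class BucketSetupSketch {

    // Illustrative in-memory stand-in for the S3 client used in the tests above.
    public static class FakeS3Client {
        private final Set<String> buckets = new HashSet<>();

        public void createBucket(String name) {
            buckets.add(name);
        }

        public boolean doesBucketExist(String name) {
            return buckets.contains(name);
        }
    }

    public static void main(String[] args) {
        FakeS3Client s3 = new FakeS3Client();
        String bucket = "kafka-connect-s3-test";
        // Same shape as the setUp methods: create the bucket, then verify it
        // exists so every test starts from a known storage state.
        s3.createBucket(bucket);
        if (!s3.doesBucketExist(bucket)) {
            throw new AssertionError("bucket was not created");
        }
        System.out.println("bucket ready: " + bucket);
    }
}
```

Verifying the bucket inside setUp fails fast with a clear assertion rather than letting a missing bucket surface later as an opaque write error mid-test.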

Aggregations

S3Storage (io.confluent.connect.s3.storage.S3Storage): 6
AvroFormat (io.confluent.connect.s3.format.avro.AvroFormat): 2
AmazonClientException (com.amazonaws.AmazonClientException): 1
ByteArrayFormat (io.confluent.connect.s3.format.bytearray.ByteArrayFormat): 1
JsonFormat (io.confluent.connect.s3.format.json.JsonFormat): 1
InvocationTargetException (java.lang.reflect.InvocationTargetException): 1
ByteArrayConverter (org.apache.kafka.connect.converters.ByteArrayConverter): 1
ConnectException (org.apache.kafka.connect.errors.ConnectException): 1
DataException (org.apache.kafka.connect.errors.DataException): 1
JsonConverter (org.apache.kafka.connect.json.JsonConverter): 1
Before (org.junit.Before): 1