Example 81 with SourceRecord

Use of org.apache.kafka.connect.source.SourceRecord in project kafka by apache.

The class FileStreamSourceTaskTest, method testBatchSize.

@Test
public void testBatchSize() throws IOException, InterruptedException {
    expectOffsetLookupReturnNone();
    replay();
    // cap each poll() at 5,000 records
    config.put(FileStreamSourceConnector.TASK_BATCH_SIZE_CONFIG, "5000");
    task.start(config);
    OutputStream os = Files.newOutputStream(tempFile.toPath());
    // write 10,000 lines, enough for exactly two full batches
    writeTimesAndFlush(os, 10_000, "Neque porro quisquam est qui dolorem ipsum quia dolor sit amet, consectetur, adipisci velit...\n".getBytes());
    assertEquals(2, task.bufferSize());
    List<SourceRecord> records = task.poll();
    assertEquals(5000, records.size());
    // the internal read buffer has grown while consuming the batch
    assertEquals(128, task.bufferSize());
    records = task.poll();
    assertEquals(5000, records.size());
    assertEquals(128, task.bufferSize());
    os.close();
    task.stop();
}
Also used: OutputStream(java.io.OutputStream) SourceRecord(org.apache.kafka.connect.source.SourceRecord) Test(org.junit.jupiter.api.Test)
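
TASK_BATCH_SIZE_CONFIG puts an upper bound on how many records a single poll() may return, which is why the 10,000 buffered lines arrive as two batches of exactly 5,000; the bufferSize() assertions track the task's internal read buffer growing as lines are consumed.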

Example 82 with SourceRecord

Use of org.apache.kafka.connect.source.SourceRecord in project kafka by apache.

The class MirrorHeartbeatTask, method poll.

@Override
public List<SourceRecord> poll() throws InterruptedException {
    // pause to throttle, unless we've stopped
    if (stopped.await(interval.toMillis(), TimeUnit.MILLISECONDS)) {
        // WorkerSourceTask expects either a non-empty batch or null
        return null;
    }
    long timestamp = System.currentTimeMillis();
    Heartbeat heartbeat = new Heartbeat(sourceClusterAlias, targetClusterAlias, timestamp);
    SourceRecord record = new SourceRecord(
            heartbeat.connectPartition(), MirrorUtils.wrapOffset(0),
            heartbeatsTopic, 0,
            Schema.BYTES_SCHEMA, heartbeat.recordKey(),
            Schema.BYTES_SCHEMA, heartbeat.recordValue(),
            timestamp);
    return Collections.singletonList(record);
}
Also used: SourceRecord(org.apache.kafka.connect.source.SourceRecord)
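
The nine-argument overload used above takes, in order: source partition and offset maps (which Connect persists so the task can resume), target topic and partition, key schema and key, value schema and value, and a timestamp. A minimal standalone sketch, with hypothetical partition and offset maps standing in for Heartbeat.connectPartition() and MirrorUtils.wrapOffset(0):

import java.util.Collections;
import java.util.Map;
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.source.SourceRecord;

public class HeartbeatRecordSketch {
    public static SourceRecord build() {
        // hypothetical stand-ins for Heartbeat.connectPartition() and MirrorUtils.wrapOffset(0)
        Map<String, String> sourcePartition = Collections.singletonMap("sourceClusterAlias", "primary");
        Map<String, Long> sourceOffset = Collections.singletonMap("offset", 0L);
        return new SourceRecord(
                sourcePartition, sourceOffset,    // persisted by Connect for offset tracking
                "heartbeats", 0,                  // target topic and partition
                Schema.BYTES_SCHEMA, new byte[0], // key schema and key
                Schema.BYTES_SCHEMA, new byte[0], // value schema and value
                System.currentTimeMillis());      // record timestamp
    }
}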

Example 83 with SourceRecord

Use of org.apache.kafka.connect.source.SourceRecord in project kafka by apache.

The class WorkerSourceTaskTest, method testSendRecordsPropagatesTimestamp.

@Test
public void testSendRecordsPropagatesTimestamp() throws Exception {
    final Long timestamp = System.currentTimeMillis();
    createWorkerTask();
    List<SourceRecord> records = Collections.singletonList(
            new SourceRecord(PARTITION, OFFSET, "topic", null, KEY_SCHEMA, KEY, RECORD_SCHEMA, RECORD, timestamp));
    Capture<ProducerRecord<byte[], byte[]>> sent = expectSendRecordAnyTimes();
    expectTopicCreation(TOPIC);
    PowerMock.replayAll();
    Whitebox.setInternalState(workerTask, "toSend", records);
    Whitebox.invokeMethod(workerTask, "sendRecords");
    // the timestamp set on the SourceRecord must survive conversion to the ProducerRecord
    assertEquals(timestamp, sent.getValue().timestamp());
    PowerMock.verifyAll();
}
Also used: ProducerRecord(org.apache.kafka.clients.producer.ProducerRecord) SourceRecord(org.apache.kafka.connect.source.SourceRecord) ThreadedTest(org.apache.kafka.connect.util.ThreadedTest) RetryWithToleranceOperatorTest(org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperatorTest) ParameterizedTest(org.apache.kafka.connect.util.ParameterizedTest) Test(org.junit.Test)
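
The EasyMock Capture returned by the expectSendRecordAnyTimes() helper grabs the ProducerRecord that the worker hands to the Kafka producer, so the final assertion verifies end-to-end timestamp propagation rather than inspecting internal state.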

Example 84 with SourceRecord

Use of org.apache.kafka.connect.source.SourceRecord in project kafka by apache.

The class WorkerSourceTaskTest, method testTopicCreateWhenTopicExists.

@Test
public void testTopicCreateWhenTopicExists() throws Exception {
    if (!enableTopicCreation) {
        // should only test with topic creation enabled
        return;
    }
    createWorkerTask();
    SourceRecord record1 = new SourceRecord(PARTITION, OFFSET, TOPIC, 1, KEY_SCHEMA, KEY, RECORD_SCHEMA, RECORD);
    SourceRecord record2 = new SourceRecord(PARTITION, OFFSET, TOPIC, 2, KEY_SCHEMA, KEY, RECORD_SCHEMA, RECORD);
    expectPreliminaryCalls();
    // report the topic as already existing so no create call is attempted
    TopicPartitionInfo topicPartitionInfo = new TopicPartitionInfo(0, null, Collections.emptyList(), Collections.emptyList());
    TopicDescription topicDesc = new TopicDescription(TOPIC, false, Collections.singletonList(topicPartitionInfo));
    EasyMock.expect(admin.describeTopics(TOPIC)).andReturn(Collections.singletonMap(TOPIC, topicDesc));
    expectSendRecordTaskCommitRecordSucceed(false);
    expectSendRecordTaskCommitRecordSucceed(false);
    PowerMock.replayAll();
    Whitebox.setInternalState(workerTask, "toSend", Arrays.asList(record1, record2));
    Whitebox.invokeMethod(workerTask, "sendRecords");
}
Also used: TopicPartitionInfo(org.apache.kafka.common.TopicPartitionInfo) TopicDescription(org.apache.kafka.clients.admin.TopicDescription) SourceRecord(org.apache.kafka.connect.source.SourceRecord) ThreadedTest(org.apache.kafka.connect.util.ThreadedTest) RetryWithToleranceOperatorTest(org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperatorTest) ParameterizedTest(org.apache.kafka.connect.util.ParameterizedTest) Test(org.junit.Test)
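
Because the mocked admin.describeTopics(TOPIC) reports the topic as already present, the worker sends both records without attempting to create the topic; the early return on enableTopicCreation keeps this parameterized test meaningful only in the topic-creation configuration.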

Example 85 with SourceRecord

Use of org.apache.kafka.connect.source.SourceRecord in project kafka by apache.

The class CastTest, method castNullKeyRecordSchemaless.

@Test
public void castNullKeyRecordSchemaless() {
    xformKey.configure(Collections.singletonMap(Cast.SPEC_CONFIG, "foo:int64"));
    // a null schemaless key has no "foo" field to cast, so the record passes through unchanged
    SourceRecord original = new SourceRecord(null, null, "topic", 0, null, null, Schema.STRING_SCHEMA, "value");
    SourceRecord transformed = xformKey.apply(original);
    assertEquals(original, transformed);
}
Also used: SourceRecord(org.apache.kafka.connect.source.SourceRecord) Test(org.junit.jupiter.api.Test)
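
For contrast, a hypothetical companion case under the same xformKey fixture: when the schemaless key is a map that actually contains the "foo" field, the Cast transformation rewrites that field to an int64.

xformKey.configure(Collections.singletonMap(Cast.SPEC_CONFIG, "foo:int64"));
SourceRecord original = new SourceRecord(null, null, "topic", 0,
        null, Collections.singletonMap("foo", 42),
        Schema.STRING_SCHEMA, "value");
SourceRecord transformed = xformKey.apply(original);
// the Integer 42 has been cast to the Long 42L
assertEquals(42L, ((Map<?, ?>) transformed.key()).get("foo"));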

Aggregations

SourceRecord (org.apache.kafka.connect.source.SourceRecord) 308
Test (org.junit.Test) 148
Test (org.junit.jupiter.api.Test) 98
Struct (org.apache.kafka.connect.data.Struct) 68
HashMap (java.util.HashMap) 60
Schema (org.apache.kafka.connect.data.Schema) 45
ThreadedTest (org.apache.kafka.connect.util.ThreadedTest) 27
ParameterizedTest (org.apache.kafka.connect.util.ParameterizedTest) 23
ArrayList (java.util.ArrayList) 22
RetryWithToleranceOperatorTest (org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperatorTest) 21
Map (java.util.Map) 15
SchemaBuilder (org.apache.kafka.connect.data.SchemaBuilder) 13
ConnectException (org.apache.kafka.connect.errors.ConnectException) 13
Document (org.bson.Document) 13
FixFor (io.debezium.doc.FixFor) 12
List (java.util.List) 12
RecordsForCollection (io.debezium.connector.mongodb.RecordMakers.RecordsForCollection) 11
ProducerRecord (org.apache.kafka.clients.producer.ProducerRecord) 11
ConnectHeaders (org.apache.kafka.connect.header.ConnectHeaders) 11
BsonTimestamp (org.bson.BsonTimestamp) 11