Example 26 with FileSourceSplit

Use of org.apache.flink.connector.file.src.FileSourceSplit in project flink by apache.

The class AdapterTestBase, method buildSplits:

static Queue<FileSourceSplit> buildSplits(int numSplits) {
    final Queue<FileSourceSplit> splits = new ArrayDeque<>();
    final long rangeForSplit = FILE_LEN / numSplits;
    // the first numSplits - 1 splits each cover an equal range of the file
    for (int i = 0; i < numSplits - 1; i++) {
        splits.add(new FileSourceSplit("ID-" + i, testPath, i * rangeForSplit, rangeForSplit, 0, FILE_LEN));
    }
    // the last split absorbs the remainder of the integer division,
    // so together the splits cover the file exactly
    final long startOfLast = (numSplits - 1) * rangeForSplit;
    splits.add(new FileSourceSplit("ID-" + (numSplits - 1), testPath, startOfLast, FILE_LEN - startOfLast, 0, FILE_LEN));
    return splits;
}
Also used: FileSourceSplit (org.apache.flink.connector.file.src.FileSourceSplit), ArrayDeque (java.util.ArrayDeque)
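The arithmetic above is worth spelling out: the first numSplits - 1 splits each cover FILE_LEN / numSplits bytes, and the last split absorbs the remainder of the integer division, so the splits tile the file with no gaps and no overlap. A minimal standalone sketch of that arithmetic (plain Java, no Flink dependency; the SplitRanges class name and the 1000-byte file length are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

// Standalone sketch of the split arithmetic used in buildSplits:
// offset/length pairs that tile a file of fileLen bytes into numSplits pieces.
public class SplitRanges {

    // Each long[] is {offset, length}.
    static List<long[]> buildRanges(long fileLen, int numSplits) {
        final List<long[]> ranges = new ArrayList<>();
        final long rangeForSplit = fileLen / numSplits;
        // first numSplits - 1 splits cover equal ranges
        for (int i = 0; i < numSplits - 1; i++) {
            ranges.add(new long[] {i * rangeForSplit, rangeForSplit});
        }
        // the last split absorbs the integer-division remainder
        final long startOfLast = (numSplits - 1) * rangeForSplit;
        ranges.add(new long[] {startOfLast, fileLen - startOfLast});
        return ranges;
    }

    public static void main(String[] args) {
        // 1000 bytes into 7 splits: six splits of 142 bytes, a last one of 148
        long covered = 0;
        for (long[] r : buildRanges(1000, 7)) {
            if (r[0] != covered) throw new AssertionError("gap before offset " + r[0]);
            covered += r[1];
        }
        if (covered != 1000) throw new AssertionError("splits do not cover the file");
        System.out.println("7 splits cover all 1000 bytes");
    }
}
```

Note that the remainder always lands in the final split, which is why the test adds it separately after the loop.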

Example 27 with FileSourceSplit

Use of org.apache.flink.connector.file.src.FileSourceSplit in project flink by apache.

The class FileSourceReaderTest, method testRequestSplitWhenNoSplitRestored:

@Test
public void testRequestSplitWhenNoSplitRestored() throws Exception {
    final TestingReaderContext context = new TestingReaderContext();
    final FileSourceReader<String, FileSourceSplit> reader = createReader(context);
    // a reader that starts with no restored splits must request one from the enumerator
    reader.start();
    reader.close();
    assertEquals(1, context.getNumSplitRequests());
}
Also used: FileSourceSplit (org.apache.flink.connector.file.src.FileSourceSplit), TestingReaderContext (org.apache.flink.connector.testutils.source.reader.TestingReaderContext), Test (org.junit.Test)

Example 28 with FileSourceSplit

Use of org.apache.flink.connector.file.src.FileSourceSplit in project flink by apache.

The class FileSourceReaderTest, method testNoSplitRequestWhenSplitRestored:

@Test
public void testNoSplitRequestWhenSplitRestored() throws Exception {
    final TestingReaderContext context = new TestingReaderContext();
    final FileSourceReader<String, FileSourceSplit> reader = createReader(context);
    // a reader restored with a split already has work, so it must not request another
    reader.addSplits(Collections.singletonList(createTestFileSplit()));
    reader.start();
    reader.close();
    assertEquals(0, context.getNumSplitRequests());
}
Also used: FileSourceSplit (org.apache.flink.connector.file.src.FileSourceSplit), TestingReaderContext (org.apache.flink.connector.testutils.source.reader.TestingReaderContext), Test (org.junit.Test)
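Taken together, Examples 27 and 28 pin down a contract of FileSourceReader: on start() it sends exactly one split request if and only if it holds no restored splits. A toy model of that contract (plain Java; the Context and Reader classes here are illustrative stand-ins for TestingReaderContext and FileSourceReader, not Flink API):

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Toy model of the split-request contract tested above: a reader asks the
// enumerator for work on start() only when it was restored with no splits.
public class SplitRequestModel {

    // Stand-in for TestingReaderContext: just counts split requests.
    static class Context {
        int numSplitRequests = 0;
        void sendSplitRequest() { numSplitRequests++; }
    }

    static class Reader {
        private final Context context;
        private final Deque<String> splits = new ArrayDeque<>();

        Reader(Context context) { this.context = context; }

        void addSplits(String... restored) {
            for (String s : restored) splits.add(s);
        }

        void start() {
            // request work only if nothing was restored from a checkpoint
            if (splits.isEmpty()) {
                context.sendSplitRequest();
            }
        }
    }

    public static void main(String[] args) {
        Context fresh = new Context();
        new Reader(fresh).start();     // no restored splits -> one request

        Context restored = new Context();
        Reader reader = new Reader(restored);
        reader.addSplits("split-0");
        reader.start();                // restored split -> no request

        System.out.println(fresh.numSplitRequests + " " + restored.numSplitRequests);
    }
}
```

The two tests above are exactly the two branches of this conditional, which is why they are asserted as a pair.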

Example 29 with FileSourceSplit

Use of org.apache.flink.connector.file.src.FileSourceSplit in project flink by apache.

The class StreamFormatAdapterTest, method simpleReadTest:

private void simpleReadTest(int batchSize) throws IOException {
    final Configuration config = new Configuration();
    // FETCH_IO_SIZE bounds the bytes fetched per batch; the decoded records must not depend on it
    config.set(StreamFormat.FETCH_IO_SIZE, new MemorySize(batchSize));
    final StreamFormatAdapter<Integer> format = new StreamFormatAdapter<>(new CheckpointedIntFormat());
    final BulkFormat.Reader<Integer> reader = format.createReader(config, new FileSourceSplit("test-id", testPath, 0L, FILE_LEN, 0L, FILE_LEN));
    final List<Integer> result = new ArrayList<>();
    readNumbers(reader, result, NUM_NUMBERS);
    verifyIntListResult(result);
}
Also used: MemorySize (org.apache.flink.configuration.MemorySize), Configuration (org.apache.flink.configuration.Configuration), FileSourceSplit (org.apache.flink.connector.file.src.FileSourceSplit), ArrayList (java.util.ArrayList), BulkFormat (org.apache.flink.connector.file.src.reader.BulkFormat)
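The point of parameterizing simpleReadTest over batchSize is that StreamFormat.FETCH_IO_SIZE only changes how many bytes the adapter pulls per fetch, never which records come out. A self-contained sketch of that invariant (plain Java over a byte array; BatchedReads and readAllInBatches are illustrative helpers, not Flink API, and batchSize is assumed to be at least 4):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: decoding 4-byte big-endian ints from a buffer in fixed-size I/O
// batches. The batch size changes how bytes are chunked, not the records.
public class BatchedReads {

    static List<Integer> readAllInBatches(byte[] data, int batchSize) {
        final List<Integer> result = new ArrayList<>();
        int pos = 0;
        while (pos < data.length) {
            // fetch at most batchSize bytes, aligned down to a 4-byte
            // boundary so no record is ever split across batches
            int fetch = Math.min(batchSize, data.length - pos);
            fetch -= fetch % 4;
            for (int i = 0; i < fetch; i += 4) {
                int base = pos + i;
                result.add((data[base] & 0xFF) << 24
                        | (data[base + 1] & 0xFF) << 16
                        | (data[base + 2] & 0xFF) << 8
                        | (data[base + 3] & 0xFF));
            }
            pos += fetch;
        }
        return result;
    }

    public static void main(String[] args) {
        byte[] data = new byte[40];
        for (int n = 0; n < 10; n++) {   // big-endian ints 0..9
            data[4 * n + 3] = (byte) n;
        }
        List<Integer> small = readAllInBatches(data, 4);     // tiny fetches
        List<Integer> large = readAllInBatches(data, 1024);  // one big fetch
        System.out.println(small.equals(large) && small.size() == 10);
    }
}
```

Running the same decode with a 4-byte and a 1024-byte fetch size yields identical record lists, which is the property the parameterized test checks against CheckpointedIntFormat.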

Example 30 with FileSourceSplit

Use of org.apache.flink.connector.file.src.FileSourceSplit in project flink by apache.

The class LimitableBulkFormatTest, method testSwallowExceptionWhenLimited:

@Test
public void testSwallowExceptionWhenLimited() throws IOException {
    long limit = 1000L;
    LimitableBulkFormat<String, FileSourceSplit> format =
            (LimitableBulkFormat<String, FileSourceSplit>)
                    LimitableBulkFormat.create(new StreamFormatAdapter<>(new FailedFormat()), limit);
    BulkFormat.Reader<String> reader = format.createReader(new Configuration(), new FileSourceSplit("id", new Path(file.toURI()), 0, file.length()));
    format.globalNumberRead().set(limit + 1);
    // the limit is already exceeded, so readBatch() must report end-of-input
    // instead of invoking the failing format (the exception is swallowed)
    reader.readBatch();
}
Also used: Path (org.apache.flink.core.fs.Path), FileSourceSplit (org.apache.flink.connector.file.src.FileSourceSplit), Configuration (org.apache.flink.configuration.Configuration), StreamFormatAdapter (org.apache.flink.connector.file.src.impl.StreamFormatAdapter), BulkFormat (org.apache.flink.connector.file.src.reader.BulkFormat), Test (org.junit.Test)
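What Example 30 asserts is that LimitableBulkFormat consults its shared record counter before touching the wrapped reader: once the limit is reached, readBatch() reports end-of-input instead of delegating, so a failing underlying format is never invoked and its exception is effectively swallowed. A minimal sketch of that guard (plain Java; LimitedReader and its nested Reader interface are illustrative, not Flink's actual classes, and here the counter tracks batches rather than records for brevity):

```java
import java.util.concurrent.atomic.AtomicLong;

// Sketch of the guard in LimitableBulkFormat: check the shared counter
// before delegating, so a broken inner reader is never called past the limit.
public class LimitedReader {

    interface Reader {
        String readBatch();   // returns null at end of input
    }

    private final Reader inner;
    private final long limit;
    private final AtomicLong globalNumberRead;

    LimitedReader(Reader inner, long limit, AtomicLong counter) {
        this.inner = inner;
        this.limit = limit;
        this.globalNumberRead = counter;
    }

    String readBatch() {
        // once the limit is reached, report end-of-input without delegating;
        // any exception the inner reader would throw is thereby "swallowed"
        if (globalNumberRead.get() >= limit) {
            return null;
        }
        globalNumberRead.incrementAndGet();
        return inner.readBatch();
    }

    public static void main(String[] args) {
        Reader failing = () -> { throw new RuntimeException("corrupt file"); };
        AtomicLong counter = new AtomicLong();
        LimitedReader reader = new LimitedReader(failing, 1000L, counter);
        counter.set(1001L);          // limit already exceeded globally
        System.out.println(reader.readBatch());   // end-of-input, no exception
    }
}
```

Because the counter is shared across all readers of the format (globalNumberRead() in the real class), one reader hitting the limit silences every other reader's source of errors as well.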

Aggregations

FileSourceSplit (org.apache.flink.connector.file.src.FileSourceSplit): 50
Test (org.junit.Test): 32
Path (org.apache.flink.core.fs.Path): 20
AtomicInteger (java.util.concurrent.atomic.AtomicInteger): 11
BulkFormat (org.apache.flink.connector.file.src.reader.BulkFormat): 11
Configuration (org.apache.flink.configuration.Configuration): 10
ArrayList (java.util.ArrayList): 9
TestingSplitEnumeratorContext (org.apache.flink.connector.testutils.source.reader.TestingSplitEnumeratorContext): 7
IOException (java.io.IOException): 6
RowData (org.apache.flink.table.data.RowData): 6
LogicalType (org.apache.flink.table.types.logical.LogicalType): 6
LinkedHashMap (java.util.LinkedHashMap): 5
TestingFileSystem (org.apache.flink.connector.file.src.testutils.TestingFileSystem): 5
FileStatus (org.apache.flink.core.fs.FileStatus): 5
AtomicLong (java.util.concurrent.atomic.AtomicLong): 4
BigIntType (org.apache.flink.table.types.logical.BigIntType): 4
DoubleType (org.apache.flink.table.types.logical.DoubleType): 4
IntType (org.apache.flink.table.types.logical.IntType): 4
SmallIntType (org.apache.flink.table.types.logical.SmallIntType): 4
TinyIntType (org.apache.flink.table.types.logical.TinyIntType): 4