
Example 31 with DynamicTableSource

Use of org.apache.flink.table.connector.source.DynamicTableSource in project flink by apache.

From the class PushFilterIntoSourceScanRuleBase, the method resolveFiltersAndCreateTableSourceTable.

/**
 * Resolves filters using the underlying source's {@link SupportsFilterPushDown} and creates a
 * new {@link TableSourceTable} with the supplied predicates.
 *
 * @param convertiblePredicates Predicates to resolve
 * @param oldTableSourceTable TableSourceTable to copy
 * @param scan Underlying table scan to push to
 * @param relBuilder Builder to push the scan to
 * @return A tuple consisting of the resolved filters and the newly created {@link
 *     TableSourceTable}
 */
protected Tuple2<SupportsFilterPushDown.Result, TableSourceTable> resolveFiltersAndCreateTableSourceTable(RexNode[] convertiblePredicates, TableSourceTable oldTableSourceTable, TableScan scan, RelBuilder relBuilder) {
    // record size before applyFilters for update statistics
    int originPredicatesSize = convertiblePredicates.length;
    // update DynamicTableSource
    DynamicTableSource newTableSource = oldTableSourceTable.tableSource().copy();
    SupportsFilterPushDown.Result result = FilterPushDownSpec.apply(Arrays.asList(convertiblePredicates), newTableSource, SourceAbilityContext.from(scan));
    relBuilder.push(scan);
    List<RexNode> acceptedPredicates = convertExpressionToRexNode(result.getAcceptedFilters(), relBuilder);
    FilterPushDownSpec filterPushDownSpec = new FilterPushDownSpec(acceptedPredicates);
    // record size after applyFilters for update statistics
    int updatedPredicatesSize = result.getRemainingFilters().size();
    // set the newStatistic newTableSource and sourceAbilitySpecs
    TableSourceTable newTableSourceTable = oldTableSourceTable.copy(newTableSource, getNewFlinkStatistic(oldTableSourceTable, originPredicatesSize, updatedPredicatesSize), new SourceAbilitySpec[] { filterPushDownSpec });
    return new Tuple2<>(result, newTableSourceTable);
}
Also used: FilterPushDownSpec(org.apache.flink.table.planner.plan.abilities.source.FilterPushDownSpec) Tuple2(scala.Tuple2) SupportsFilterPushDown(org.apache.flink.table.connector.source.abilities.SupportsFilterPushDown) TableSourceTable(org.apache.flink.table.planner.plan.schema.TableSourceTable) DynamicTableSource(org.apache.flink.table.connector.source.DynamicTableSource) RexNode(org.apache.calcite.rex.RexNode)
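
For context, this is the connector-side hook that FilterPushDownSpec.apply invokes on the copied source. The sketch below is a minimal, hypothetical implementation (not taken from any Flink connector): it accepts every pushed filter and reports none remaining. A real source would also implement ScanTableSource and translate the expressions into its own predicate format.

import java.util.Collections;
import java.util.List;
import org.apache.flink.table.connector.source.abilities.SupportsFilterPushDown;
import org.apache.flink.table.expressions.ResolvedExpression;

public class FilterAcceptingSource implements SupportsFilterPushDown {

    @Override
    public Result applyFilters(List<ResolvedExpression> filters) {
        // Accept every filter; anything the source cannot evaluate must be
        // returned as "remaining" so the planner keeps a Filter node for it.
        return Result.of(filters, Collections.emptyList());
    }
}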

Example 32 with DynamicTableSource

Use of org.apache.flink.table.connector.source.DynamicTableSource in project flink by apache.

From the class PushLimitIntoTableSourceScanRule, the method applyLimit.

private TableSourceTable applyLimit(long limit, FlinkLogicalTableSourceScan scan) {
    // resolve the scan's underlying table to the planner's TableSourceTable
    TableSourceTable oldTableSourceTable = scan.getTable().unwrap(TableSourceTable.class);
    DynamicTableSource newTableSource = oldTableSourceTable.tableSource().copy();
    LimitPushDownSpec limitPushDownSpec = new LimitPushDownSpec(limit);
    limitPushDownSpec.apply(newTableSource, SourceAbilityContext.from(scan));
    FlinkStatistic statistic = oldTableSourceTable.getStatistic();
    final long newRowCount;
    if (statistic.getRowCount() != null) {
        newRowCount = Math.min(limit, statistic.getRowCount().longValue());
    } else {
        newRowCount = limit;
    }
    // update TableStats after limit push down
    TableStats newTableStats = new TableStats(newRowCount);
    FlinkStatistic newStatistic = FlinkStatistic.builder().statistic(statistic).tableStats(newTableStats).build();
    return oldTableSourceTable.copy(newTableSource, newStatistic, new SourceAbilitySpec[] { limitPushDownSpec });
}
Also used: LimitPushDownSpec(org.apache.flink.table.planner.plan.abilities.source.LimitPushDownSpec) FlinkStatistic(org.apache.flink.table.planner.plan.stats.FlinkStatistic) TableSourceTable(org.apache.flink.table.planner.plan.schema.TableSourceTable) TableStats(org.apache.flink.table.plan.stats.TableStats) DynamicTableSource(org.apache.flink.table.connector.source.DynamicTableSource)
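
On the connector side, LimitPushDownSpec.apply calls SupportsLimitPushDown#applyLimit on the copied source. A minimal sketch with a hypothetical class name; the assumption here is that the source simply records the limit so its runtime reader can stop early:

import org.apache.flink.table.connector.source.abilities.SupportsLimitPushDown;

public class LimitAwareSource implements SupportsLimitPushDown {

    // -1 means no limit has been pushed down yet
    private long limit = -1;

    @Override
    public void applyLimit(long limit) {
        // Record the limit; copy() on the enclosing DynamicTableSource must
        // preserve this field so the TableSourceTable created by the rule keeps it.
        this.limit = limit;
    }
}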

Example 33 with DynamicTableSource

Use of org.apache.flink.table.connector.source.DynamicTableSource in project flink by apache.

From the class RawFormatFactoryTest, the method createDeserializationSchema.

// ------------------------------------------------------------------------
// Utilities
// ------------------------------------------------------------------------
private static DeserializationSchema<RowData> createDeserializationSchema(ResolvedSchema schema, Map<String, String> options) {
    final DynamicTableSource actualSource = createTableSource(schema, options);
    assertThat(actualSource, instanceOf(TestDynamicTableFactory.DynamicTableSourceMock.class));
    TestDynamicTableFactory.DynamicTableSourceMock scanSourceMock = (TestDynamicTableFactory.DynamicTableSourceMock) actualSource;
    return scanSourceMock.valueFormat.createRuntimeDecoder(ScanRuntimeProviderContext.INSTANCE, schema.toPhysicalRowDataType());
}
Also used: TestDynamicTableFactory(org.apache.flink.table.factories.TestDynamicTableFactory) DynamicTableSource(org.apache.flink.table.connector.source.DynamicTableSource)
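
A hypothetical call site for this utility; the column name and option values below are illustrative assumptions, not values taken from RawFormatFactoryTest (the raw format requires exactly one physical column, and the mock connector's required options may differ):

ResolvedSchema schema = ResolvedSchema.of(Column.physical("field1", DataTypes.STRING()));
Map<String, String> options = new HashMap<>();
options.put("connector", TestDynamicTableFactory.IDENTIFIER);
// "target" is assumed to be a required option of the mock factory
options.put("target", "MyTarget");
options.put("format", "raw");
options.put("raw.charset", "UTF-8");
DeserializationSchema<RowData> deser = createDeserializationSchema(schema, options);
// deser now decodes raw UTF-8 bytes into single-column RowData records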

Example 34 with DynamicTableSource

Use of org.apache.flink.table.connector.source.DynamicTableSource in project flink by apache.

From the class FileSystemTableFactoryTest, the method testSupportsMetadata.

@Test
public void testSupportsMetadata() {
    Map<String, String> descriptor = new HashMap<>();
    descriptor.put(FactoryUtil.CONNECTOR.key(), "filesystem");
    descriptor.put("path", "/tmp");
    descriptor.put("format", "testcsv");
    descriptor.put("testcsv.my_option", "my_value");
    DynamicTableSource source = createTableSource(SCHEMA, descriptor);
    assertTrue(source instanceof FileSystemTableSource);
    Map<String, DataType> readableMetadata = new HashMap<>();
    readableMetadata.put("file.path", DataTypes.STRING().notNull());
    readableMetadata.put("file.name", DataTypes.STRING().notNull());
    readableMetadata.put("file.size", DataTypes.BIGINT().notNull());
    readableMetadata.put("file.modification-time", DataTypes.TIMESTAMP_LTZ(3).notNull());
    assertEquals(readableMetadata, ((FileSystemTableSource) source).listReadableMetadata());
}
Also used: HashMap(java.util.HashMap) DataType(org.apache.flink.table.types.DataType) DynamicTableSource(org.apache.flink.table.connector.source.DynamicTableSource) Test(org.junit.Test)
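
The keys asserted above are exposed through the SupportsReadableMetadata ability. The stripped-down sketch below (hypothetical class, only two keys) shows the shape of such an implementation; it is not FileSystemTableSource's actual code:

import java.util.HashMap;
import java.util.List;
import java.util.Map;
import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.connector.source.abilities.SupportsReadableMetadata;
import org.apache.flink.table.types.DataType;

public class MetadataAwareSource implements SupportsReadableMetadata {

    private List<String> appliedMetadataKeys;

    @Override
    public Map<String, DataType> listReadableMetadata() {
        // Advertise which metadata columns this source can produce.
        Map<String, DataType> metadata = new HashMap<>();
        metadata.put("file.path", DataTypes.STRING().notNull());
        metadata.put("file.size", DataTypes.BIGINT().notNull());
        return metadata;
    }

    @Override
    public void applyReadableMetadata(List<String> metadataKeys, DataType producedDataType) {
        // The planner passes only the keys the query references, in the order
        // they should be appended to each produced row.
        this.appliedMetadataKeys = metadataKeys;
    }
}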

Example 35 with DynamicTableSource

Use of org.apache.flink.table.connector.source.DynamicTableSource in project flink by apache.

From the class AvroFormatFactoryTest, the method testSeDeSchema.

@Test
public void testSeDeSchema() {
    final AvroRowDataDeserializationSchema expectedDeser = new AvroRowDataDeserializationSchema(ROW_TYPE, InternalTypeInfo.of(ROW_TYPE));
    final Map<String, String> options = getAllOptions();
    final DynamicTableSource actualSource = FactoryMocks.createTableSource(SCHEMA, options);
    assert actualSource instanceof TestDynamicTableFactory.DynamicTableSourceMock;
    TestDynamicTableFactory.DynamicTableSourceMock scanSourceMock = (TestDynamicTableFactory.DynamicTableSourceMock) actualSource;
    DeserializationSchema<RowData> actualDeser = scanSourceMock.valueFormat.createRuntimeDecoder(ScanRuntimeProviderContext.INSTANCE, SCHEMA.toPhysicalRowDataType());
    assertEquals(expectedDeser, actualDeser);
    final AvroRowDataSerializationSchema expectedSer = new AvroRowDataSerializationSchema(ROW_TYPE);
    final DynamicTableSink actualSink = FactoryMocks.createTableSink(SCHEMA, options);
    assert actualSink instanceof TestDynamicTableFactory.DynamicTableSinkMock;
    TestDynamicTableFactory.DynamicTableSinkMock sinkMock = (TestDynamicTableFactory.DynamicTableSinkMock) actualSink;
    SerializationSchema<RowData> actualSer = sinkMock.valueFormat.createRuntimeEncoder(null, SCHEMA.toPhysicalRowDataType());
    assertEquals(expectedSer, actualSer);
}
Also used: DynamicTableSink(org.apache.flink.table.connector.sink.DynamicTableSink) TestDynamicTableFactory(org.apache.flink.table.factories.TestDynamicTableFactory) RowData(org.apache.flink.table.data.RowData) DynamicTableSource(org.apache.flink.table.connector.source.DynamicTableSource) Test(org.junit.Test)
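
The getAllOptions() helper is not shown in this excerpt. A plausible shape, modeled on the option maps other format-factory tests in Flink build; the exact keys the mock connector requires are assumptions:

private static Map<String, String> getAllOptions() {
    final Map<String, String> options = new HashMap<>();
    options.put("connector", TestDynamicTableFactory.IDENTIFIER); // mock connector
    options.put("target", "MyTarget");                            // assumed mock-factory option
    options.put("format", AvroFormatFactory.IDENTIFIER);          // resolves to "avro"
    return options;
}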

Aggregations

DynamicTableSource (org.apache.flink.table.connector.source.DynamicTableSource): 55
Test (org.junit.Test): 24
DynamicTableSink (org.apache.flink.table.connector.sink.DynamicTableSink): 12
TestDynamicTableFactory (org.apache.flink.table.factories.TestDynamicTableFactory): 12
Test (org.junit.jupiter.api.Test): 10
RowData (org.apache.flink.table.data.RowData): 9
DecodingFormatMock (org.apache.flink.table.factories.TestFormatFactory.DecodingFormatMock): 8
TableSourceTable (org.apache.flink.table.planner.plan.schema.TableSourceTable): 8
ResolvedSchema (org.apache.flink.table.catalog.ResolvedSchema): 7
HashMap (java.util.HashMap): 5
Configuration (org.apache.flink.configuration.Configuration): 5
ScanTableSource (org.apache.flink.table.connector.source.ScanTableSource): 5
ParameterizedTest (org.junit.jupiter.params.ParameterizedTest): 5
ArrayList (java.util.ArrayList): 4
JdbcConnectorOptions (org.apache.flink.connector.jdbc.internal.options.JdbcConnectorOptions): 4
JdbcLookupOptions (org.apache.flink.connector.jdbc.internal.options.JdbcLookupOptions): 4
CatalogTable (org.apache.flink.table.catalog.CatalogTable): 4
SourceAbilitySpec (org.apache.flink.table.planner.plan.abilities.source.SourceAbilitySpec): 4
List (java.util.List): 3
LogicalTableScan (org.apache.calcite.rel.logical.LogicalTableScan): 3